EP4505210A1 - Position-determination method and system for determining the position of objects - Google Patents
Position-determination method and system for determining the position of objects
- Publication number
- EP4505210A1 (application EP22722170.2A)
- Authority
- EP
- European Patent Office
- Prior art keywords
- camera
- light signal
- scene
- shelf
- data
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Pending
Classifications
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V20/00—Scenes; Scene-specific elements
- G06V20/50—Context or environment of the image
- G06V20/56—Context or environment of the image exterior to a vehicle by using sensors mounted on the vehicle
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01S—RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
- G01S5/00—Position-fixing by co-ordinating two or more direction or position line determinations; Position-fixing by co-ordinating two or more distance determinations
- G01S5/16—Position-fixing by co-ordinating two or more direction or position line determinations; Position-fixing by co-ordinating two or more distance determinations using electromagnetic waves other than radio waves
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01S—RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
- G01S1/00—Beacons or beacon systems transmitting signals having a characteristic or characteristics capable of being detected by non-directional receivers and defining directions, positions, or position lines fixed relatively to the beacon transmitters; Receivers co-operating therewith
- G01S1/70—Beacons or beacon systems transmitting signals having a characteristic or characteristics capable of being detected by non-directional receivers and defining directions, positions, or position lines fixed relatively to the beacon transmitters; Receivers co-operating therewith using electromagnetic waves other than radio waves
- G01S1/703—Details
- G01S1/7032—Transmitters
- G01S1/7038—Signal details
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06K—GRAPHICAL DATA READING; PRESENTATION OF DATA; RECORD CARRIERS; HANDLING RECORD CARRIERS
- G06K19/00—Record carriers for use with machines and with at least a part designed to carry digital markings
- G06K19/06—Record carriers for use with machines and with at least a part designed to carry digital markings characterised by the kind of the digital marking, e.g. shape, nature, code
- G06K19/06009—Record carriers for use with machines and with at least a part designed to carry digital markings characterised by the kind of the digital marking, e.g. shape, nature, code with optically detectable marking
- G06K19/06046—Constructional details
- G06K19/06112—Constructional details the marking being simulated using a light source, e.g. a barcode shown on a display or a laser beam with time-varying intensity profile
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06Q—INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES; SYSTEMS OR METHODS SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES, NOT OTHERWISE PROVIDED FOR
- G06Q30/00—Commerce
- G06Q30/06—Buying, selling or leasing transactions
- G06Q30/0601—Electronic shopping [e-shopping]
- G06Q30/0639—Locating goods or services, e.g. based on physical position of the goods or services within a shopping facility
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T7/00—Image analysis
- G06T7/20—Analysis of motion
- G06T7/292—Multi-camera tracking
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T7/00—Image analysis
- G06T7/50—Depth or shape recovery
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T7/00—Image analysis
- G06T7/80—Analysis of captured images to determine intrinsic or extrinsic camera parameters, i.e. camera calibration
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V20/00—Scenes; Scene-specific elements
- G06V20/50—Context or environment of the image
- G06V20/52—Surveillance or monitoring of activities, e.g. for recognising suspicious objects
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N23/00—Cameras or camera modules comprising electronic image sensors; Control thereof
- H04N23/56—Cameras or camera modules comprising electronic image sensors; Control thereof provided with illuminating means
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N23/00—Cameras or camera modules comprising electronic image sensors; Control thereof
- H04N23/60—Control of cameras or camera modules
- H04N23/66—Remote control of cameras or camera parts, e.g. by remote control devices
- H04N23/661—Transmitting camera control signals through networks, e.g. control via the Internet
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N23/00—Cameras or camera modules comprising electronic image sensors; Control thereof
- H04N23/90—Arrangement of cameras or camera modules, e.g. multiple cameras in TV studios or sports stadiums
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N7/00—Television systems
- H04N7/22—Adaptations for optical transmission
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2207/00—Indexing scheme for image analysis or image enhancement
- G06T2207/30—Subject of image; Context of image processing
- G06T2207/30244—Camera pose
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2207/00—Indexing scheme for image analysis or image enhancement
- G06T2207/30—Subject of image; Context of image processing
- G06T2207/30248—Vehicle exterior or interior
- G06T2207/30252—Vehicle exterior; Vicinity of vehicle
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V2201/00—Indexing scheme relating to image or video recognition or understanding
- G06V2201/02—Recognising information on displays, dials, clocks
Definitions
- the invention relates to a method and a system for determining the position of objects, in particular for detecting the position of products.
- the invention therefore addresses the task of avoiding the problems discussed above in such a system or in the method carried out with it.
- the subject of the invention is therefore a method for determining the position of at least one object in space, in particular of at least one product positioned on a shelf. The method comprises: emitting a light signal, which is coded with identification information for identifying the object, from a light emitting device positioned adjacent to the object; capturing a scene with the aid of at least one freely movable camera, wherein a digital scene image generated with the aid of the camera represents the scene; and generating object position data based on a computerized determination of the position of the light signal occurring in the digital scene image in relation to the camera and of the identification information emitted with the light signal, with knowledge of an automatically determined orientation and position of the camera in space at the time the scene is captured, wherein the object position data represents the position of the object in the scene with respect to the room.
- the invention therefore also relates to a freely movable device, in particular a shopping cart, a pair of glasses, a mobile phone or a tablet computer, for determining the position of at least one object in space, in particular of at least one product positioned on a shelf, this position being indicated by a light signal which is emitted by a light emitting device positioned adjacent to the object and which encodes identification information with the help of which the object can be identified. The device carries at least one camera which is designed to generate a digital scene image of a scene captured with it, and the device, in particular the camera, is designed for the computerized determination of the position of the light signal occurring in the digital scene image in relation to the camera and of the identification information emitted with the light signal, and is further designed to at least support an automated determination of the orientation and position of the camera in space when capturing the scene.
- the invention therefore also relates to a light emitting device, in particular a shelf label, which is designed to display product and/or price information, or a product separator, which is designed to separate different products. The light emitting device has: a storage stage for storing identification information, with the help of which an object, in particular a product positioned on a shelf, can be identified, in the vicinity of which the light emitting device is positioned; and a light signal generation stage, which is designed to encode a light signal according to the identification information and to emit this coded light signal.
- the invention therefore relates to a system for determining the position of at least one object in space, in particular of at least one product positioned on a shelf, the system having: at least one light emitting device according to the invention and at least one freely movable device according to the invention.
- the measures according to the invention have the advantage that a more powerful system is created that can be implemented and operated more cost-effectively because, in comparison to the system discussed at the beginning, fewer separately provided and maintained devices permanently installed on shelves are necessary. This primarily concerns the omission of the device discussed at the beginning which provides information for identifying the shelf on which it is mounted.
- a much more efficient method is created because the full focus is placed on the emitted light signal, which is localized at the respective location of an object or product on the shelf, or has its origin there, and is suitable for identifying the object.
- the freely movable camera enables a temporally dynamic and location-variable recording of the positioning of the objects in a business premises.
- the position detection of the items in the store therefore requires neither a camera-based detection of the items themselves nor a camera-based detection and evaluation of the product-specific information conveyed by the items or by a shelf label (for example in a digital recording, i.e. a digital photo or video, of the item, of a paper-based shelf label, or of the screen content of an electronic shelf label), which would require considerable digital processing effort.
- the invention makes it possible to determine the position of the object based on the light signal in the space of a business premises, i.e. to locate the object.
- the mentioned identification information is used, which is communicated with the help of the light signal.
- This identification information is obtained by previously logically linking the object to a light emitting device assigned to it. Specifically, this logical link is established by the so-called “binding” process.
- ESL: electronic shelf label
- the same also applies to a so-called shelf separator or product separator, which separates different objects or items on a shelf, i.e. separates products or product groups from one another.
- the barcode that uniquely identifies the separator can be shown on a label of the separator or can also be displayed on a screen of the separator. Either the ESLs or the separators are usually installed on shelves, although mixed configurations are also possible. Accordingly, the identification information can be a unique identifier of the ESL or of the separator, or can also be a unique identifier of the product.
- other information, for example meta information, can also be used here, which represents the logical link that was created during binding, and this meta information can be used to draw conclusions about the product or product group.
- the light emitting device, which emits the light signal, is designed here as an ESL or separator.
- the light emitting device stores the respective identification information by means of identification data in its storage stage, which is implemented, for example, by an EEPROM.
- the identification data can be transmitted from a central data processing device, for example by radio or by cable, to the respective light emitting device.
- the light emitting device has a light signal generation stage with the help of which the coded light signal can be emitted.
- the light generated using, for example, a light-emitting diode is modulated according to the applicable (predetermined) coding.
- the light signal can be located in the visible spectral range of light. However, the invisible spectral range will preferably be used in order not to unnecessarily distract customers and employees from their usual activities with light signals.
- the light signal is emitted during at least a period of time necessary to transmit all of the identification information. This transmission can occur once or several times in succession, in particular after an external request (for example by a control signal from a central data processing device or the camera), or several times automatically, such as controlled by an internal timer of the light emitting device.
- a still image camera or a video camera can be used as the camera, which is designed to create a two-dimensional digital image of the scene located in its detection range, i.e. a digital scene image. Regardless of whether it is a still image camera or a video camera, at least as many individual scene images, or a sufficiently long video, are recorded so that the identification information, which occurs over a period of time, can be evaluated. When calculating the minimum duration of the recording, the time range that is at least necessary for sending out the identification information is taken into account.
- the identification information is encoded over a period of time
- with a still image camera, for example, at least as many images are captured within the time period intended for transmission that the identification information can be reliably determined from the changes in the light signal in the scene image.
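- the time-coded emission and camera-side decoding described above can be sketched as follows. This is a minimal sketch in Python assuming a simple on-off keying (OOK) scheme; the start marker, the 16-bit identifier width and the sampling of one brightness value per bit period are illustrative assumptions, since the patent does not prescribe a concrete coding.

```python
# Minimal OOK sketch of the time-coded light signal: the identification
# information is turned into a bit sequence, the LED brightness follows the
# bits, and the camera recovers the bits from per-frame brightness samples.
# Marker, bit width and threshold are illustrative assumptions.

ID_BITS = 16  # assumed width of the identification information

def encode_id(esl_id):
    """Turn an identifier into a bit sequence: start marker + MSB-first payload."""
    payload = [(esl_id >> i) & 1 for i in reversed(range(ID_BITS))]
    return [1, 1, 0] + payload          # hypothetical start marker "110"

def decode_id(samples, threshold=0.5):
    """Recover the identifier from per-frame brightness samples (one per bit period)."""
    bits = [1 if s > threshold else 0 for s in samples]
    for i in range(len(bits) - 2 - ID_BITS + 1):
        if bits[i:i + 3] == [1, 1, 0]:  # locate the start marker
            payload = bits[i + 3:i + 3 + ID_BITS]
            if len(payload) == ID_BITS:
                return int("".join(map(str, payload)), 2)
    return None                         # marker not found in the recording

signal = encode_id(0xA5A5)
# Simulate the camera sampling the LED brightness once per bit period.
brightness = [0.9 if b else 0.1 for b in signal]
assert decode_id(brightness) == 0xA5A5
```

This also illustrates why the recording must cover at least one full transmission period: the decoder can only succeed once the complete marker plus payload lies within the sampled frames.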
- the real scene that is captured by the camera includes the objects in the scene and, if applicable, their changes over time or location.
- the digital scene image is the optical image of the real scene, projected via the optics of the camera (a lens optic, also called a lens) onto the digital image capture unit of the camera, for which a so-called CCD is usually used, where CCD stands for “charge-coupled device”, so that the image of the scene is digitized on it.
- the scene image is therefore imaged on a pixel matrix of this image capture unit.
- the digital scene image is a two-dimensional data structure that is obtained through the matrix of pixels of the CCD.
- a digital still image forming the digital scene image or a digital video sequence of the scene is generated.
- the resolution of the electronic image sensor and/or the digital post-processing defines the image resolution (total number of pixels or the number of columns (width) and rows (height) of a raster graphic) of the digital scene image.
- the digital scene image generated by the camera for further processing is therefore formed by a matrix of pixels, for example 1500 pixels in the x direction and, orthogonally thereto, 1000 pixels in the y direction. Each of these pixels has a defined (i.e. known) pixel dimension (or, in other words, the centers of neighboring pixels have a defined (i.e. known) pixel spacing).
- the light signal is first searched for in the digital scene image. It is relatively easy to find because the light signal differs fundamentally from the otherwise dark background of the digital scene image. For this purpose, for example, pixels of this digital scene image can be checked for a characteristic color or a characteristic intensity or brightness. To search for the light signal, the color change over time or the change in brightness or intensity expected according to the coding can also be used. As soon as such a light signal has been found, the temporal change (modulation) of a light signal parameter is analyzed with knowledge of the coding scheme used when emitting the light signal in order to decode the identification information therefrom.
- the position of the light signal found in the digital scene image, i.e. the pixel coordinates, and also a data representation of the identified identification information are stored as belonging together.
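- the search for the light signal in the pixel matrix can be sketched as follows, assuming the digital scene image is given as a row-major matrix of brightness values in [0, 1]; the fixed threshold and the function name are illustrative, and a real implementation would additionally evaluate the temporal modulation described above.

```python
# Sketch of locating light signals in the digital scene image: scan the pixel
# matrix for values that clearly exceed the dark background and return their
# pixel coordinates. Threshold and names are illustrative assumptions.

def find_light_signals(image, threshold=0.8):
    """Return pixel coordinates (x, y) whose brightness clearly exceeds the background."""
    hits = []
    for y, row in enumerate(image):
        for x, value in enumerate(row):
            if value > threshold:
                hits.append((x, y))
    return hits

# 4x5 test image: dark background with one bright "LED" pixel at (3, 1).
scene = [[0.1] * 5 for _ in range(4)]
scene[1][3] = 0.95
assert find_light_signals(scene) == [(3, 1)]
```

The returned pixel coordinates are exactly what is stored together with the decoded identification information in the step above.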
- the determination of the position of the products or product groups is based entirely on the position determination of the light signals, and their identification with the help of the identification information, in the digital scene images, which are obtained over time with the help of the camera that is movable or moving in the business premises.
- the light signal, which can be clearly identified in the respective digital scene image using the identification information, allows the assignment of a real object, which by definition is positioned in space near the light emitting device.
- the position of the respective light signal in relation to the camera is first determined and then transformed into the coordinate system of the room (the room coordinate system), which has its origin at a defined location in the room with a defined orientation.
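- this transformation from camera coordinates into the room coordinate system can be sketched as follows, assuming the camera's pose (position and, for simplicity, only its yaw about the vertical axis) is known from the automatic orientation and position determination; a full implementation would use a complete 3x3 rotation matrix or quaternions.

```python
import math

# Sketch of transforming a light-signal position from the camera coordinate
# system into the room coordinate system: rotate by the camera's (assumed
# yaw-only) orientation, then translate by the camera's position in the room.

def camera_to_room(p_cam, cam_pos, yaw_deg):
    """Rotate a point given in camera axes by the camera's yaw, then translate."""
    a = math.radians(yaw_deg)
    x, y, z = p_cam
    xr = x * math.cos(a) - y * math.sin(a)
    yr = x * math.sin(a) + y * math.cos(a)
    return (xr + cam_pos[0], yr + cam_pos[1], z + cam_pos[2])

# A signal 2 m in front of a camera standing at (5, 3, 1.2) m, rotated 90°:
p = camera_to_room((2.0, 0.0, 0.0), (5.0, 3.0, 1.2), 90.0)
assert all(abs(a - b) < 1e-9 for a, b in zip(p, (5.0, 5.0, 1.2)))
```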
- only a single freely movable camera can be used, which is carried, for example, by an employee through the room, such as the business premises.
- the method is preferably carried out with the aid of several cameras that move independently of one another. These cameras can be moved independently by store employees, which speeds up the capture of the various scene images throughout the room.
- a so-called fisheye lens can be used.
- the lens itself can also be designed to be movable, i.e. pivotable in relation to the camera.
- the camera can also be pivoted relative to the freely movable device.
- an omnidirectional camera can also be used.
- employees can be equipped with (protective) glasses, for example, to which the respective camera is attached.
- the freely movable camera is attached to a shopping cart. This has the advantage that the number of image-capturing cameras is no longer limited to the number of employees moving around the store, but rather the individual movements of the number of customers, which usually far exceeds the number of employees, are used to capture images. The customers move a swarm of image-capturing cameras around the business premises.
- Involving customers also has the advantage that the image capture can be used to determine those business areas in which products attract increased customer attention, or in which products are actually removed from the shelves, because it can be determined from the captured scenes that these business areas occur more frequently in the images or that people spend longer periods of time there. From this, important insights can be gained for warehousing, inventory recording, planning shelf supply with goods, as well as for target group marketing or product-specific marketing.
- the freely movable device is, for example, a pair of glasses or the like, or a shopping cart.
- several cameras can be provided on the freely movable device, which are grouped together.
- This freely movable device can be moved and generate individual digital scene images from different detection positions and/or with different detection orientations. This makes it possible for digital scene images of different scenes to be generated with each camera at the respective position in which the freely movable device is currently located.
- the shelves delimiting the shelf aisle on its left and right sides can actually be captured simultaneously if, for example, one of the cameras is aligned to the left and the other camera to the right in relation to the direction of travel of the shopping cart.
- several individual cameras can be provided on one or each of the sides, especially in the case of a shopping cart. These can be arranged one above the other, for example mounted on a rod, or can also be oriented differently, i.e. cover an upper shelf area, a middle shelf area and a lower shelf area. Even with glasses, the cameras oriented towards one of the sides can have different detection areas arranged one above the other, which together capture an entire image area from the lowest shelf to the highest shelf.
- wide-angle cameras can of course also be used to cover the same or a similar overall detection area.
- the individual detection areas can also overlap.
- the ESLs or the separators that form the light emitting device are often battery powered. It has therefore proven to be particularly advantageous if a control signal transmitter that moves with the at least one freely movable camera emits a control signal, in particular a radio-based control signal, and the light emitting device located in the reception area of the control signal, which is equipped with a receiver and is designed to receive the control signal, only emits the light signal when the control signal is received.
- a relatively simple receiver is required, which may also be able to evaluate a unique identifier of the control signal in order not to react to other radio signals.
- the light emission can be limited to a limited area of space around the control signal transmitter using a wide variety of light emission devices.
- a range of the control signal of five to ten meters can be sufficient to activate only the light signal generation stages positioned in the vicinity of the at least one camera to emit the respective light signal.
- the range of the control signal should be such that the camera's detection range can be optimally utilized. At least those light signal generation stages whose light signal can reasonably be detected by the camera should be active. This has the advantage that other light signal generation stages can remain inactive, which has a positive influence on the energy balance of the respective light emitting device, in particular on the energy balance in the entire system.
- radio technology can be used in which the emitted control signal is at least partially shielded by the often metallic shelves, so that mainly those light emitting devices that are located in the same shelf aisle as the freely movable device can receive the control signal.
- the light emitting device can be designed to only emit the light signal when certain (time-dependent) criteria are met.
- the light emitting device has a timing control stage which is used to control the temporal behavior, i.e. the activity phase(s) or the inactivity phase(s), of the light signal generation stage.
- the light signal generation stage is of course designed to be controllable by the timing control stage and can be controlled by this.
- the timing control stage can, for example, receive time data that contains information about the current time and/or have a clock generator, with which the timing control stage determines the current time or the current date.
- this information can also be obtained in other ways, such as a system clock or a system time or a system date that is globally available in the system.
- the criteria can include, for example, that the light signal is only emitted if it has not already been emitted within a certain time period.
- the criteria can, for example, also include that the light signal may only be emitted within a certain time range.
- This time range can, for example, contain one or more fixed predefined time windows during a day or during a week or a month.
- the system can be set so that light signals are only sent out on certain days of the week.
- the entire business location can be surveyed once a week to check the consistent positioning of the products.
- the light signal emission in different shelf aisles in the business premises can also be assigned to different days of the week, so that aisles that are close together do not emit light signals at the same time.
- the criterion can be defined in such a way that the light signal is only emitted within the time period in which the employees usually, i.e. predictably, walk through the business premises with the glasses, for example in order to prepare the premises for operations on the next day.
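- the time-dependent emission criteria described above can be sketched as a simple window check; the concrete windows (weekdays and hours) are illustrative assumptions, since the patent only requires that emission is restricted to configured time ranges.

```python
from datetime import datetime

# Sketch of a timing control stage criterion: the light signal may only be
# emitted inside configured time windows. The window configuration below is
# an illustrative assumption (e.g. two aisles surveyed on different evenings).

EMISSION_WINDOWS = {  # weekday (0 = Monday) -> list of (start_hour, end_hour)
    0: [(20, 22)],
    3: [(20, 22)],
}

def may_emit(now):
    """Check whether the given time falls inside a configured emission window."""
    return any(start <= now.hour < end
               for start, end in EMISSION_WINDOWS.get(now.weekday(), []))

assert may_emit(datetime(2024, 1, 1, 21, 0))      # Monday 21:00 -> allowed
assert not may_emit(datetime(2024, 1, 2, 21, 0))  # Tuesday -> no window
```

Assigning different weekdays to different shelf aisles, as suggested above, amounts to giving each aisle its own window table.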
- the method has proven to be particularly advantageous if the determination of the position in relation to the camera involves automatically determining a scale that can be derived from the digital scene image, the scale being determined with knowledge of the real dimension of a reference object identified in the digital scene image, in particular one designed as an electronic shelf label, and the scale being used for the conversion between position information or dimensions that are determined in the digital scene image and position information or dimensions in the scene captured with the help of the camera.
- a reference object contained in the scene image, preferably several identically designed reference objects, can be identified, the actual dimensions being known and the scale for the scene image being determined with knowledge of the actual dimensions.
- the reference object is particularly preferably the light emitting device itself, which is implemented as an ESL or separator with usually defined, i.e. previously known, dimensions.
- These reference objects can easily be found, i.e. localized or delimited, in the digital scene image due to the “inherent” light signals that they emit, and subsequently analyzed to determine the scale.
- pattern recognition can be used, which examines the digital scene image for the characteristic appearance (such as a rectangular or square shape, or characteristic proportions) of the reference object.
- the starting point for the search for the reference object(s) is the light signal, which is usually emitted by the reference objects and therefore makes them particularly easy to find in the digital scene image.
- the reference object can also be a “composite” reference object, which is composed of a collection of light signals from different light emitting devices.
- the light signals from two light emitting devices that are positioned essentially exactly one above the other on different shelves can also form a “composite” reference object, in which situation the distance between the shelves is the known real dimension used to determine the scale.
- the present invention finds its preferred application in retail.
- a reference object can have a variety of forms.
- it can be an entire shelf of which the length and height are known exactly.
- Using the shelf can be advantageous because it often takes on a dominant role in a scene image.
- smaller objects, such as a shelf strip that forms the front end of a shelf, can also be used as a reference object.
- Shopping carts that are set up to present goods in the business premises are also suitable for this, provided they are positioned within the detection range of the respective camera.
- shelves or shelf strips as well as shopping baskets are often provided by a wide variety of manufacturers with a wide variety of dimensions for the respective business premises of various retailers, often under the retailers' special structural specifications. Therefore, they are only suitable as reference objects in a very narrow range of applications.
- an electronic shelf label as the reference object is particularly advantageous because such electronic shelf labels have essentially uniform dimensions.
- electronic shelf labels exist in a wide variety of dimensions, which vary greatly. However, in practice it has been shown that the dimensions used vary little or only within a predefined range between different business premises or even different retailers. This applies in particular to the large number of electronic shelf labels that are installed on a shelf rail or the shelf rails of this shelf in a single store and come from a single manufacturer, which is usually the case.
- Such electronic shelf labels usually only come in one or two (perhaps three) different dimensions on a shelf. Since each of these electronic shelf labels must fit into the same shelf rail, their dimensions often only differ in width, whereas the height is often identical for, for example, two different types of electronic shelf labels. However, the opposite situation can also apply. Their actual dimensions can therefore essentially be classified as homogeneous across different types of shelf labels as well as across the installation locations.
- shelf labels are always attached to the front of a shelf and can therefore be easily and clearly identified in the scene image captured by the camera using digital image processing and therefore reliably analyzed.
- the basis for determining the scale is counting the relevant image points that are assigned to the reference object.
- the number of pixels determined in this way can be related to the real area or the real length of a reference object, which enables scaling from the pixel system of the CCD sensor into the real scene.
- Knowledge of the exact physical parameters of the CCD sensor can also be used when determining the scale, as a result of which the actual distances between the pixels of the pixel matrix, or their size, are precisely known.
- the counted pixels can be converted into an actual length or area measurement for the reference object imaged on the CCD sensor, the existing scale can be calculated from this, and then lengths and areas can be scaled between an image on the CCD sensor and the real scene.
- the determination of those pixels in the scene image assigned to the image of the reference object can be carried out using at least one of the measures listed below, namely:
- the area occupied by the reference object image in the scene image (specified either as the sum of the counted pixels or as the total pixel area of the counted pixels) can be determined, and the scale can be calculated knowing the actual area of the reference object (e.g. the area of its front surface, given in square millimeters).
- the perimeter of the reference object image in the scene image is determined by counting pixels, either based on the edge pixels occupied by the reference object image itself or based on the pixels in the scene image immediately adjacent to the reference object image.
- the scale can then be calculated knowing the actual circumference of the reference object (e.g. the circumference of the front of the reference object).
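The pixel-counting scale determination described above can be sketched briefly. The following Python sketch derives a scale in millimeters per pixel from a counted edge length or a counted pixel area; all dimensions and pixel counts are illustrative assumptions, not values from the description:

```python
def scale_from_length(pixel_count: int, real_length_mm: float) -> float:
    """Millimeters per pixel, from a counted edge of a reference object."""
    return real_length_mm / pixel_count

def scale_from_area(pixel_area: int, real_area_mm2: float) -> float:
    """Millimeters per pixel, derived from a counted pixel area."""
    return (real_area_mm2 / pixel_area) ** 0.5

# Assumed example: a shelf label whose 60 mm wide front border spans
# 120 pixels in the digital scene image:
mm_per_px = scale_from_length(120, 60.0)

# A distance of 400 pixels between two light signals then corresponds to:
distance_mm = 400 * mm_per_px
```

With these example numbers the scale is 0.5 mm per pixel, so the 400-pixel distance maps to 200 mm in the real scene.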
- the length of an edge line is used as the basis for determining the scale.
- This can in particular be a straight edge line, such as one side of a rectangular or square structure of the reference object, which can be provided, for example, by housing edges.
- the pixels occupied along such an edge line are counted, or the pixels of the surroundings of the reference object image that adjoin one of its edge lines are counted.
- the scale can be calculated by knowing the actual length of the boundary line of the reference object.
- a scale can also be defined that varies across the scene image, i.e. a scale that depends on the location in the scene image. This location-dependent scale may be necessary if, for example, the scene image distorts the proportions of the reference objects depicted there. Such a situation can occur, for example, if the camera is used to record a scene that extends far to the left or right side of the camera, which can happen, for example, in aisles in retail stores if the camera is unfavorably oriented. The reference objects that are close to the camera are then imaged larger than reference objects that are positioned further away from the camera.
- a first data structure can be generated using the scale for the scene image, which creates a two-dimensional digital map of the light signals in the real scene, specifying the actual dimensions required for the two-dimensional cartography (e.g. measured in millimeters). For this purpose, for example, only linked dimensions (i.e. relative dimensions between neighboring light signals) can be created to determine the positions of the light signals. Absolute dimensions measured from a reference point in the scene image can also be created. The two-dimensional map obtained in this way, which is stored digitally, is subsequently used to place it in a three-dimensional context in relation to the camera, which will be discussed below.
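As an illustration of such a first data structure, the following Python sketch builds both absolute positions measured from a reference point and linked, relative dimensions between neighboring light signals. The identifiers and pixel coordinates are assumed examples:

```python
# Assumed detections: identification info -> (x, z) pixel coordinates
# of the light signal in the digital scene image.
signals_px = {"4A": (100, 40), "4B": (340, 40), "4C": (580, 40)}
mm_per_px = 0.5  # scale previously determined from a reference object

# Absolute positions, measured from the pixel origin as reference point:
absolute_map = {ident: (x * mm_per_px, z * mm_per_px)
                for ident, (x, z) in signals_px.items()}

# Linked dimensions: relative distances between neighboring signals,
# here along the shelf direction (x axis):
idents = sorted(signals_px)
relative_map = {
    (a, b): (signals_px[b][0] - signals_px[a][0]) * mm_per_px
    for a, b in zip(idents, idents[1:])
}
```

Both representations describe the same two-dimensional map; the relative form is convenient when only neighbor spacings matter, the absolute form when a fixed reference point exists.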
- determining the position with respect to the camera includes automatically determining the distance between the camera and the scene captured by it. This allows the light signals identified in the digital scene image to be located three-dimensionally in relation to the camera, i.e. in the camera coordinate system. The two-dimensional location of the light signals already known in the scene is expanded by a third coordinate.
- the distance can be estimated in an essentially known spatial arrangement, such as a shelf aisle, i.e. with approximately half the aisle width. This distance can also be easily limited using the so-called lens equation, because the scale or metric can be defined with good accuracy using the reference object found in the digital scene image, whose dimensions are known very precisely.
- the distance between the camera and the respective light-emitting device can therefore be determined by automatic calculation with knowledge of the parameters of the camera's optical imaging system. This can be done, for example, by the camera's computer fully automatically, because it can obtain the parameters of the optical imaging system from its memory, where they were programmed in advance.
- the distance of the camera to the real object in the scene from which the light signal is emitted can be calculated, of course using the imaging function corresponding to the actual lens of the camera. This allows the positions of the light signals to be determined in the spatial context of the camera coordinate system. It should also be mentioned here that the distance between the camera and the light emitting device can also be determined automatically using a distance sensor, for which, for example, a LIDAR sensor or similar can be used.
- the three-dimensional coordinate information calculated in the camera coordinate system for a specific light emitting device has a variability or dynamic that is caused by the movement of the camera.
- the method step of emitting the light signal is carried out essentially simultaneously by a plurality of light emitting devices, and the method step of computerized determination of the position of the light signal occurring in the digital scene image in relation to the camera, and of the identification information emitted with the light signal, is carried out for all light signals appearing in the digital scene image.
- This has the advantage that the large number of light signals contained in a scene image is used collectively in order to determine the position of the large number of light signals in a single detection area of the camera. This speeds up and optimizes the positioning process.
- supplementary data is taken into account, which has at least one of the types of data listed below, namely:
- This additional data can be generated with the help of a wide variety of sensors that exist on the camera or the freely movable device, such as the shopping cart or glasses, which will be discussed below.
- the orientation of the camera in the spatial coordinate system can be determined automatically using an orientation sensor; for example, an electronic compass can be used for this purpose, and the camera's computer further processes the data transmitted by the electronic compass or provides this data for conversion between the coordinate systems.
- the inclination of the camera in relation to the horizontal, represented by tilt data, can also be understood as part of the orientation. It can be detected automatically using an inclination sensor, for which an electronic gyroscope can be used; the camera's computer further processes the data transmitted by the electronic gyroscope or provides this data for conversion between the coordinate systems.
- the position of the camera in the spatial coordinate system can be determined by automatic radio-based position determination, in particular with the help of “ultra-wideband radio technology” (UWB radio technology for short), with UWB transmitters that are preferably permanently installed (at different points in the relevant spatial area).
- each UWB transmitter forms a UWB radio module with a known position in the spatial coordinate system, and the camera has a UWB radio module with the help of which the position of the camera in relation to the respective UWB transmitters in UWB radio communication with the camera is determined; from this, the position data is generated, which is further processed by the camera's computer or provided for the conversion between the coordinate systems.
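The UWB-based position determination can be illustrated by a standard two-dimensional trilateration from distances to three fixed transmitters. The following Python sketch linearizes the circle equations; the anchor positions and distances are assumed example values, and a real UWB system would typically use more anchors and error handling:

```python
def trilaterate(anchors, distances):
    """Solve for a 2D position from distances to three known anchors."""
    (x1, y1), (x2, y2), (x3, y3) = anchors
    d1, d2, d3 = distances
    # Subtracting the first circle equation from the other two yields
    # a linear system a*x + b*y = c.
    a1, b1 = 2 * (x2 - x1), 2 * (y2 - y1)
    c1 = d1**2 - d2**2 + x2**2 - x1**2 + y2**2 - y1**2
    a2, b2 = 2 * (x3 - x1), 2 * (y3 - y1)
    c2 = d1**2 - d3**2 + x3**2 - x1**2 + y3**2 - y1**2
    det = a1 * b2 - a2 * b1
    return ((c1 * b2 - c2 * b1) / det, (a1 * c2 - a2 * c1) / det)

# Assumed anchors (in meters) and measured distances to a camera at (4, 3):
anchors = [(0.0, 0.0), (10.0, 0.0), (0.0, 8.0)]
pos = trilaterate(anchors, [5.0, 45 ** 0.5, 41 ** 0.5])
```

The solved position can then be equated, to a good approximation, with the origin of the camera coordinate system, as noted below.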
- the position of the camera determined in this way in the spatial coordinate system can, to a good approximation, be equated with the origin of the camera coordinate system. Otherwise, a correction that takes the difference into account would have to be implemented.
- the path traveled can also be recorded using a sensor (e.g. attached to the rollers of a shopping cart, which detects the rolling movement of the rollers or wheels) or a sensor for detecting accelerations, from which the path traveled and the direction taken can be determined. These sensors generate data describing the path traveled, which is further processed by the camera's computer or provided for conversion between the coordinate systems.
- this requires a known starting point in the spatial coordinate system in order to be able to describe the path
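A minimal dead-reckoning sketch in Python illustrates how wheel-rotation and direction data could be integrated into a path from a known starting point; the wheel circumference, headings and segment lengths are assumed examples:

```python
import math

def dead_reckon(start, segments, wheel_circumference_m):
    """Integrate wheel rotations and headings into a 2D path.

    segments: list of (rotations, heading_deg) tuples; the starting
    point in the spatial coordinate system must be known.
    """
    x, y = start
    for rotations, heading_deg in segments:
        step = rotations * wheel_circumference_m
        x += step * math.cos(math.radians(heading_deg))
        y += step * math.sin(math.radians(heading_deg))
    return x, y

# Assumed example: 10 rotations heading 0 degrees, then 5 rotations
# heading 90 degrees, with a 0.5 m wheel circumference:
end = dead_reckon((0.0, 0.0), [(10, 0.0), (5, 90.0)], 0.5)
```

The end point here is (5.0, 2.5) meters; in practice the heading would come from the orientation sensor discussed above.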
- the conversion into the spatial coordinate system can be carried out directly in the camera, and the object position data of the real scene in the spatial coordinate system obtained in this way can be transferred from the camera for further processing by, for example, a central data processing device.
- the camera can also record the position data of the light signals in the spatial context of the camera coordinate system together with the additional data available on the camera, and the conversion into the spatial coordinate system is then carried out, for example, by a central data processing device.
- the three-dimensional positions of the individual light emitting devices as well as the identification information conveyed by them are now known in the spatial coordinate system.
- the objects actually affected at the respective position are then queried from a database of, for example, the (central) data processing device, and the (central) data processing device subsequently generates a data structure for the entirety of the objects from the object position data determined for each object.
- this floor plan is not generated manually but fully automatically based on the light signals from the electronic shelf labels or dividers installed in the store and with the help of freely movable cameras, which record the light signals over time as they move in the store.
- sequences of scene images are used together to determine the position of objects. These sequences can also exhibit overlaps in the detection areas. This makes it possible to determine positions for one and the same light emitting device in, for example, successive scene images and thus to couple the scene images with one another by superimposing the positions of the light signals in the different scene images. This coupling of the scene images leads to an expansion of the detection range of one camera or several cameras used.
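The coupling of overlapping scene images can be illustrated as follows: if one light signal is identified in both images, its two positions yield the offset with which all signals of the second image can be expressed in the frame of the first. A Python sketch with assumed identifiers and positions (in meters):

```python
# Positions of identified light signals in two overlapping scene images.
image1 = {"4A": (0.10, 0.0), "4B": (0.34, 0.0), "4C": (0.58, 0.0)}
image2 = {"4C": (0.05, 0.0), "4D": (0.29, 0.0)}

# A light signal present in both images provides the coupling offset.
shared = set(image1) & set(image2)
ref = next(iter(shared))
dx = image1[ref][0] - image2[ref][0]
dz = image1[ref][1] - image2[ref][1]

# Express all signals of the second image in the frame of the first,
# effectively expanding the detection range.
merged = dict(image1)
for ident, (x, z) in image2.items():
    merged.setdefault(ident, (x + dx, z + dz))
```

Here signal "4D", seen only in the second image, is located in the first image's frame via the shared signal "4C".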
- Data that has already been generated, which locates light signals and provides their identification information, can be used to interpret newly generated scene images.
- image position data that has already been generated, i.e. the position and identification of light emitting devices that have already been detected, can thus be used to interpret newly generated scene images in which the same light emitting device is shown.
- Light emitting devices that have not yet been recorded can easily be described relative to the position of the known light emitting device.
- Dynamic light signal detection is therefore also possible, in which the imaged light signals move in a temporal sequence of scene images. The positions of new light signals appearing in the sequence of scene images are described using the already known positions of light signals. The coupling therefore allows a dynamic and at the same time safe and reliable capture and description of the scene.
- a local, fixed coordinate system can be defined in each corridor, so that the object position data is described in this coordinate system.
- the dimensions and/or positions etc. described in the individual coordinate systems can then easily be converted into one another if the relationship between the coordinate systems is known. In this way, the object position data can be transferred from the local, fixed coordinate systems into the fixed spatial coordinate system that encompasses or describes the entire business premises.
- the freely movable device can also have a rechargeable battery or a battery for supplying the electronic components with energy.
- the freely movable device can also have a generator for converting kinetic energy into electrical energy.
- the electrical power provided by the generator can be used directly to power the electronic components or be temporarily stored in the rechargeable battery.
- this generator is preferably connected to, or integrated with, at least one wheel of the shopping cart, so that the rotational movement of the wheel drives the generator.
- the camera can also be operated autonomously on the freely movable device.
- the shopping cart can also have its own drive in order to support the customer in a self-driving manner.
- the energy source is charged contact-free, for example at the usual collection points for such shopping carts.
- when using a shopping cart, it can have a screen, in particular a touch screen, and an associated data processing unit, which allow the customer to interact with the shopping cart.
- a digital shopping cart can be displayed on the screen that shows which goods have been placed in the shopping cart. It can also be used to display the customer's shopping list. It can also be used to provide detailed information about products.
- the shopping cart can have a barcode scanner or an NFC communication module, with which the products that are placed in the shopping cart can be scanned or recorded.
- the shopping cart can also have another camera or use the camera previously discussed to capture products that are placed in the shopping cart. The camera records the product that is placed in the shopping cart and creates a product image.
- the products can be identified here, for example, using a barcode or QR code or similar.
- the object position data, which is essentially available in real time, also allows the product to be identified based on its structure in the product image. Using the currently generated object position data, the product image can be compared with a selection of possible products that is reduced compared to the overall range and that, according to the object position data, are in the scene or the environment. The object position data thus allows faster and more reliable processing of the product image and recognition of the products contained therein.
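The reduction of the candidate set by means of the object position data can be sketched as a simple radius filter; the product positions, cart position and radius below are assumed example values:

```python
def nearby_products(object_positions, cart_position, radius_m):
    """Restrict image recognition to products near the cart's position."""
    cx, cy, cz = cart_position
    return {
        product for product, (x, y, z) in object_positions.items()
        if ((x - cx) ** 2 + (y - cy) ** 2 + (z - cz) ** 2) ** 0.5 <= radius_m
    }

# Assumed object position data in the spatial coordinate system (meters):
positions = {"P1": (1.0, 0.5, 1.2), "P2": (1.3, 0.5, 1.2), "P6": (9.0, 4.0, 0.3)}
candidates = nearby_products(positions, (1.0, 1.0, 1.0), 2.0)
```

Only the nearby products remain as candidates for matching against the product image, which shrinks the search space considerably.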
- such a shopping cart may further include a scale designed to weigh products in the shopping cart. The data generated from this can be used, for example, to determine a weight-dependent price.
- the inventory of goods recorded in this way in the shopping cart can also be used for a reliable self-checkout process including payment for all goods, especially those that are weight-dependent.
- an NFC payment terminal can be provided on the shopping cart, with which payment can be processed, for example, using the NFC functionality of a mobile phone.
- a barcode scanner can be provided in the shopping cart, which means that the self-checkout process can be carried out in any case.
- the shopping cart can also have an NFC communication module for detecting goods on the shelves, for example to detect NFC tags attached to objects as the shopping cart drives past.
- the shopping cart can not only be used to determine the position of the items, but can also be used to record the number of items actually present at the respective (shelf) location. This means that the inventory of goods in the store can be continuously monitored via the swarm of shopping carts moving around the store, and the recorded inventory data can be transmitted from the shopping cart to the server via a radio communication module, where it is entered into the floor plan and can also be visualized there.
- the customer can also be provided with location-specific marketing information using the screen, because the server 8 is always informed about the location of the shopping cart with the help of the supplementary data.
- a customer behavior analysis can be carried out in the server 8, because the movement patterns of the shopping cart are available to the server 8 in real time, possibly even with data that represent or indicate the inventory of goods in the shopping cart.
- the shopping cart can also have a visual signaling device (e.g. an LED or the screen) that can be used to convey that the shopping cart is ready to be occupied or that the user of the shopping cart needs assistance from the staff in the store.
- the shopping cart can also have a cost-effective LIDAR system, with the help of which the customer's attention can be drawn to a specific environment, e.g. by displaying information related to the environment on the screen.
- the electronics can be constructed discretely or through integrated electronics (e.g. ASICs, application-specific integrated circuits) or a combination of both.
- Many of the mentioned functionalities of the devices are implemented - possibly in conjunction with hardware components - with the help of software that is executed on an electronics processor.
- Devices designed for radio communication usually have an antenna configuration for sending and receiving radio signals as well as a modulator and/or a demodulator as part of a transceiver module.
- the electronic devices can also have an internal electrical power supply, which can be implemented, for example, with a replaceable or rechargeable battery.
- the devices can also be powered by wires, either through an external power supply or via “Power over LAN”.
- a radio-based power supply via “Power over WiFi” can also be provided.
- Fig. 3 shows an application of the system in a shelf aisle in a business premises.
- Figure 1 shows a system 1 for carrying out a method for determining the position of products P1 to P6 forming objects in a sales premises, hereinafter referred to as room 2 for short.
- a stationary, orthogonal, right-handed spatial coordinate system 3 with its coordinate axes XR, YR and ZR is drawn in space 2, so that the three-dimensional positions of the products P1 to P6 can be specified in this spatial coordinate system 3.
- a digital representation of these three-dimensional positions is referred to below as object position data for the respective product P1 - P6.
- each shelf label 4A - 4F also has a light-emitting diode 6 located on the front adjacent to the screen 5, which is intended to emit a light signal; when the light signal is emitted, it is coded according to individual identification information, and the identification information serves for the unique identification of the respective product P1 - P6.
- a binary code that forms the identification information is transmitted, the current through the light-emitting diode 6 being either switched on or off depending on the respective bit value. This transmission can also be embedded in defined start and stop bits etc.
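A possible decoding of such a light signal on the receiving side can be sketched as follows. The sketch assumes one brightness sample per still image at the bit rate, a single start bit, and a fixed 8-bit payload; none of these framing choices are mandated by the description:

```python
def decode_light_signal(frames, start_bit=1, bits=8):
    """Recover identification bits from per-frame LED brightness.

    frames: one brightness sample (0 = dark, 1 = bright) per still
    image of the series; a single start bit precedes the payload.
    """
    # Find the start bit, then read the fixed-length payload after it.
    start = frames.index(start_bit)
    payload = frames[start + 1 : start + 1 + bits]
    value = 0
    for bit in payload:
        value = (value << 1) | bit
    return value

# Idle (dark) frames, a start bit, then the assumed 8-bit
# identification 0b01000001 = 65:
frames = [0, 0, 1, 0, 1, 0, 0, 0, 0, 0, 1]
ident = decode_light_signal(frames)
```

In the system described, the decoded value would identify the shelf label, and the server 8 would map it to the assigned product.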
- the system 1 also has a server 8, which stores the logical connection between each product P1 - P6 and the shelf label 4A - 4F assigned to it.
- the shelf labels 4A - 4F receive their respective product and/or price information to be displayed from the server 8, which is transmitted to the respective shelf label 4A - 4F using a radio shelf label access point 9. Since the server 8 knows the identity of each shelf label 4A - 4F and the product P1 - P6 assigned to the respective shelf label 4A - 4F, it is basically sufficient for the light signal to send out only the identity of the respective shelf label 4A - 4F as identification information in order for the server 8 to be able to clearly identify the respective affected product P1 - P6 based on the identification information.
- the system 1 also has a camera 10 which is freely movable in the room 2, so that it can basically be moved into all areas of the room 2 in order to capture the existing scene there using a digital scene image.
- the scene captured using the camera 10 can be described in a three-dimensional camera coordinate system 11 with the coordinates XK, YK and ZK, which has its origin at the CCD sensor of the camera 10; the plane spanned by the coordinates XK and ZK runs in the image capture plane of the CCD sensor and has a granularity according to the pixels of the CCD sensor.
- each pixel can be specified by a two-dimensional pixel coordinate system with the axes XKP and ZKP, where the axis XKP runs and is oriented according to the axis XK and the axis ZKP according to the axis ZK.
- the pixels of the digital scene image i.e. the result of the digitization of the image of the scene carried out with the help of the CCD sensor, can also be addressed according to the two-dimensional pixel coordinate system.
- the third coordinate YK of the camera coordinate system 11 points through an optics (or a lens - not shown) of the camera 10 in the direction of the scene to be captured.
- the lens essentially defines a detection area E of the camera. If a digital zoom is present, this detection range can of course also depend on the digital zoom setting.
- the spatial orientation and location of the camera coordinate system 11 in space 2, i.e. in the spatial coordinate system 3, is therefore dependent on the respective position as well as the orientation of the camera 10 in space 2.
- the base area (the floor) of room 2 coincides with the plane which is defined using the coordinate axes XR and YR.
- the camera is also aligned parallel to the base area with regard to its inclination, which means that the area spanned by the coordinate axes XK and YK of the camera coordinate system 11 runs parallel to the base area. It is therefore aligned without any inclination. If the inclination were to be taken into account, a sensor in the camera 10 would be used to generate inclination data representing the determined inclination.
- a reference orientation in space 2 is given by the direction of the coordinate axis XR of the spatial coordinate system 3.
- the current orientation of the camera 10 is therefore specified in relation to this direction. This can be done with the help of an electronic compass of the camera 10, which is set or programmed to this direction as a reference orientation.
- An electronic magnetic compass can also be used, with the help of which the orientation determined with respect to the north direction can be converted in relation to the reference orientation. Regardless of the type of determination of the current orientation, the orientation data obtained in this way is provided by the camera 10 for further processing.
- the system 1 also has a radio camera access point 12, which is connected to the server 8 and with the help of which the orientation data is transmitted from the camera 10 to the server 8.
- the camera 10 has a corresponding camera data radio communication stage 12A (not shown in detail), of which only a first antenna configuration is shown.
- the camera data radio communication stage 12A is used to transmit the orientation data, possibly also to transmit the inclination data if these need to be taken into account, and also other image processing data that is the result of image processing carried out using a computer of the camera 10, hereinafter referred to as the camera computer (not shown).
- the position of the camera 10 in space 2 is determined with the help of UWB radio and this position is assigned to the origin of the camera coordinate system 11 to a good approximation.
- a UWB radio system 13 is provided, the position of which is clearly defined in room 2, and which is coupled to the server 8 in order to transmit to the server 8 the camera position data determined by UWB radio, which indicates the position of the camera 10 in room 2 (specifically in relation to the position of the UWB radio system 13, from which in turn it can be converted into the spatial coordinate system 3).
- the origin of the spatial coordinate system 3 can also be located at the location of the UWB radio system in order to avoid the conversion discussed.
- the camera 10 has a corresponding camera UWB radio communication stage 13A (not shown in detail), of which only a second antenna configuration is shown.
- the still image series which is represented by image data, also contains, in particular, the coded light signal of each shelf label 4A - 4F at the respective pixel coordinates XKP and ZKP.
- Figure 2 now visualizes the image of the scene on the level of the pixel matrix of the CCD sensor of the camera 10 and shows the brightness present at the imaging location for the respective light signal at five different times t1 to t5 with the “circle” and “star” symbols.
- a “circle” symbolizes darkness, i.e. that the light-emitting diode 6 emitting the respective coded light signal is currently switched off.
- a "star” symbolizes brightness, i.e. that the light-emitting diode 6 emitting the respective coded light signal is currently switched on.
- a scale is determined which results from knowledge of the real dimensions (e.g. edge length of the front border) of the shelf labels 4A - 4F present in the digital image.
- the positions of the respective shelf labels 4A - 4F can be easily determined with computer assistance in the digital scene image because the light signal is located within the visible front or front border of the respective shelf label 4A - 4F.
- the camera's computer also knows the real dimensions (i.e. the actual length of the sections of the front border) of these shelf labels 4A - 4F used as reference objects.
- the respective front border is determined in the digital image by pattern recognition and the number of associated pixels is determined along this (along the longer and/or shorter side).
- the scale for converting counted pixels (i.e. distances or lengths in the pixel system of the digital scene image) into real units of length (e.g. millimeters) is obtained in this way.
- This allows the relative distances between the light signals in the real scene to be determined, i.e. specified in the XK-ZK plane of the camera coordinate system 11. This basically means that the positions of the shelf labels 4A - 4F, and subsequently the positions of the associated products P1 - P6, are defined in this XK-ZK plane.
- the distance from the camera 10 to the scene is now determined in order to supplement the two-dimensional location in the camera coordinate system 11 with the third dimension.
- Since the physical parameters of both the CCD sensor and the lens are known (i.e. the length of an object imaged on the CCD sensor can be determined in e.g. millimeters on the CCD sensor), and the dimensions of the respective reference object in the real scene are also known, this distance between the camera and the real scene can easily be calculated using the lens equation with the help of the computer of the camera 10, which stores the necessary information, so that the position of the light signals in the camera coordinate system 11 along the coordinates XK, YK and ZK can be specified in metric dimensions (e.g. millimeters).
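The distance calculation via the lens equation can be sketched as follows. Assuming the thin-lens equation 1/f = 1/d_o + 1/d_i and magnification m = image size / object size, the object distance is d_o = f * (1 + 1/m); the focal length, pixel pitch and label size below are assumed example values, not parameters from the description:

```python
def object_distance_mm(focal_length_mm, real_size_mm, image_size_mm):
    """Camera-to-object distance from the thin-lens equation.

    With magnification m = image_size / real_size and
    1/f = 1/d_o + 1/d_i (where d_i = m * d_o), it follows that
    d_o = f * (1 + 1/m).
    """
    m = image_size_mm / real_size_mm
    return focal_length_mm * (1.0 + 1.0 / m)

# Assumed example: a 60 mm shelf label border imaged as 120 pixels at
# a pixel pitch of 0.005 mm (image size 0.6 mm) through a 25 mm lens:
distance = object_distance_mm(25.0, 60.0, 120 * 0.005)
```

With these numbers the object distance comes out at about 2525 mm, i.e. roughly 2.5 m, which is plausible for a shelf aisle.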
- the object position data determined in the camera coordinate system 11 (still available here in the coordinates XK, YK and ZK) for each light signal (in the broader sense, of course, also for each shelf label 4A - 4F) is transmitted together with the respective identification information from the camera 10 to the server 8.
- the server 8 also has the camera position data that was determined for the location S1 at the time the image series was recorded using UWB radio communication.
- the orientation data and the camera position data together form supplementary data, with the help of which the position information of the light signals valid for the camera coordinate system 11 is transferred at the server 8 into the spatial coordinate system 3 by conversion (a coordinate transformation taking into account the fact that the position of the UWB radio system 13 does not coincide with the origin of the spatial coordinate system 3).
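The conversion from the camera coordinate system into the spatial coordinate system can be sketched, for the simplified case of zero inclination, as a rotation about the vertical axis (from the orientation data) plus a translation (from the camera position data). The axis conventions and all numbers are illustrative assumptions:

```python
import math

def camera_to_room(point_k, camera_pos_r, yaw_deg):
    """Transform a point from camera coordinates into room coordinates.

    Assumes zero inclination, so only a rotation about the vertical
    axis (orientation data) and a translation (camera position data)
    are needed; the axis conventions here are illustrative.
    """
    xk, yk, zk = point_k
    a = math.radians(yaw_deg)
    xr = camera_pos_r[0] + xk * math.cos(a) - yk * math.sin(a)
    yr = camera_pos_r[1] + xk * math.sin(a) + yk * math.cos(a)
    zr = camera_pos_r[2] + zk  # the height coordinate is unchanged
    return xr, yr, zr

# Assumed example: camera at (2, 3, 1) m, rotated 90 degrees about the
# vertical axis; a light signal at (1, 0, 0.5) in camera coordinates:
p = camera_to_room((1.0, 0.0, 0.5), (2.0, 3.0, 1.0), 90.0)
```

If inclination data were to be taken into account, the rotation would become a full three-dimensional rotation matrix instead.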
- the object position data then define the respective position of the light signals in the coordinates XR, YR and ZR.
- the camera 10 can basically be freely movable. Depending on the implementation, it can therefore be mounted, for example, on glasses or on a shopping cart 14, as shown in FIG. 3.
- FIG. 3 shows a shelf aisle that is flanked on both sides by shelves 7 and in which the shopping cart 14, which carries the camera 10 on a towering rod 15, moves along the path S.
- the first six shelf labels 4A - 4F and the first six products P1 - P6 have been provided with reference numbers, with the products P1 - P6 only being indicated by dashed frames that show the position of the products P1 - P6 on the shelf 7.
- the camera 10 is oriented towards the left side of the carriage 14, i.e. the left shelf 7 is in the detection area E of the camera 10, the length of the shelf 7 exceeding the width of the detection area E.
- the camera 10 therefore captures the respective scene located in its detection area E several times, for example a first time at a position S1 and a second time at a position S2.
- the series of still images generated at each location S1 or S2 is processed by the computer of the camera 10 analogously to the previous discussion and subsequently transmitted to the server 8, which uses the conversion to locate the light signals, i.e. ultimately the products, in the spatial coordinate system. If, as can be seen in the present example, the image series captured from adjacent positions S1 and S2 have an overlap of shelf areas, this can be used to improve the results of determining the object position data, because these are generated repeatedly for one and the same light signal in the overlap area.
- the same also applies to multiple recordings at completely different times with one and the same camera 10 or with different cameras 10 that travel with different shopping carts 14.
- the large number of moving shopping carts 14 also ensures in larger stores that the entire store is mapped with the help of the customers, i.e. that over time the object position data for the entirety of the products is recorded.
- In order to capture the right shelf 7 shown in FIG. 3, the shopping cart 14 must be turned around and moved back along the aisle, or it must have two cameras 10, of which a first camera 10 is oriented to the left and a second camera 10 is oriented to the right.
- the entirety of the individual object position data gradually available at the server 8 in the spatial coordinate system 3 for the respective products P1, P2, etc. is ultimately used at the server 8 to create a digital three-dimensional map of the product positions, which in technical jargon is referred to as a floor plan.
- the construction or an update of this three-dimensional map of the products P1, P2, etc. can take place after a certain recording period has elapsed or successively, i.e. whenever new object position data is available.
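A successive update of the floor plan can be sketched as averaging the repeatedly generated object position data per identification information. This is a minimal sketch; a real implementation would likely weight measurements or filter outliers:

```python
class FloorPlan:
    """Average repeated position measurements per identification info."""

    def __init__(self):
        self._sums = {}  # ident -> ([sum_x, sum_y, sum_z], count)

    def update(self, ident, position):
        """Fold a new (x, y, z) measurement into the running average."""
        sums, n = self._sums.get(ident, ([0.0, 0.0, 0.0], 0))
        for i in range(3):
            sums[i] += position[i]
        self._sums[ident] = (sums, n + 1)

    def position(self, ident):
        """Current averaged position for the given identification."""
        sums, n = self._sums[ident]
        return tuple(s / n for s in sums)

plan = FloorPlan()
plan.update("4A", (1.0, 2.0, 1.5))  # e.g. a measurement from location S1
plan.update("4A", (1.2, 2.0, 1.5))  # repeated measurement from S2
```

Repeated measurements from the overlap area, or from different carts at different times, thus successively refine the stored product position.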
- the freely movable device (i.e. the glasses or the shopping cart) can have a computer that receives raw data from the camera 10 and carries out the processing of this raw data described in connection with the camera 10.
- the freely movable device can also have the camera data radio communication stage 12A and the camera UWB radio communication stage 13A.
- a mobile phone or a tablet computer can also be used as a freely movable device that carries the camera.
- Modern devices of this type are usually already equipped with an excellent camera system including a sensor (often implemented as a time-of-flight sensor) for recording the distance between the camera and an object captured by the camera. They also have powerful computers that can easily implement the computing tasks discussed above (especially in real time) with appropriate programming.
- Such devices use a so-called app, i.e. software that provides the functionalities discussed in the context of this invention as soon as it is executed on the device's computer.
- the position of these devices in the business premises or warehouse can also be easily determined using the radio device integrated in the respective device (such as WLAN or BlueTooth ®), for example using triangulation or using UWB radio, as long as this functionality is also implemented in the respective device.
- If such devices are not equipped from the factory with an integrated UWB radio device, they can be retrofitted with UWB radio devices that can be plugged into the device's USB port (e.g. designed as a UWB radio dongle or USB-capable UWB radio device).
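A minimal sketch of how a 2D device position could be derived from UWB distance measurements to three fixed anchors, using standard trilateration (the anchor layout and distances are invented for this example, not taken from the patent):

```python
def trilaterate(anchors, distances):
    """Estimate a 2D position from measured distances to three fixed UWB
    anchors by linearizing the circle equations (subtracting the first
    equation from the other two yields a 2x2 linear system)."""
    (x1, y1), (x2, y2), (x3, y3) = anchors
    r1, r2, r3 = distances
    a11, a12 = 2.0 * (x2 - x1), 2.0 * (y2 - y1)
    a21, a22 = 2.0 * (x3 - x1), 2.0 * (y3 - y1)
    b1 = r1**2 - r2**2 + x2**2 - x1**2 + y2**2 - y1**2
    b2 = r1**2 - r3**2 + x3**2 - x1**2 + y3**2 - y1**2
    det = a11 * a22 - a12 * a21  # zero if the anchors are collinear
    return ((b1 * a22 - b2 * a12) / det, (a11 * b2 - a21 * b1) / det)

# device at (1, 1) m, anchors at three corners of a 4 m square (illustrative)
pos = trilaterate([(0.0, 0.0), (4.0, 0.0), (0.0, 4.0)],
                  [2.0**0.5, 10.0**0.5, 10.0**0.5])
```

In practice the measured ranges are noisy, so more than three anchors and a least-squares solution would typically be used.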
- such devices are used primarily by store or warehouse staff, who hold them in the direction of the shelves, carry them past or along the shelves, and thereby capture the respective scene.
- the respective device can automatically check whether the digital scene image is suitable for further processing or whether the scene should be captured again.
- this can be communicated to the staff, for example acoustically with the help of the audio module integrated in the device, e.g. by means of instructions "spoken" by the device or in the form of a dialogue, in such a way that the staff are given unmistakable instructions for action, such as to hold the device higher or lower, or to change its inclination or orientation in a given direction, etc.
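Such a suitability check could, for example, reject under-exposed or blurry captures before triggering a re-capture prompt. A crude stand-alone sketch; the measures and thresholds are illustrative, not taken from the patent:

```python
def image_ok(pixels, min_contrast=10.0, min_sharpness=5.0):
    """Very simple suitability check on a grayscale image given as a list
    of pixel rows: rejects flat (under-/over-exposed) or blurry captures.
    Thresholds are illustrative placeholders."""
    flat = [p for row in pixels for p in row]
    mean = sum(flat) / len(flat)
    # standard deviation of pixel values as a crude contrast measure
    contrast = (sum((p - mean) ** 2 for p in flat) / len(flat)) ** 0.5
    # mean absolute horizontal gradient as a crude sharpness measure
    grads = [abs(row[i + 1] - row[i])
             for row in pixels for i in range(len(row) - 1)]
    sharpness = sum(grads) / len(grads)
    return contrast >= min_contrast and sharpness >= min_sharpness

# a high-contrast test pattern passes; a uniform gray frame would not
ok = image_ok([[0, 255] * 4 for _ in range(8)])
```

A production system would more likely use an established sharpness metric (e.g. variance of a Laplacian filter response) from an image-processing library.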
Landscapes
- Engineering & Computer Science (AREA)
- Physics & Mathematics (AREA)
- General Physics & Mathematics (AREA)
- Multimedia (AREA)
- Theoretical Computer Science (AREA)
- Signal Processing (AREA)
- Business, Economics & Management (AREA)
- Computer Vision & Pattern Recognition (AREA)
- Remote Sensing (AREA)
- Radar, Positioning & Navigation (AREA)
- Electromagnetism (AREA)
- Finance (AREA)
- Accounting & Taxation (AREA)
- General Business, Economics & Management (AREA)
- Strategic Management (AREA)
- Marketing (AREA)
- Optics & Photonics (AREA)
- Economics (AREA)
- Computer Networks & Wireless Communication (AREA)
- Development Economics (AREA)
- Length Measuring Devices By Optical Means (AREA)
Applications Claiming Priority (1)
| Application Number | Priority Date | Filing Date | Title |
|---|---|---|---|
| PCT/EP2022/059508 WO2023193932A1 (de) | 2022-04-08 | 2022-04-08 | Positionsbestimmung-verfahren bzw. -system für eine positionsbestimmung von gegenständen |
Publications (1)
| Publication Number | Publication Date |
|---|---|
| EP4505210A1 (de) | 2025-02-12 |
Family
ID=81595654
Family Applications (1)
| Application Number | Title | Priority Date | Filing Date |
|---|---|---|---|
| EP22722170.2A Pending EP4505210A1 (de) | 2022-04-08 | 2022-04-08 | Positionsbestimmung-verfahren bzw. -system für eine positionsbestimmung von gegenständen |
Country Status (6)
| Country | Link |
|---|---|
| US (1) | US20250218190A1 (de) |
| EP (1) | EP4505210A1 (de) |
| KR (1) | KR20250005139A (de) |
| CN (1) | CN119110907A (de) |
| AU (1) | AU2022452474A1 (de) |
| WO (1) | WO2023193932A1 (de) |
Families Citing this family (4)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| CN114615747B (zh) * | 2022-05-09 | 2022-09-02 | 汉朔科技股份有限公司 | 基站频率的动态分配方法、价签系统和计算机设备 |
| US12597253B2 (en) * | 2023-10-18 | 2026-04-07 | Qualcomm Incorporated | Survey-based location of electronic shelf label (ESL) devices |
| US20250175762A1 (en) * | 2023-11-28 | 2025-05-29 | Qualcomm Incorporated | Iterative sub-selection of electronic shelf label (esl) devices for positioning |
| DE102024205174B3 (de) * | 2024-06-05 | 2025-10-02 | Continental Automotive Technologies GmbH | Verfahren zur lokalisation von mit etiketten versehenen aufbewahrungsmitteln, vorrichtung und fahrzeug |
Family Cites Families (4)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| WO2017055119A1 (en) * | 2015-10-02 | 2017-04-06 | Philips Lighting Holding B.V. | Camera based location commissioning of electronic shelf labels |
| FR3072492B1 (fr) * | 2017-10-13 | 2019-11-08 | Ses-Imagotag | Procede pour initialiser ou mettre a jour une base de donnees de realogramme pour un lineaire, en utilisant des signaux optiques emis par des etiquettes electroniques de gondole |
| CN110351678B (zh) | 2018-04-03 | 2021-08-20 | 汉朔科技股份有限公司 | 商品定位方法及装置、设备和存储介质 |
| US20220051310A1 (en) * | 2020-08-17 | 2022-02-17 | Qualcomm Incorporated | Methods Using Electronic Shelf Labels To Improve Item Gathering In Store And Warehouse Systems |
2022
- 2022-04-08 US US18/847,490 patent/US20250218190A1/en active Pending
- 2022-04-08 KR KR1020247034695A patent/KR20250005139A/ko active Pending
- 2022-04-08 EP EP22722170.2A patent/EP4505210A1/de active Pending
- 2022-04-08 CN CN202280094614.3A patent/CN119110907A/zh active Pending
- 2022-04-08 AU AU2022452474A patent/AU2022452474A1/en active Pending
- 2022-04-08 WO PCT/EP2022/059508 patent/WO2023193932A1/de not_active Ceased
Also Published As
| Publication number | Publication date |
|---|---|
| KR20250005139A (ko) | 2025-01-09 |
| AU2022452474A1 (en) | 2024-10-10 |
| US20250218190A1 (en) | 2025-07-03 |
| CN119110907A (zh) | 2024-12-10 |
| WO2023193932A1 (de) | 2023-10-12 |
Similar Documents
| Publication | Publication Date | Title |
|---|---|---|
| EP4505210A1 (de) | | Positionsbestimmung-verfahren bzw. -system für eine positionsbestimmung von gegenständen |
| EP3529675B1 (de) | | Innenraum-personenortung-basierte fertigungssteuerung in der metallverarbeitenden industrie |
| DE112019001788B4 (de) | | Verfahren, system und vorrichtung zum korrigieren von transluzenzartefakten in daten, die eine trägerstruktur darstellen |
| EP2526378B1 (de) | | Verfahren und system zum erfassen der position eines fahrzeuges |
| DE102018120510A1 (de) | | Systeme und verfahren zur verkaufsstellenerkennung mit bildsensoren zur identifizierung neuer radiofrequenz-identifikations (rfid)-etiketten-ereignisse in der nähe eines rfid-lesegeräts |
| DE112018002314T5 (de) | | Verfahren und vorrichtung zur erkennung eines objektstatus |
| EP3682303A1 (de) | | Verfahren zur fertigungssteuerung von fertigungsprozessen in der metallverarbeitenden industrie mittels bildaufnahmevorrichtungen |
| DE112018002848T5 (de) | | RFID-gesteuerte Video-Momentaufnahmen, die Ziele von Interesse erfassen |
| EP3274735B1 (de) | | Tracking-system und verfahren zum tracken eines trägers einer mobilen kommunikationseinheit |
| DE102017120378A1 (de) | | Innenraum-ortung-basierte steuerung von fertigungsprozessen in der metallverarbeitenden industrie |
| DE112019001745T5 (de) | | Verfahren und vorrichtung zum etikettieren von tragstrukturen |
| DE102019125987A1 (de) | | Verfahren, system und vorrichtung zur navigationsunterstützung |
| DE102020209054A1 (de) | | Vorrichtung und verfahren zur personenerkennung, -verfolgung und -identifizierung unter verwendung drahtloser signale und bilder |
| DE102020126353A1 (de) | | Verfahren und systeme zur erkennung von in warenträgern zurückgelassenen artikeln |
| WO2023006171A1 (de) | | Verfahren zur bestimmung einer abmessung eines produkts in einer produktpräsentationsvorrichtung |
| DE102019112781A1 (de) | | Verfahren zum Koppeln von Koordinatensystemen und computergestütztes System |
| DE112018005119B4 (de) | | Systeme und verfahren zum steuern eines oder mehrerer produktlesegeräte und zum bestimmen von produkteigenschaften |
| DE202023107560U1 (de) | | System zur Erstellung einer digitalen Abbildung eines Verkaufsbereiches |
| EP3711392B1 (de) | | Verfahren und vorrichtung zur positionsbestimmung |
| WO2023110099A1 (de) | | Verfahren zum bestimmen der position von einrichtungselementen, insbesondere von elektronischen etiketten |
| WO2019109242A1 (en) | | Systems, apparatus, and methods for identifying and tracking object based on light coding |
| DE102005033544B4 (de) | | Verfahren zum Erfassen der Position von Objekten |
| EP4335122A1 (de) | | Verfahren und system zur ortsbestimmung von regalschienen-equipment |
| DE202021004393U1 (de) | | System zur Ortsbestimmung von Regalschienen-Equipment |
| EP4689556A1 (de) | | Verfahren zur erstellung einer navigationsgrundlage für die navigation in einem verkaufsbereiches |
Legal Events
| Date | Code | Title | Description |
|---|---|---|---|
| | STAA | Information on the status of an ep patent application or granted ep patent | Free format text: STATUS: UNKNOWN |
| | STAA | Information on the status of an ep patent application or granted ep patent | Free format text: STATUS: THE INTERNATIONAL PUBLICATION HAS BEEN MADE |
| | PUAI | Public reference made under article 153(3) epc to a published international application that has entered the european phase | Free format text: ORIGINAL CODE: 0009012 |
| | STAA | Information on the status of an ep patent application or granted ep patent | Free format text: STATUS: REQUEST FOR EXAMINATION WAS MADE |
| | 17P | Request for examination filed | Effective date: 20241018 |
| | AK | Designated contracting states | Kind code of ref document: A1; Designated state(s): AL AT BE BG CH CY CZ DE DK EE ES FI FR GB GR HR HU IE IS IT LI LT LU LV MC MK MT NL NO PL PT RO RS SE SI SK SM TR |
| | RAP3 | Party data changed (applicant data changed or rights of an application transferred) | Owner name: VUSIONGROUP GMBH; Owner name: VUSIONGROUP DEUTSCHLAND GMBH |
| | DAV | Request for validation of the european patent (deleted) | |
| | DAX | Request for extension of the european patent (deleted) | |