WO2019214641A1 - Optical tag-based information device interaction method and system - Google Patents
- Publication number
- WO2019214641A1 (PCT/CN2019/085997)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- information
- terminal device
- optical
- optical tag
- information device
- Prior art date
Classifications
-
- G—PHYSICS
- G05—CONTROLLING; REGULATING
- G05B—CONTROL OR REGULATING SYSTEMS IN GENERAL; FUNCTIONAL ELEMENTS OF SUCH SYSTEMS; MONITORING OR TESTING ARRANGEMENTS FOR SUCH SYSTEMS OR ELEMENTS
- G05B19/00—Programme-control systems
- G05B19/02—Programme-control systems electric
- G05B19/04—Programme control other than numerical control, i.e. in sequence controllers or logic controllers
- G05B19/042—Programme control other than numerical control, i.e. in sequence controllers or logic controllers using digital processors
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06K—GRAPHICAL DATA READING; PRESENTATION OF DATA; RECORD CARRIERS; HANDLING RECORD CARRIERS
- G06K19/00—Record carriers for use with machines and with at least a part designed to carry digital markings
- G06K19/06—Record carriers for use with machines and with at least a part designed to carry digital markings characterised by the kind of the digital marking, e.g. shape, nature, code
- G06K19/067—Record carriers with conductive marks, printed circuits or semiconductor circuit elements, e.g. credit or identity cards also with resonating or responding marks without active components
- G06K19/07—Record carriers with conductive marks, printed circuits or semiconductor circuit elements, e.g. credit or identity cards also with resonating or responding marks without active components with integrated circuit chips
- G06K19/0723—Record carriers with conductive marks, printed circuits or semiconductor circuit elements, e.g. credit or identity cards also with resonating or responding marks without active components with integrated circuit chips the record carrier comprising an arrangement for non-contact communication, e.g. wireless communication circuits on transponder cards, non-contact smart cards or RFIDs
- G06K19/0728—Record carriers with conductive marks, printed circuits or semiconductor circuit elements, e.g. credit or identity cards also with resonating or responding marks without active components with integrated circuit chips the record carrier comprising an arrangement for non-contact communication, e.g. wireless communication circuits on transponder cards, non-contact smart cards or RFIDs the arrangement being an optical or sound-based communication interface
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/048—Interaction techniques based on graphical user interfaces [GUI]
- G06F3/0481—Interaction techniques based on graphical user interfaces [GUI] based on specific properties of the displayed interaction object or a metaphor-based environment, e.g. interaction with desktop elements like windows or icons, or assisted by a cursor's changing behaviour or appearance
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/011—Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/017—Gesture based interaction, e.g. based on a set of recognized hand gestures
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/03—Arrangements for converting the position or the displacement of a member into a coded form
- G06F3/0304—Detection arrangements using opto-electronic means
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/03—Arrangements for converting the position or the displacement of a member into a coded form
- G06F3/0304—Detection arrangements using opto-electronic means
- G06F3/0308—Detection arrangements using opto-electronic means comprising a plurality of distinctive and separately oriented light emitters or reflectors associated to the pointing device, e.g. remote cursor controller with distinct and separately oriented LEDs at the tip whose radiations are captured by a photo-detector associated to the screen
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/03—Arrangements for converting the position or the displacement of a member into a coded form
- G06F3/033—Pointing devices displaced or positioned by the user, e.g. mice, trackballs, pens or joysticks; Accessories therefor
- G06F3/038—Control and interface arrangements therefor, e.g. drivers or device-embedded control circuitry
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/048—Interaction techniques based on graphical user interfaces [GUI]
- G06F3/0484—Interaction techniques based on graphical user interfaces [GUI] for the control of specific functions or operations, e.g. selecting or manipulating an object, an image or a displayed text element, setting a parameter value or selecting a range
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06K—GRAPHICAL DATA READING; PRESENTATION OF DATA; RECORD CARRIERS; HANDLING RECORD CARRIERS
- G06K17/00—Methods or arrangements for effecting co-operative working between equipments covered by two or more of main groups G06K1/00 - G06K15/00, e.g. automatic card files incorporating conveying and reading operations
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06K—GRAPHICAL DATA READING; PRESENTATION OF DATA; RECORD CARRIERS; HANDLING RECORD CARRIERS
- G06K7/00—Methods or arrangements for sensing record carriers, e.g. for reading patterns
- G06K7/10—Methods or arrangements for sensing record carriers, e.g. for reading patterns by electromagnetic radiation, e.g. optical sensing; by corpuscular radiation
- G06K7/10544—Methods or arrangements for sensing record carriers, e.g. for reading patterns by electromagnetic radiation, e.g. optical sensing; by corpuscular radiation by scanning of the records by radiation in the optical part of the electromagnetic spectrum
- G06K7/10821—Methods or arrangements for sensing record carriers, e.g. for reading patterns by electromagnetic radiation, e.g. optical sensing; by corpuscular radiation by scanning of the records by radiation in the optical part of the electromagnetic spectrum further details of bar or optical code scanning devices
- G06K7/10861—Methods or arrangements for sensing record carriers, e.g. for reading patterns by electromagnetic radiation, e.g. optical sensing; by corpuscular radiation by scanning of the records by radiation in the optical part of the electromagnetic spectrum further details of bar or optical code scanning devices sensing of data fields affixed to objects or articles, e.g. coded labels
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T7/00—Image analysis
- G06T7/70—Determining position or orientation of objects or cameras
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04B—TRANSMISSION
- H04B10/00—Transmission systems employing electromagnetic waves other than radio-waves, e.g. infrared, visible or ultraviolet light, or employing corpuscular radiation, e.g. quantum communication
- H04B10/11—Arrangements specific to free-space transmission, i.e. transmission through air or vacuum
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04B—TRANSMISSION
- H04B10/00—Transmission systems employing electromagnetic waves other than radio-waves, e.g. infrared, visible or ultraviolet light, or employing corpuscular radiation, e.g. quantum communication
- H04B10/11—Arrangements specific to free-space transmission, i.e. transmission through air or vacuum
- H04B10/114—Indoor or close-range type systems
- H04B10/1141—One-way transmission
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N23/00—Cameras or camera modules comprising electronic image sensors; Control thereof
- H04N23/50—Constructional details
- H04N23/54—Mounting of pick-up tubes, electronic image sensors, deviation or focusing coils
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N5/00—Details of television systems
- H04N5/222—Studio circuitry; Studio devices; Studio equipment
- H04N5/262—Studio circuits, e.g. for mixing, switching-over, change of character of image, other special effects ; Cameras specially adapted for the electronic generation of special effects
- H04N5/2628—Alteration of picture size, shape, position or orientation, e.g. zooming, rotation, rolling, perspective, translation
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04W—WIRELESS COMMUNICATION NETWORKS
- H04W4/00—Services specially adapted for wireless communication networks; Facilities therefor
- H04W4/02—Services making use of location information
- H04W4/023—Services making use of location information using mutual or relative location information between multiple location based services [LBS] targets or of distance thresholds
Definitions
- the present invention relates to the field of optical information technology and location services, and more particularly to a method and system for device interaction using optical tags.
- optical tags are also referred to as optical communication devices.
- the present invention provides a new optical tag-based information device interaction method and system, so that the user can manipulate devices in the field of view anytime and anywhere, achieving what-you-see-is-what-you-get (WYSIWYG) device interaction.
- the present invention provides a method for optical tag-based information device interaction, the method comprising:
- the method may further include: adjusting an imaging position of each information device on a display screen of the terminal device in response to a change in a position and/or a posture of the terminal device.
- the method may further include:
- the identified operation is converted into a corresponding operational command and sent to the information device over the network.
- the user's operation on the interactive interface of the information device may be at least one of the following: screen input, keyboard input, voice input, or gesture input.
- the present invention also provides a system for optical tag-based information device interaction, the system comprising: one or more information devices; optical tags at relatively fixed positions with respect to the information devices; a server storing information about the information devices and the optical tags; and a terminal device with an imaging device;
- the terminal device is configured to:
- the interactive interfaces of the respective information devices are presented at their respective imaging positions on the display screen for interaction with those information devices.
- the terminal device may be further configured to adjust an imaging position of each information device on a display screen of the terminal device in response to a change in position and/or posture of the terminal device.
- the terminal device may also be configured to:
- the identified operation is converted into a corresponding operational command and sent to the information device over the network.
- the user's operation on the interactive interface of the information device may be at least one of the following: screen input, keyboard input, voice input, or gesture input.
- the invention further relates to a computing device comprising a processor and a memory in which is stored a computer program, which, when executed by the processor, can be used to implement the above method.
- the invention further relates to a storage medium in which is stored a computer program which, when executed, can be used to implement the above method.
- FIG. 1 is a schematic diagram of the basic principle of a triangulation method;
- FIG. 2 is a schematic diagram showing the principle of an imaging process of an imaging device when collecting an optical tag
- FIG. 3 is a schematic diagram showing a simplified relationship between an object coordinate system and an image coordinate system
- FIG. 4 is a schematic flowchart diagram of an optical tag-based information device interaction method according to an embodiment of the present invention.
- Barcodes and QR codes have been widely adopted to encode information. When these barcodes and QR codes are scanned with a specific device or software, the corresponding information is identified.
- the recognition distance of barcodes and QR codes is very limited. For example, when scanning a QR code with a mobile phone camera, the phone must typically be placed at a relatively short distance, usually about 15 times the width of the QR code. For long-distance recognition (for example, at 200 times the width of the code), ordinary barcodes and QR codes cannot be used; very large custom codes would be required, which increases cost and in many cases is impossible due to various other restrictions.
- the optical tag transmits information by emitting different light. It has the advantages of long recognition distance, loose requirements on visible light conditions, strong directivity, and positionability, and the information it transmits can change rapidly over time, providing a larger information capacity (see, for example, the optical communication devices described in Chinese Patent Publications No. CN104168060A and CN105740936A). Compared with the traditional QR code, the optical tag therefore has a stronger information interaction capability, which can provide great convenience to users and businesses.
- the optical tag may be any optical communication device capable of transmitting different information by emitting different light.
- the optical tag can include at least one light source and a controller for controlling different light emitted by the light source to convey different information.
- the controller can cause the light source to emit different light by changing the properties of the light emitted by the light source.
- the property of the light may be any property that an optical imaging device (e.g., a CMOS imaging device) can perceive; for example, it may be a property perceptible to the human eye, such as the intensity, color, or wavelength of visible light, or a property imperceptible to the human eye, such as the intensity, color, or wavelength of electromagnetic radiation outside the visible range, or any combination of the above properties.
- a change in the properties of the light may accordingly be a change in a single property, or a change in a combination of two or more properties.
- when the intensity of the light is selected as the property, the change can be achieved simply by turning the light source on or off.
- hereinafter, the properties of the light are changed by turning the light source on or off for simplicity, but those skilled in the art will appreciate that other ways of changing the properties of the light are also possible.
- any light source can be used in the optical tag as long as one of its properties perceptible to an optical imaging device can be varied at different frequencies.
- the light source can include various common optical devices, such as a light guide plate, a soft light plate, a diffuser, and the like.
- the light source may be an LED light, an array of a plurality of LED lights, a display screen or a part thereof, and even an illuminated area of light (for example, an illuminated area of light on a wall) may also serve as a light source.
- the light source may take various shapes, such as a circle, a square, a rectangle, a strip, or an L-shape.
- the controller of the optical tag can control the properties of the light emitted by each source to communicate information.
- "0" or "1" of binary digital information can be represented by controlling the turning on and off of each light source such that multiple light sources in the optical tag can be used to represent a sequence of binary digital information.
- each light source can be used not only to represent a binary digit, but also to represent data in ternary or a higher base. Therefore, the optical tag of the present invention can significantly increase the data encoding density compared to the conventional QR code.
- the controller of the optical tag can control the light source to change the properties of the light it emits at a certain frequency, so the optical tag of the present invention can represent different data information at different times, for example, different sequences of binary digital information.
- each frame of the captured image can thus be used to represent a set of information sequences, which further significantly increases the data encoding density compared to a conventional static QR code.
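The per-source on/off encoding described above can be sketched as follows. This is an illustrative Python fragment, not taken from the patent; the number of light sources and the bit ordering are assumptions:

```python
# Sketch: map a binary information sequence onto per-frame on/off states
# for a row of light sources. Each source carries one bit per image frame,
# so N sources deliver N bits per frame.

def encode_frames(bits, num_sources):
    """Split a bit string into per-frame on/off patterns,
    one bit per light source per frame (zero-padded at the end)."""
    frames = []
    for i in range(0, len(bits), num_sources):
        chunk = bits[i:i + num_sources].ljust(num_sources, "0")
        frames.append([c == "1" for c in chunk])  # True = source on
    return frames

patterns = encode_frames("1011001110", num_sources=5)
# Two frames of five on/off states each
```

At 30 frames per second, five such sources would carry 150 bits per second under this scheme.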
- an optical tag can be imaged using an optical imaging device or image capture device common in the art, and the transmitted information, such as a sequence of binary data 1s and 0s, can be determined from each frame of the image, thereby accomplishing the acquisition of the optical tag's information.
- the optical imaging device or image acquisition device may include an image acquisition component, a processor, a memory, and the like.
- the optical imaging device or image acquisition device may be, for example, a mobile terminal having a photographing function, including a mobile phone, a tablet, smart glasses, etc., which may include an image capture device and an image processing module.
- the user visually locates the optical tag within a certain distance range and scans it by pointing the imaging sensor of the mobile terminal at the optical tag, which carries out the information capture and interpretation process.
- when the controller of the optical tag controls the light source to change the attribute of its emitted light at a certain frequency, the image acquisition frequency of the mobile terminal can be set to be greater than or equal to twice the frequency at which the light source's attribute changes.
- the process of identifying and decoding can be completed by performing a decoding operation on the acquired image frame.
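Given the sampling arrangement above (acquisition at twice the attribute-switching frequency), a decoder might recover bits by averaging and thresholding the brightness of each pair of frames. The following is an assumed scheme for illustration; the threshold value is not specified in the text:

```python
# Sketch: the camera samples at twice the light source's switching
# frequency, so each transmitted bit spans two consecutive frames.
# Averaging each pair of brightness samples and thresholding recovers
# the bit (here, intensity is the varied attribute).

def decode_brightness(samples, threshold=0.5):
    bits = []
    for i in range(0, len(samples) - 1, 2):
        level = (samples[i] + samples[i + 1]) / 2.0
        bits.append("1" if level >= threshold else "0")
    return "".join(bits)

# Brightness of one source over 8 frames (4 transmitted bits)
print(decode_brightness([0.9, 0.8, 0.1, 0.2, 0.9, 0.9, 0.0, 0.1]))  # "1010"
```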
- the serial number, the check digit, the time stamp, and the like may be included in the information transmitted by the optical tag.
- a start frame and/or an end frame may be designated among the plurality of image frames as needed, indicating the start or end position of a complete cycle of image frames. The start frame or end frame may be set to display a particular data combination, for example all 0s or all 1s, or any special combination that will never coincide with the information actually displayed.
- when a continuous multi-frame image of a light source is captured by a CMOS imaging device, the controller can make the switching time interval between the operating modes of the light source equal to the full-frame imaging time of the CMOS imaging device, thereby achieving frame synchronization of the light source with the imaging device. Assuming each light source transmits 1 bit of information per frame, at a shooting speed of 30 frames per second each light source can deliver 30 bits of information per second, giving an encoding space of 2^30, which can include, for example, an initial frame tag (frame header), an optical tag ID, a password, a verification code, URL information, address information, a time stamp, or different combinations thereof.
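The 30-bit figure above can accommodate a simple packet layout. The field widths in the following sketch are illustrative assumptions; the text names possible fields (frame header, optical tag ID, verification code) but does not fix their sizes:

```python
# Hypothetical 30-bit packet: 6-bit all-ones frame header,
# 16-bit optical tag ID, 8-bit verification code.

HEADER = "111111"

def parse_packet(bits):
    """Parse a 30-bit packet string into (tag_id, verification_code)."""
    assert len(bits) == 30 and bits.startswith(HEADER), "bad packet"
    tag_id = int(bits[6:22], 2)         # 16-bit optical tag ID
    verification = int(bits[22:30], 2)  # 8-bit verification code
    return tag_id, verification

packet = HEADER + format(0x2A5, "016b") + format(0x5C, "08b")
tag_id, verification = parse_packet(packet)
```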
- Table 1 presents an example packet structure in accordance with one embodiment of the present invention:
- compared with the traditional QR code, the above optical tag transmits information by emitting different light, which has the advantages of long recognition distance, loose requirements on visible light conditions, strong directivity, and positionability, and the information transmitted by the optical tag can change rapidly over time, providing a large information capacity. Optical tags therefore have greater information interaction capabilities, which can provide great convenience to users and businesses. In order to provide corresponding services to users and merchants based on optical tags, each optical tag is assigned a unique identifier (ID) by the manufacturer, manager, or user of the optical tag, which uniquely identifies the optical tag.
- the identifier can be broadcast by the optical tag, and the user can use an image capturing device or the imaging device built into a mobile phone to collect images of the optical tag and obtain the information (such as the identifier) transmitted by it, so as to access the service associated with the optical tag.
- reverse positioning can be performed for the imaging device that scans the optical tag, based on the aforementioned optical tag.
- the geographical location information of the optical tag can be registered in advance on, for example, a server.
- the optical tag can transmit its identification information (for example, ID information) during operation, and the imaging device can obtain this ID information by scanning the optical tag. After obtaining the ID information, the imaging device queries the server with it to obtain the geographic location corresponding to the optical tag, and then performs reverse positioning to determine the specific location of the imaging device.
- the optical tag can have a uniform or default physical size or shape, and the user's device can be aware of this physical size or shape.
- various reverse positioning methods can be used to determine the relative positional relationship between the user (in practice, the user's imaging device) and the optical tag. For example, the relative distance between the imaging device and an optical tag can be determined (e.g., from the imaging size of the optical tag, or by any application with a ranging function on the handset), and triangulation with three or more optical tags then yields the relative positional relationship between the imaging device and any of the optical tags. The relative positional relationship can also be determined by finding the relative distance of the imaging device from the optical tag and analyzing the perspective distortion of the optical tag's image on the imaging device.
- physical size information and/or orientation information or the like of the optical label may be further used.
- the physical size information and/or orientation information may be stored in the server in association with the identification information of the optical tag.
- At least two optical tags can be used for positioning.
- the following steps can be performed for each optical tag:
- Step 1: Collect the ID information of the optical tag using the imaging device.
- Step 2: Query with the ID information to obtain the physical size information and geographic location information of the optical tag.
- Step 3: Photograph the optical tag using the default focal length of the imaging device to obtain an image of the optical tag. Since the default focal length is used, the captured image may be blurred.
- Step 4: Adjust and optimize the focal length of the imaging device to obtain a clear image of the optical tag. For example, starting from the default focal length, first try increasing the focal length; if the optical tag image becomes clearer, continue increasing it, and if it becomes blurred, adjust in the opposite direction, i.e., reduce the focal length, and vice versa.
- to judge clarity, the texture features of the optical tag image can be extracted. The clearer the optical tag image, the simpler the corresponding texture information and the smaller the texture density. Therefore, the optimal focal length parameter can be determined according to the texture density of the optical tag image: when a smaller texture density cannot be obtained after multiple iterations, the image with the smallest texture density can be considered the clear image, and the focal length parameter corresponding to that minimum texture density is taken as the optimal focal length parameter.
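The iterative focal-length adjustment of Step 4 amounts to a hill-climb that minimizes texture density. The following sketch leaves `capture_image` and `texture_density` as placeholder callables, since no concrete texture measure is prescribed here:

```python
# Sketch of the focal-length search: start at the default focal length,
# try increasing it; if texture density drops (image clearer), keep going;
# otherwise reverse direction; stop at the minimum texture density.

def find_optimal_focal_length(capture_image, texture_density,
                              f0, step=1.0, max_iters=20):
    best_f = f0
    best_d = texture_density(capture_image(f0))
    direction = +1  # first try increasing the focal length
    for _ in range(max_iters):
        f = best_f + direction * step
        d = texture_density(capture_image(f))
        if d < best_d:          # image became clearer: keep going
            best_f, best_d = f, d
        elif direction == +1:   # got blurrier: try the other direction
            direction = -1
        else:
            break               # minimum texture density reached
    return best_f
```

With a toy density function minimized at f = 7, for example, the search starting from f = 5 converges to 7.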
- Step 5: Using the optimal focal length parameter, take a clear image of the optical tag. Then, using a simple lens object-image formula and the object-image relationship, calculate the relative distance between the imaging device and the optical tag from the size of the clear image of the optical tag, the physical size of the optical tag, and the optimal focal length parameter.
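The distance computation in Step 5 can be sketched with a minimal pinhole-model approximation. The text invokes only "a simple lens object-image formula"; the relation below, d = f · H / h, is the standard similar-triangles form of it:

```python
# Sketch: with focal length f, a tag of physical height H imaging to
# height h on the sensor, the relative distance is approximately
# d = f * H / h (pinhole / thin-lens approximation).

def relative_distance(focal_length_mm, physical_height_mm, image_height_mm):
    return focal_length_mm * physical_height_mm / image_height_mm

# A 200 mm tall optical tag imaged at 0.8 mm with a 4 mm focal length
d = relative_distance(4.0, 200.0, 0.8)  # 1000 mm, i.e. 1 m away
```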
- the specific position information of the imaging device, that is, its specific coordinates in the physical world coordinate system, can then be determined using the triangulation method.
- FIG. 1 is a schematic diagram of a triangulation method in which two optical tags (optical tag 1 and optical tag 2) are used for triangulation.
- with two optical tags, two candidate locations are typically obtained, and it may be necessary to choose between them.
- one of the candidate locations may be selected in conjunction with the positioning information (e.g., GPS information) of the imaging device (e.g., the handset) itself; for example, the candidate location closer to the GPS fix can be selected.
- the orientation information of each optical tag may be further considered, the orientation information actually defining an area in which the optical tag can be observed, and thus one of the candidate locations may be selected based on the orientation information.
- the orientation information of the optical tag can also be stored in the server and can be obtained by querying the ID information of the optical tag.
- two optical tags have been described above as an example, but those skilled in the art will understand that the triangulation-based method can also be applied to three or more optical tags. In fact, using three or more optical tags allows more precise positioning, and multiple candidate points typically do not occur.
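The two-tag triangulation of FIG. 1 can be sketched in two dimensions as the intersection of two distance circles, followed by candidate selection against a coarse GPS hint. The geometry below is a standard circle-intersection construction, not reproduced from the patent:

```python
import math

# Sketch: two distance circles around the tags generally meet at two
# candidate points; keep the one closer to the coarse GPS fix.

def triangulate(p1, r1, p2, r2, gps_hint):
    (x1, y1), (x2, y2) = p1, p2
    d = math.hypot(x2 - x1, y2 - y1)
    a = (r1**2 - r2**2 + d**2) / (2 * d)       # distance from p1 to chord
    h = math.sqrt(max(r1**2 - a**2, 0.0))      # half chord length
    mx = x1 + a * (x2 - x1) / d                # chord midpoint
    my = y1 + a * (y2 - y1) / d
    ux, uy = -(y2 - y1) / d, (x2 - x1) / d     # unit vector along chord
    candidates = [(mx + h * ux, my + h * uy), (mx - h * ux, my - h * uy)]
    return min(candidates,
               key=lambda c: math.hypot(c[0] - gps_hint[0],
                                        c[1] - gps_hint[1]))

# Tags at (0,0) and (8,0), both 5 units away; GPS hint north of the axis
pos = triangulate((0, 0), 5.0, (8, 0), 5.0, gps_hint=(4, 2))
```

With orientation information available instead of GPS, the same candidate filter could test which candidate lies in the region where the tag is observable.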
- the following reverse positioning method can also be employed, which does not require at least two optical tags but can perform reverse positioning using a single optical tag.
- the method of this embodiment comprises the following steps:
- Step 1: Collect the ID information of the optical tag using the imaging device.
- Step 2: Query with the ID information to obtain the geographic location information of the optical tag and related information about multiple points on it.
- the related information is, for example, the position information of these points on the optical tag and their coordinate information.
- Step 3: Photograph the optical tag using the default focal length of the imaging device to obtain an image of the optical tag.
- Step 4: Adjust the focal length as before; the optimal focal length parameter can be determined according to the texture density of the optical tag image. When a smaller texture density cannot be obtained after multiple iterations, the image with the smallest texture density can be considered the clear image, and the focal length parameter corresponding to that minimum texture density is taken as the optimal focal length parameter.
- Step 5: Using the optimal focal length parameter, take a clear image of the optical tag and perform the reverse positioning as described below.
- FIG. 2 is a schematic diagram of an imaging process of an optical label on an imaging device.
- the object coordinate system (X, Y, Z) is established with the centroid of the optical tag as the origin, and the image coordinate system (x, y, z) is established with the position F_c of the imaging device as the origin; the object coordinate system is also called the physical world coordinate system.
- the image coordinate system is also called the camera coordinate system.
- taking a point at the upper left corner of the optical tag image collected by the imaging device as the coordinate origin, a two-dimensional coordinate system (u, v) is established in the image plane, called the image plane coordinate system. The intersection of the image plane and the optical axis (i.e., the Z axis) is called the principal point, and (c_x, c_y) are the coordinates of the principal point in the image plane coordinate system.
- the coordinates of any point P on the optical tag in the object coordinate system are (X, Y, Z); its corresponding image point is q, whose coordinates in the image coordinate system are (x, y, z) and whose coordinates in the image plane coordinate system are (u, v).
- the image coordinate system has not only the change of displacement with respect to the object coordinate system, but also the rotation of the angle, the relationship between the object coordinate system (X, Y, Z) and the image coordinate system (x, y, z). It can be expressed as:
- f_x and f_y are the focal lengths of the imaging device in the x-axis and y-axis directions, respectively, and c_x, c_y are the coordinates of the principal point in the image plane coordinate system. f_x, f_y, c_x, c_y are internal parameters of the imaging device and can be calibrated in advance.
- The rotation matrix R and the displacement vector t represent, respectively, the attitude information of the object coordinate system relative to the image coordinate system (i.e., the attitude of the imaging device relative to the optical label, that is, the deflection of its optical axis, also referred to as the orientation of the imaging device relative to the optical label) and the displacement information (i.e., the displacement between the imaging device and the optical tag).
- The rotation can be decomposed into two-dimensional rotations around the respective coordinate axes. If the rotation angles around the x, y, and z axes are ψ, φ, and θ in turn, the total rotation matrix R is the product of the three matrices R_x(ψ), R_y(φ), and R_z(θ), namely R = R_x(ψ) · R_y(φ) · R_z(θ).
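The decomposition just described can be written out directly; a minimal NumPy sketch, using ψ, φ, θ for the rotation angles about the x, y, and z axes (the symbol assignment follows common convention, since the original symbols were lost in extraction):

```python
import numpy as np

def Rx(a):
    c, s = np.cos(a), np.sin(a)
    return np.array([[1, 0, 0], [0, c, -s], [0, s, c]])

def Ry(a):
    c, s = np.cos(a), np.sin(a)
    return np.array([[c, 0, s], [0, 1, 0], [-s, 0, c]])

def Rz(a):
    c, s = np.cos(a), np.sin(a)
    return np.array([[c, -s, 0], [s, c, 0], [0, 0, 1]])

def rotation(psi, phi, theta):
    # Total rotation matrix: the product of the three per-axis rotations.
    return Rx(psi) @ Ry(phi) @ Rz(theta)
```

Any such product is orthonormal with determinant 1, as a rotation matrix must be.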
- The displacement vector t can simply be written as t = (t_x, t_y, t_z)^T. s is the object-image conversion factor, which is equal to the ratio of the size of the image plane to the resolution of the imaging device and is also known.
- Based on the information about the plurality of points on the optical label obtained in step two (e.g., the position information of at least four points A, B, C, and D on the optical label), the corresponding image points in the image, such as A', B', C', and D', are determined. The four points A, B, C, and D may, for example, be located on the left and right sides of the optical tag, or may be four separate point light sources located at the four corners of the optical tag.
- The rotation matrix R and the displacement vector t can then be solved from these point correspondences; the rotation matrix R determines the attitude of the imaging device relative to the optical tag.
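For illustration, one standard way to recover R and t from four coplanar tag points is homography decomposition. The sketch below assumes the tag points lie in the plane Z = 0 and that the correspondences are noise-free; a production system would use a robust PnP solver instead:

```python
import numpy as np

def homography(obj_xy, img_uv):
    """Direct linear transform: H maps (X, Y, 1) to (u, v, 1) up to scale."""
    rows = []
    for (X, Y), (u, v) in zip(obj_xy, img_uv):
        rows.append([X, Y, 1, 0, 0, 0, -u * X, -u * Y, -u])
        rows.append([0, 0, 0, X, Y, 1, -v * X, -v * Y, -v])
    _, _, vt = np.linalg.svd(np.array(rows, float))
    return vt[-1].reshape(3, 3)   # null-space vector = flattened H

def pose_from_tag(obj_xy, img_uv, K):
    """Decompose K^-1 H into the rotation matrix R and displacement vector t."""
    B = np.linalg.inv(K) @ homography(obj_xy, img_uv)
    if B[2, 2] < 0:               # fix the overall sign so the tag sits in front
        B = -B
    lam = 1.0 / np.linalg.norm(B[:, 0])
    r1, r2, t = lam * B[:, 0], lam * B[:, 1], lam * B[:, 2]
    r3 = np.cross(r1, r2)         # third column from orthogonality
    return np.column_stack([r1, r2, r3]), t
```

With the four corner points A, B, C, D and their image points A', B', C', D', this yields the attitude (R) and displacement (t) described in the text.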
- An optical tag-based information device interaction control system is also provided.
- An information device in the system refers to any computing device that can be interactively controlled over a network, including but not limited to information appliances or home devices.
- Each information device can be associated with one or more optical tags, and each optical tag can be associated with one or more information devices.
- The optical tag can be placed on the information device or at a relatively fixed location with respect to the information device. The physical location of the optical tag and the position of the information device relative to the optical tag are pre-calibrated. Information about the optical tag and its associated information devices can be saved on a server for query.
- The information related to the optical tag may include, for example, the ID information of the optical tag, the physical world coordinates, physical size, and orientation of the optical tag, the identifiers of the information devices associated with the optical tag, and information such as the location information and object coordinates of multiple points on the optical tag.
- The information related to the information device may, for example, comprise an identifier of the information device, its coordinates in the object coordinate system established with the centroid of its associated optical tag as the origin, its location information relative to the associated optical tag, the operational interface of the information device, and descriptive information, size, orientation, etc. of the information device.
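A minimal sketch of such server-side records (all field and class names are illustrative assumptions, not part of the described system):

```python
from dataclasses import dataclass, field

@dataclass
class InfoDevice:
    device_id: str
    object_coords: tuple          # position in the tag's object coordinate system
    interface_url: str = ""       # operational interface of the device
    description: str = ""

@dataclass
class OpticalTag:
    tag_id: str
    world_coords: tuple           # physical world coordinates of the tag
    size: tuple = (0.0, 0.0)
    orientation: float = 0.0
    devices: dict = field(default_factory=dict)   # device_id -> InfoDevice

class TagServer:
    """In-memory stand-in for the server that answers tag-ID queries."""
    def __init__(self):
        self._tags = {}

    def register(self, tag: OpticalTag):
        self._tags[tag.tag_id] = tag

    def query(self, tag_id: str) -> OpticalTag:
        return self._tags[tag_id]
```

A terminal device that has decoded a tag ID would call `query(tag_id)` to retrieve the associated device records.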
- A user can use an imaging device of a terminal device (such as a mobile phone) to capture an image of the optical tag associated with an information device and thereby obtain the ID information of the optical tag.
- The terminal device may then obtain information about the information device associated with the optical tag from the server according to the optical tag ID information, and may present the interaction interface of the information device at the location where the information device appears on the display screen of the terminal device. In this way, the user can perform related interactive control operations on the information device through an interactive interface superimposed on or near the information device.
- Before the interaction interface of an information device is presented on the screen of the terminal device, it may be determined whether any information device associated with the optical tag currently appears on the display screen of the terminal device, and, if so, the imaging position of the information device on the display screen may be further determined, for example as two-dimensional image plane coordinates on the screen.
- The reverse positioning method mentioned above may first be used to determine the initial relative positional relationship between the terminal device carried by the user and the optical tag, thereby determining the initial position and initial orientation of the user's terminal device. Further, since the physical position of the optical tag and the position of each information device relative to the optical tag have been calibrated in advance, the initial relative positional relationship between the terminal device and each information device may be determined based on the initial position of the terminal device and the pre-stored calibration information.
- Based on the initial relative positional relationship between the terminal device and the information devices and the initial orientation of the terminal device, it may be determined whether any information device associated with the optical tag currently appears on the display screen of the terminal device, and, if so, the imaging position of the information device on the display screen may be further determined. If the information device that the user wishes to control does not appear on the current display screen, the user can move the terminal device from its initial location so that the information device appears on the display screen; for example, the user can pan or rotate the terminal device so that its camera eventually faces the information device.
- The change in the position and posture of the terminal device can be detected by various existing methods (for example, by sensors such as an acceleration sensor and a gyroscope built into the terminal device) to determine the location information and posture information of the moved terminal device. Based on this location and posture information, it can be determined which information devices currently appear on the display screen of the terminal device and their respective imaging positions. The interactive interfaces of these information devices can then be superimposed at their respective imaging locations on the display screen to achieve WYSIWYG ("what you see is what you get") interaction with the various information devices.
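The incremental update described above can be sketched as follows. The convention chosen here (D maps old-camera-frame coordinates to the new camera frame; dp is the camera translation expressed in the new frame) is an assumption for illustration, not the patent's prescribed formulation:

```python
import numpy as np

def update_pose(R, t, D, dp=np.zeros(3)):
    """Update the extrinsics after terminal motion.

    D  : rotation mapping old-camera-frame coordinates to the new camera frame
         (e.g., integrated from gyroscope readings)
    dp : translation of the camera expressed in the new camera frame
    """
    return D @ R, D @ t - dp

def project(p_obj, R, t, K):
    """Project an object-frame point to pixel coordinates via R, t, and K."""
    p = K @ (R @ np.asarray(p_obj, float) + t)
    return p[0] / p[2], p[1] / p[2]
```

After each motion update, `project` is re-evaluated for every information device so the overlaid interfaces track their on-screen positions.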
- The transformation relationship between the physical world coordinate system (X, Y, Z) established with the centroid of the optical tag as the origin and the camera coordinate system (x, y, z) established with the position of the imaging device as the origin can be described by the rotation matrix R and the displacement vector t (for example, formula (1)). Once R and t are determined, the transformation relationship between the physical world coordinates and the image plane coordinates (for example, formula (3)) is also determined. This transformation relationship is also called the projection relationship; it can be used to determine the projected position in the imaging plane of an actual object at a certain position in the physical world coordinate system.
- The specific position and attitude of the imaging device relative to the optical tag can be determined based on the optical tag image acquired by the imaging device and the information related to the optical tag acquired from the server.
- At this point, the rotation matrix R and the displacement vector t in equation (3) and the internal parameters of the imaging device have been determined. Since the physical position of the optical tag and the relative position between the optical tag and each information device are set in advance, the object coordinates of each information device in the physical world coordinate system can be determined from the relative position between the optical tag and the information device; substituting these into formula (3) yields the image plane coordinates of the information device in the imaging plane. The interactive interface of the information device can then be presented on the terminal device screen, based on its image plane coordinates, for the user to use.
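Putting these pieces together, the mapping from a device's pre-calibrated offset from the tag to its on-screen position might look like the following sketch (the camera parameters and the 0.5 m offset are illustrative values, not from the patent):

```python
import numpy as np

def device_screen_position(offset_from_tag, R, t, K):
    """Map a device's pre-calibrated offset from the optical tag (expressed in
    the tag's object coordinate system) to image plane coordinates.

    offset_from_tag : device position relative to the tag centroid (object frame)
    R, t            : extrinsics recovered by reverse positioning
    K               : 3x3 intrinsic matrix [[fx, 0, cx], [0, fy, cy], [0, 0, 1]]
    """
    p_obj = np.asarray(offset_from_tag, float)   # object coords of the device
    p_img = K @ (R @ p_obj + t)                  # formula (3), up to the 1/z scale
    return p_img[0] / p_img[2], p_img[1] / p_img[2]

K = np.array([[800., 0, 320], [0, 800., 240], [0, 0, 1.]])
# Camera 2 m in front of the tag, no rotation; a lamp 0.5 m to the tag's right:
u, v = device_screen_position([0.5, 0.0, 0.0], np.eye(3), np.array([0., 0., 2.]), K)
# -> (520.0, 240.0)
```

The returned (u, v) is where the device's icon or interactive interface would be anchored on the display screen.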
- The icon of the information device may also be superimposed at the position of the information device on the terminal device screen for the user to select.
- When the user selects an icon, the interactive interface of the corresponding information device is presented on the terminal device screen so that the user can operate and control the information device. If icons occlude one another, the front icon can be made translucent, or a numeric cue can be displayed near the top icon to indicate that multiple icons overlap at that location.
- Terminal devices can monitor changes in their position and posture in a variety of ways. For example, the terminal device may use the optical tag as a reference point, compare the image captured by the imaging device with the previous image, and identify the differences between the images to extract feature points; these feature points can then be used to calculate the change in the terminal device's own position and posture.
- Alternatively, a terminal device such as a mobile phone can estimate the position and orientation of its camera in the real world over time from the values measured by its built-in acceleration sensor, gyroscope, and the like. The rotation matrix R and the displacement vector t are then adjusted based on the current position and orientation of the terminal device, the current image plane coordinates of each information device are recomputed, and the related icons or interfaces are presented on the screen accordingly.
- The user can configure and operate the information device through the interactive interface displayed on the screen of the terminal device.
- The manner of operation of the information device, such as voice control or gesture control, may be predefined.
- When the operation mode of the information device is configured as voice control, the terminal device detects the voice input, performs voice recognition, converts the received voice into an operation command, and sends the control command over the network to the information device to operate it.
- When the operation mode is configured as gesture control, the user's gesture may be captured by the imaging device of the terminal device or by a camera installed in the user's environment; gesture recognition is performed on the terminal device to convert the gesture into a corresponding operation instruction, which is sent over the network to control the related information device.
- The gestures associated with each information device operation may be predefined; for example, gestures for operating a light may include opening the palm to turn the light on, making a fist to turn the light off, moving the fingers up to increase the brightness, and moving the fingers down to reduce the brightness.
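A predefined gesture vocabulary of this kind reduces to a lookup table; a sketch with hypothetical gesture and command names, following the light example in the text:

```python
# Hypothetical gesture vocabulary for a light; names are illustrative only.
GESTURE_COMMANDS = {
    "palm_open": "light_on",
    "fist": "light_off",
    "fingers_up": "brightness_up",
    "fingers_down": "brightness_down",
}

def gesture_to_instruction(device_id, gesture):
    """Convert a recognized gesture into an operation instruction to be sent
    over the network; unknown gestures are ignored."""
    command = GESTURE_COMMANDS.get(gesture)
    if command is None:
        return None
    return {"device": device_id, "command": command}
```

The returned instruction dictionary stands in for whatever network message format the system actually uses.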
- FIG. 4 is a flow chart showing an optical tag-based information device interaction method according to an embodiment of the present invention.
- In step S1), the terminal device carried by the user performs image acquisition on the optical tag located at a relatively fixed position with respect to the information device, and the initial position and posture of the terminal device relative to the optical tag are thereby determined. For example, using the various reverse positioning methods described above, the initial relative positional relationship between the imaging device that performs the image acquisition and the optical tag can be obtained from the image of the optical tag, thereby determining the initial position and initial orientation of the terminal device.
- In step S2), the relative position between the terminal device and each information device is determined according to the initial position of the terminal device and the pre-calibrated position of each information device relative to the optical tag, as mentioned above; the imaging position of each information device on the display screen is then determined in step S3).
- Since the physical position of the optical tag and the relative position between the optical tag and each information device are set in advance, and the rotation matrix R and the displacement vector t in formula (3) as well as the internal parameters of the imaging device have been determined from the optical tag image collected by the imaging device, the image plane coordinates of each information device can be determined: the object coordinates of each information device in the physical world coordinate system are obtained from the relative position between the optical tag and the information device, and substituting these into formula (3) yields the image plane coordinates of the information device in the imaging plane.
- In step S4), the interactive interfaces of the respective information devices are superimposed at their imaging locations on the display screen for interaction with the respective information devices.
- The user can move the terminal device from its initial location to make a desired information device appear on the display screen; for example, the user can pan or rotate the terminal device so that its camera is ultimately oriented towards the information device.
- As the terminal device moves from its initial position, the change in its position and posture is detected to determine the location information and posture information of the moved terminal device; based on this information, it is determined which information devices currently appear on the display screen of the terminal device and their respective imaging positions.
- The interactive interfaces of these information devices can be superimposed at their imaging locations on the display screen to achieve WYSIWYG interaction with the various information devices.
- The terminal device can monitor changes in its position and posture in various ways.
- For example, the terminal device may use the optical tag as a reference point, compare the image captured by the imaging device with the previous image, and identify the differences between the images to extract feature points; these feature points can then be used to calculate the change in the terminal device's own position and posture.
- Alternatively, a terminal device such as a mobile phone can estimate the position and orientation of its camera in the real world over time from the values measured by its built-in acceleration sensor, gyroscope, and the like. The rotation matrix R and the displacement vector t are then adjusted based on the current position and orientation of the terminal device, the current image plane coordinates of each information device are recomputed, and the related icons or interfaces are presented on the screen accordingly.
- The method may further comprise identifying the user's operation on the interactive interface of an information device, converting the operation into a corresponding operation command, and transmitting it to the information device over the network.
- The information device can perform the corresponding operations in response to the received operation instructions.
- Users can interact with information devices in a variety of ways. For example, the user can configure and operate an information device through the interactive interface displayed on the screen of the terminal device, for example using touch screen input or keyboard input.
- The manner of operation of the information device, such as voice control or gesture control, may be predefined.
- When the operation mode of the information device is configured as voice control, the terminal device detects the voice input, performs voice recognition, converts the received voice into an operation command, and sends the control command over the network to the information device to operate it.
- When the operation mode of the information device is configured as gesture control, the user's gesture may be captured by the imaging device of the terminal device or by a camera installed in the user's environment; gesture recognition is performed on the terminal device to convert the gesture into a corresponding operation instruction, which is sent over the network to control the related information device.
- The gestures associated with each information device operation may be predefined; for example, gestures for operating a light may include opening the palm to turn the light on, making a fist to turn the light off, moving the fingers up to increase the brightness, and moving the fingers down to reduce the brightness.
- Any optical tag (or light source) capable of communicating information can be used.
- The method of the present invention can be applied to light sources that transmit information through different stripes based on the rolling shutter effect of CMOS sensors (for example, the optical communication device described in Chinese Patent Publication No. CN104168060A); it can also be applied to the optical tags described in, for example, patent CN105740936A, whose transmitted information is identified by a CCD photosensitive device, and can further be applied to arrays of such optical tags (or light sources).
- Appearances of the phrases "in the various embodiments", "in some embodiments", "in one embodiment", or "in an embodiment" do not necessarily refer to the same embodiment.
- The particular features, structures, or properties may be combined in any suitable manner in one or more embodiments. The particular features, structures, or properties shown or described in connection with one embodiment may be combined, in whole or in part, with the features, structures, or properties of one or more other embodiments without limitation, as long as the combination is not illogical or non-functional.
Landscapes
- Engineering & Computer Science (AREA)
- Theoretical Computer Science (AREA)
- Physics & Mathematics (AREA)
- General Engineering & Computer Science (AREA)
- General Physics & Mathematics (AREA)
- Human Computer Interaction (AREA)
- Signal Processing (AREA)
- Computer Networks & Wireless Communication (AREA)
- Electromagnetism (AREA)
- Multimedia (AREA)
- Computer Vision & Pattern Recognition (AREA)
- Health & Medical Sciences (AREA)
- General Health & Medical Sciences (AREA)
- Toxicology (AREA)
- Artificial Intelligence (AREA)
- Microelectronics & Electronic Packaging (AREA)
- Computer Hardware Design (AREA)
- Automation & Control Theory (AREA)
- User Interface Of Digital Computer (AREA)
- Position Input By Displaying (AREA)
- Length Measuring Devices By Optical Means (AREA)
- Optical Communication System (AREA)
Abstract
Description
Frame header | Attribute field (optional) | Data field | Check bits | Frame trailer |
Claims (10)
- A method for information device interaction based on an optical tag, the method comprising: S1) performing image acquisition on the optical tag with a terminal device carried by a user to determine the initial position and posture of the terminal device relative to the optical tag; S2) determining the relative position between the terminal device and each information device based on the determined position of the terminal device and the pre-calibrated position of each information device relative to the optical tag; S3) calculating the imaging position of each information device on the display screen of the terminal device according to the determined posture of the terminal device and its relative position to each information device; S4) presenting the interactive interface of each information device at its imaging position on the display screen for interactive operation with each information device.
- The method according to claim 1, further comprising: adjusting the imaging position of each information device on the display screen of the terminal device in response to a change in the position and/or posture of the terminal device.
- The method according to claim 1, further comprising: identifying a user's operation on the interactive interface of an information device; and converting the identified operation into a corresponding operation instruction and sending it to the information device over a network.
- The method according to claim 3, wherein the user's operation on the interactive interface of the information device is at least one of: screen input, keyboard input, voice input, or gesture input.
- A system for information device interaction based on an optical tag, the system comprising one or more information devices, an optical tag located at a relatively fixed position with respect to the information devices, a server for storing information related to the information devices and the optical tag, and a terminal device with an imaging apparatus; wherein the terminal device is configured to: perform image acquisition on the optical tag to determine the initial position and posture of the terminal device relative to the optical tag; determine the relative position between the terminal device and each information device based on the determined position of the terminal device and the pre-calibrated position of each information device relative to the optical tag obtained from the server; calculate the imaging position of each information device on the display screen of the terminal device according to the determined posture of the terminal device and its relative position to each information device; and present the interactive interface of each information device at its imaging position on the display screen for interactive operation with each information device.
- The system according to claim 5, wherein the terminal device is further configured to: adjust the imaging position of each information device on the display screen of the terminal device in response to a change in the position and/or posture of the terminal device.
- The system according to claim 5, wherein the terminal device is further configured to: identify a user's operation on the interactive interface of an information device; and convert the identified operation into a corresponding operation instruction and send it to the information device over a network.
- The system according to claim 7, wherein the user's operation on the interactive interface of the information device is at least one of: screen input, keyboard input, voice input, or gesture input.
- A computing device comprising a processor and a memory, the memory storing a computer program which, when executed by the processor, implements the method of any one of claims 1-4.
- A storage medium storing a computer program which, when executed, implements the method of any one of claims 1-4.
Priority Applications (4)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
KR1020207035360A KR20210008403A (ko) | 2018-05-09 | 2019-05-08 | 광학 라벨을 기반으로 하는 정보 기기의 상호 작용 방법 및 시스템 |
JP2021512990A JP7150980B2 (ja) | 2018-05-09 | 2019-05-08 | 光ラベルに基づく情報デバイスインタラクション方法及びシステム |
EP19799577.2A EP3792711A4 (en) | 2018-05-09 | 2019-05-08 | INTERACTION METHOD AND SYSTEM OF AN INFORMATION DEVICE BASED ON OPTICAL LABELS |
US17/089,711 US11694055B2 (en) | 2018-05-09 | 2020-11-04 | Optical tag based information apparatus interaction method and system |
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN201810435183.8A CN110471580B (zh) | 2018-05-09 | 2018-05-09 | 基于光标签的信息设备交互方法及系统 |
CN201810435183.8 | 2018-05-09 |
Related Child Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US17/089,711 Continuation US11694055B2 (en) | 2018-05-09 | 2020-11-04 | Optical tag based information apparatus interaction method and system |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2019214641A1 true WO2019214641A1 (zh) | 2019-11-14 |
Family
ID=68468452
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/CN2019/085997 WO2019214641A1 (zh) | 2018-05-09 | 2019-05-08 | 基于光标签的信息设备交互方法及系统 |
Country Status (7)
Country | Link |
---|---|
US (1) | US11694055B2 (zh) |
EP (1) | EP3792711A4 (zh) |
JP (1) | JP7150980B2 (zh) |
KR (1) | KR20210008403A (zh) |
CN (1) | CN110471580B (zh) |
TW (1) | TWI696960B (zh) |
WO (1) | WO2019214641A1 (zh) |
Families Citing this family (4)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN112051919B (zh) * | 2019-06-05 | 2022-10-18 | 北京外号信息技术有限公司 | 一种基于位置的交互方法和交互系统 |
CN111162840B (zh) * | 2020-04-02 | 2020-09-29 | 北京外号信息技术有限公司 | 用于设置光通信装置周围的虚拟对象的方法和系统 |
TWI738318B (zh) * | 2020-05-05 | 2021-09-01 | 光時代科技有限公司 | 用於確定光通信裝置的成像區域的系統 |
TWI756963B (zh) * | 2020-12-03 | 2022-03-01 | 禾聯碩股份有限公司 | 目標物件之區域定義辨識系統及其方法 |
Citations (6)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN104168060A (zh) | 2014-07-09 | 2014-11-26 | 珠海横琴华策光通信科技有限公司 | 一种利用可见光信号传输信息/获取信息的方法和装置 |
US20150025838A1 (en) * | 2011-11-15 | 2015-01-22 | Panasonic Corporation | Position estimation device, position estimation method, and integrated circuit |
CN105718840A (zh) * | 2016-01-27 | 2016-06-29 | 西安小光子网络科技有限公司 | 一种基于光标签的信息交互系统及方法 |
CN105740936A (zh) | 2014-12-12 | 2016-07-06 | 方俊 | 一种光标签和识别光标签的方法及设备 |
CN107703872A (zh) * | 2017-10-31 | 2018-02-16 | 美的智慧家居科技有限公司 | 家电设备的终端控制方法、装置及终端 |
CN107784414A (zh) * | 2016-08-31 | 2018-03-09 | 湖南中冶长天节能环保技术有限公司 | 一种生产过程参数管理系统 |
Family Cites Families (27)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP3904036B2 (ja) * | 1996-05-17 | 2007-04-11 | 株式会社安川電機 | 多指多関節ハンドの制御装置 |
US20110001651A1 (en) * | 2009-07-02 | 2011-01-06 | Candelore Brant L | Zero standby power laser controlled device |
US8413881B2 (en) * | 2010-02-22 | 2013-04-09 | Into Great Companies, Inc. | System of receiving prerecorded media discs from users |
US8447070B1 (en) * | 2010-04-19 | 2013-05-21 | Amazon Technologies, Inc. | Approaches for device location and communication |
JP2012037919A (ja) * | 2010-08-03 | 2012-02-23 | Doshisha | 拡張現実感技術を利用した情報家電操作システム |
JP5257437B2 (ja) * | 2010-10-20 | 2013-08-07 | コニカミノルタビジネステクノロジーズ株式会社 | 携帯端末及び処理装置の操作方法 |
JP2012156836A (ja) * | 2011-01-27 | 2012-08-16 | Seiko Epson Corp | リモコン装置及びプログラム |
JP6066037B2 (ja) * | 2012-03-27 | 2017-01-25 | セイコーエプソン株式会社 | 頭部装着型表示装置 |
WO2014103161A1 (ja) * | 2012-12-27 | 2014-07-03 | パナソニック株式会社 | 情報通信方法 |
JP2014139745A (ja) * | 2013-01-21 | 2014-07-31 | Shimizu Corp | 機器管理システム、機器管理装置、機器管理方法及びプログラム |
US9696703B2 (en) * | 2013-05-18 | 2017-07-04 | Fipak Research And Development Company | Method and apparatus for ensuring air quality in a building, including method and apparatus for controlling a working device using a handheld unit having scanning, networking, display and input capability |
WO2015008102A1 (en) * | 2013-07-19 | 2015-01-22 | Niss Group Sa | System and method for indentifying and authenticating a tag |
CN103823204B (zh) * | 2014-03-10 | 2015-03-11 | 北京理工大学 | 一种基于可见光标签的室内定位方法 |
US20210203994A1 (en) * | 2015-06-12 | 2021-07-01 | Shaoher Pan | Encoding data in a source image with watermark image codes |
US9838844B2 (en) * | 2015-09-25 | 2017-12-05 | Ca, Inc. | Using augmented reality to assist data center operators |
JP2017130047A (ja) * | 2016-01-20 | 2017-07-27 | 沖電気工業株式会社 | 情報処理装置、情報処理システム、及びプログラム |
CN105740375A (zh) * | 2016-01-27 | 2016-07-06 | 西安小光子网络科技有限公司 | 一种基于多个光标签的信息推送系统及方法 |
CN106446883B (zh) * | 2016-08-30 | 2019-06-18 | 西安小光子网络科技有限公司 | 基于光标签的场景重构方法 |
CN106339488B (zh) * | 2016-08-30 | 2019-08-30 | 西安小光子网络科技有限公司 | 一种基于光标签的虚拟设施插入定制实现方法 |
CN106446737B (zh) * | 2016-08-30 | 2019-07-09 | 西安小光子网络科技有限公司 | 一种多个光标签的快速识别方法 |
CN106372556B (zh) * | 2016-08-30 | 2019-02-01 | 西安小光子网络科技有限公司 | 一种光标签的识别方法 |
CN106408667B (zh) * | 2016-08-30 | 2019-03-05 | 西安小光子网络科技有限公司 | 基于光标签的定制现实方法 |
US10284293B2 (en) * | 2016-09-23 | 2019-05-07 | Qualcomm Incorporated | Selective pixel activation for light-based communication processing |
WO2018069952A1 (ja) * | 2016-10-11 | 2018-04-19 | 株式会社オプティム | 遠隔制御システム、遠隔制御方法、およびプログラム |
CN107368805A (zh) * | 2017-07-17 | 2017-11-21 | 深圳森阳环保材料科技有限公司 | 一种基于智能家居系统电器识别的遥控器 |
CN107734449B (zh) * | 2017-11-09 | 2020-05-12 | 陕西外号信息技术有限公司 | 一种基于光标签的室外辅助定位方法、系统及设备 |
US10841174B1 (en) * | 2018-08-06 | 2020-11-17 | Apple Inc. | Electronic device with intuitive control interface |
-
2018
- 2018-05-09 CN CN201810435183.8A patent/CN110471580B/zh active Active
-
2019
- 2019-05-08 KR KR1020207035360A patent/KR20210008403A/ko not_active Application Discontinuation
- 2019-05-08 EP EP19799577.2A patent/EP3792711A4/en not_active Withdrawn
- 2019-05-08 WO PCT/CN2019/085997 patent/WO2019214641A1/zh unknown
- 2019-05-08 JP JP2021512990A patent/JP7150980B2/ja active Active
- 2019-05-09 TW TW108116063A patent/TWI696960B/zh active
-
2020
- 2020-11-04 US US17/089,711 patent/US11694055B2/en active Active
Patent Citations (6)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20150025838A1 (en) * | 2011-11-15 | 2015-01-22 | Panasonic Corporation | Position estimation device, position estimation method, and integrated circuit |
CN104168060A (zh) | 2014-07-09 | 2014-11-26 | 珠海横琴华策光通信科技有限公司 | 一种利用可见光信号传输信息/获取信息的方法和装置 |
CN105740936A (zh) | 2014-12-12 | 2016-07-06 | 方俊 | 一种光标签和识别光标签的方法及设备 |
CN105718840A (zh) * | 2016-01-27 | 2016-06-29 | 西安小光子网络科技有限公司 | 一种基于光标签的信息交互系统及方法 |
CN107784414A (zh) * | 2016-08-31 | 2018-03-09 | 湖南中冶长天节能环保技术有限公司 | 一种生产过程参数管理系统 |
CN107703872A (zh) * | 2017-10-31 | 2018-02-16 | 美的智慧家居科技有限公司 | 家电设备的终端控制方法、装置及终端 |
Non-Patent Citations (1)
Title |
---|
See also references of EP3792711A4 |
Also Published As
Publication number | Publication date |
---|---|
JP7150980B2 (ja) | 2022-10-11 |
US11694055B2 (en) | 2023-07-04 |
TWI696960B (zh) | 2020-06-21 |
CN110471580B (zh) | 2021-06-15 |
US20210056370A1 (en) | 2021-02-25 |
TW201947457A (zh) | 2019-12-16 |
KR20210008403A (ko) | 2021-01-21 |
JP2021524119A (ja) | 2021-09-09 |
CN110471580A (zh) | 2019-11-19 |
EP3792711A4 (en) | 2022-01-26 |
EP3792711A1 (en) | 2021-03-17 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
WO2019214641A1 (zh) | 基于光标签的信息设备交互方法及系统 | |
US9600078B2 (en) | Method and system enabling natural user interface gestures with an electronic system | |
US9310891B2 (en) | Method and system enabling natural user interface gestures with user wearable glasses | |
US9727298B2 (en) | Device and method for allocating data based on an arrangement of elements in an image | |
CN102783041B (zh) | 通信装置以及通信方法 | |
US9900500B2 (en) | Method and apparatus for auto-focusing of an photographing device | |
JP2003256876A (ja) | 複合現実感表示装置及び方法、記憶媒体、並びにコンピュータ・プログラム | |
KR20170050995A (ko) | 디스플레이 장치 및 그의 영상 표시 방법 | |
US20170038912A1 (en) | Information providing device | |
CN104081307A (zh) | 图像处理装置、图像处理方法和程序 | |
WO2017147909A1 (zh) | 目标设备的控制方法和装置 | |
CN112699849A (zh) | 手势识别方法和装置、电子设备、可读存储介质和芯片 | |
CN111553196A (zh) | 检测隐藏摄像头的方法、系统、装置、以及存储介质 | |
JP2011054162A (ja) | 対話型情報操作システム及びプログラム | |
US20170302908A1 (en) | Method and apparatus for user interaction for virtual measurement using a depth camera system | |
US20160117553A1 (en) | Method, device and system for realizing visual identification | |
CN106657600B (zh) | 一种图像处理方法和移动终端 | |
KR20190035373A (ko) | 혼합 현실에서의 가상 모바일 단말 구현 시스템 및 이의 제어 방법 | |
JP2018018308A (ja) | 情報処理装置、及びその制御方法ならびにコンピュータプログラム | |
CN112529770B (zh) | 图像处理方法、装置、电子设备和可读存储介质 | |
WO2020244577A1 (zh) | 一种基于位置的交互方法和交互系统 | |
US10969865B2 (en) | Method for transmission of eye tracking information, head mounted display and computer device | |
JP2023519755A (ja) | 画像レジストレーション方法及び装置 | |
JP2016139396A (ja) | ユーザーインターフェイス装置、方法およびプログラム | |
WO2019214644A1 (zh) | 基于光标签网络的辅助识别方法 |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
121 | Ep: the epo has been informed by wipo that ep was designated in this application |
Ref document number: 19799577 Country of ref document: EP Kind code of ref document: A1 |
|
ENP | Entry into the national phase |
Ref document number: 2021512990 Country of ref document: JP Kind code of ref document: A |
|
NENP | Non-entry into the national phase |
Ref country code: DE |
|
ENP | Entry into the national phase |
Ref document number: 20207035360 Country of ref document: KR Kind code of ref document: A |
|
ENP | Entry into the national phase |
Ref document number: 2019799577 Country of ref document: EP Effective date: 20201209 |