CN114779969B - Device for displaying object position on second display screen near to eye and combination thereof - Google Patents


Info

Publication number
CN114779969B
CN114779969B (application number CN202210696448.6A)
Authority
CN
China
Prior art keywords
display screen
light
display
eye
light beam
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN202210696448.6A
Other languages
Chinese (zh)
Other versions
CN114779969A (en)
Inventor
Yu Lin (于林)
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Aisiya Health Technology Suzhou Co ltd
Original Assignee
Aisiya Health Technology Suzhou Co ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Aisiya Health Technology Suzhou Co., Ltd.
Priority to CN202210696448.6A
Publication of CN114779969A
Application granted
Publication of CN114779969B
Legal status: Active
Anticipated expiration


Classifications

    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06F: ELECTRIC DIGITAL DATA PROCESSING
    • G06F 3/00: Input arrangements for transferring data to be processed into a form capable of being handled by the computer; output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F 3/01: Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F 3/011: Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
    • G06F 3/03: Arrangements for converting the position or the displacement of a member into a coded form
    • G06F 3/041: Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means
    • G06F 3/0416: Control or interface arrangements specially adapted for digitisers
    • G06F 3/04162: Control or interface arrangements for exchanging data with external devices, e.g. smart pens, via the digitiser sensing hardware
    • G06F 3/042: Digitisers characterised by opto-electronic transducing means
    • G06F 3/0421: Digitisers in which a light beam is interrupted or reflected, e.g. optical touch-screen
    • G06F 3/048: Interaction techniques based on graphical user interfaces [GUI]
    • G06F 3/0487: Interaction techniques using specific features provided by the input device
    • G06F 3/0488: Interaction techniques using a touch-screen or digitiser, e.g. input of commands through traced gestures

Abstract

The invention discloses a device, and a combination thereof, for displaying the position of an object on a near-to-eye second display screen. The device comprises at least one light emitter, at least one light receiver, and a processing unit. The processing unit is configured to acquire the preset relative coordinate position, on a first display screen, of the light-beam propagation path between the at least one light emitter and the at least one light receiver. When an object contacts or approaches the first display screen, it disturbs the original beam propagation path, changing the light received by the light receiver and triggering at least one object positioning event. The processing unit then analyzes the event to obtain the coordinate position of the object relative to the first display screen and displays that position on the near-to-eye second display screen. Compared with the prior art, the invention enables accurate, real-time human-computer interaction in scenarios where the touch screen and display screen are separate and the touch screen is not visible to the user, while remaining inexpensive to manufacture and easy to customize, install, and adjust.

Description

Device for displaying object position on second display screen near to eye and combination thereof
Technical Field
The invention relates to a device for displaying the position of an object on a second display screen near to the eye and a combination thereof.
Background
In the prior art, intelligent electronic devices such as smartphones and tablet computers generally use a touch screen integrated with a first display screen as the input device for human-computer interaction. Because a smartphone's touch screen and display screen are integrated, a user operating the touch screen with a finger, stylus, or other object can see the icons on the display screen and therefore operate the phone intuitively and accurately.
However, in wearable devices, and especially head-mounted smart devices such as VR or AR glasses connected to a smartphone, the touch screen and the display screen are often separate structures. While wearing VR or AR glasses the user can see only the near-to-eye display (the second display screen) and not the phone screen (the first display screen), and so cannot operate the phone screen to interact with the device. The prior art offers several alternative interaction modes, such as voice input, eye tracking, and gesture detection; but voice input and eye tracking are inaccurate and time-consuming, and gesture detection requires holding the hands in mid-air, which causes fatigue and invites erroneous operation. How to display, on the near-to-eye second display screen, the position of an object on a first display screen that the user cannot see, and thereby enable human-computer interaction, is therefore an urgent problem.
Disclosure of Invention
To solve the technical problems in the prior art, an object of the present invention is to provide an apparatus for displaying an object position on a second display screen near to the eye and a combination thereof.
In order to achieve one of the above objects, an embodiment of the present invention provides an apparatus for displaying an object position on a near-to-eye second display screen, where the electronic device includes a first display screen invisible to the user and a second display screen displayed near to the eye. The apparatus includes at least one light emitter, at least one light receiver, and a processing unit, the processing unit being configured to acquire the preset relative coordinate position, on the first display screen, of the light-beam propagation path between the at least one light emitter and the at least one light receiver;
when no object contacts or approaches the first display screen, the processing unit controls the light emitters to sequentially emit light beams according to a preset sequence and transmits the light beams to the corresponding light receivers through air media;
when an object is in contact with or close to the first display screen, the object interferes with the original light beam propagation path to enable the light beam received by the light receiver to change to trigger at least one object positioning event;
the processing unit analyzes and obtains the relative coordinate position of the object and the first display screen based on the object positioning event and the preset relative coordinate position, and correspondingly displays the relative coordinate position on the second display screen near to the eye.
As a further improvement of an embodiment of the present invention, the first display screen is a touch screen, the light emitters and light receivers are distributed around the periphery of the touch screen in a set regular arrangement, and the second display screen is the display screen of head-mounted glasses.
As a further improvement of one embodiment of the present invention, when no object contacts or approaches the first display screen, the light beam emitted by the light emitter passes through at least a partial area above the touch screen and is received by the light receiver.
As a further improvement of an embodiment of the present invention, the apparatus further includes a first optical component disposed at a side of the light emitter from which the light beam exits, the first optical component being configured to modulate the light beam emitted by the light emitter into a parallel light beam.
As a further improvement of an embodiment of the present invention, the parallel light beam has a set height in a direction perpendicular to the touch screen.
As a further improvement of an embodiment of the present invention, the apparatus further includes a light-focusing member disposed on the beam-receiving side of the at least one light receiver.
As a further improvement of an embodiment of the present invention, the apparatus further includes a second optical component disposed on the beam-receiving side of the light receiver, the second optical component being configured to modulate the parallel light beam and transmit it to the light receiver.
As a further improvement of an embodiment of the present invention, the light-focusing member has a light-shielding portion extending in the light-beam transmission direction.
As a further improvement of an embodiment of the present invention, the apparatus further comprises an installation adjustment assembly for guiding a user to align each light emitter with its corresponding light receiver.
As a further improvement of an embodiment of the present invention, the installation adjustment assembly includes an indicator light; the processing unit turns the indicator light on when entering the installation adjustment mode, and turns it off when the ratio of the light beam received by the light receiver to the light beam emitted by the light emitter exceeds a set value.
As a further improvement of an embodiment of the present invention, rows of light emitters are disposed on two adjacent sides of the first display screen, rows of light receivers are disposed on the two opposite sides, and the processing unit controls the light emitters to emit light beams toward the light receivers row by row, column by column, or with a row and a column simultaneously.
As a further improvement of the embodiment of the present invention, when the object blocks the light beam emitted from the light emitter to the light receiver, the light beam signal received by the light receiver changes, and the processing unit obtains the relative coordinate position of the object and the first display screen according to the coordinate position of the changed light receiver.
As a further improvement of an embodiment of the present invention, the processing unit may obtain the size of the object, and its proximity to the first display screen within the set height, from the magnitude of the change in the light-beam signal received by the light receiver.
As a further improvement of an embodiment of the present invention, the processing unit is configured to output the relative coordinate position of the object and the first display screen to the electronic device through a wired connection or a wireless connection and display it on the second display screen.
As a further improvement of one embodiment of the invention, the light emitter comprises one or a combination of infrared LEDs, light emitting diodes and laser light sources.
As a further improvement of an embodiment of the present invention, the light receiver includes one or a combination of a phototransistor, a photodiode, and a photoresistor.
In order to achieve one of the above objects, the present invention further provides an electronic device combination comprising a first electronic device with a first display screen, a second electronic device with a second display screen, and an apparatus, as described above, for displaying on the near-to-eye second display screen the position of an object on the first display screen that is invisible to the user.
Compared with the prior art, the invention has the following beneficial effects. The light emitters regularly arranged around the first display screen transmit beams through an air medium to the light receivers, forming a grid of crossing light-transmission channels. When at least one finger or stylus blocks one of these channels, the light receivers in the row and column corresponding to the contact position either fail to receive the emitted light normally or receive a signal that differs greatly from the signal when no object contacts or approaches the first display screen. The changed beam signal is passed to the processing unit, which determines the coordinate position of the object relative to the first display screen from the preset positions of the affected light receivers. This enables accurate, real-time human-computer interaction when the touch screen and display screen are separate, at low manufacturing cost and with easy customization, installation, and adjustment.
Drawings
FIG. 1 is a schematic diagram of a device usage scenario according to an embodiment of the present invention;
FIG. 2 is a schematic diagram of the hardware configuration of an object positioning device according to an embodiment of the present invention;
FIG. 3 is a schematic top view of an object positioning device without an object touching or in proximity to a first display screen in accordance with one embodiment of the present invention;
FIG. 4 is a schematic side view of an object positioning device when an object contacts or approaches a first display screen in one embodiment of the invention;
FIG. 5 is a schematic structural diagram of an apparatus according to an embodiment of the present invention;
wherein: 1. electronic device; 2. object; 10. device; 11. light emitter; 12. light receiver; 13. processing unit; 15. first optical component; 16. light-focusing member; 17. second optical component; 18. light-shielding portion; 19. sensor; 20. first display screen; 21. second display screen; 30. installation adjustment assembly.
Detailed Description
The technical solutions in the embodiments of the present invention will be clearly and completely described below with reference to the drawings in the embodiments. It is apparent that the described embodiments are only a part of the embodiments of the present invention, and not all of them; all other embodiments which a person skilled in the art can derive from the embodiments given herein without creative effort fall within the protection scope of the present invention. It will be understood that when an element is referred to as being "secured to" or "disposed on" another element, it can be directly or indirectly secured to the other element, and when an element is referred to as being "connected to" another element, it can be directly or indirectly connected to it. Terms such as "upper", "lower", "left", and "right" indicate orientations or positional relationships based on those shown in the drawings; they are used only for convenience of description and do not indicate or imply that the referenced devices or elements must have a specific orientation or be constructed and operated in one, and so should not be construed as limiting the invention. The terms "first" and "second" are used merely for descriptive purposes and are not to be construed as indicating or implying relative importance or implicitly indicating the number of technical features. The meaning of "plurality" is two or more unless specifically limited otherwise.
Referring to fig. 1, an embodiment of the present invention provides an apparatus 10 for displaying, on a near-to-eye second display screen, the position of an object on a first display screen that is invisible to the user of an electronic device. The electronic device 1 may be a smart device such as a mobile phone or tablet; the first display screen 20 may be the device's touch screen; the second display screen 21 may be the near-to-eye display of a head-mounted device, such as AR or VR glasses, connected to the electronic device; and the two screens may display the same content synchronously. The intended use scenario is that, after putting on AR glasses connected to a mobile phone, the user can see only the near-to-eye second display screen 21, while the phone's touch screen is not visible. When the user wants to interact through the second display screen 21, the position of the finger relative to the touch screen cannot be confirmed. The technical problem solved by the invention is therefore how, in this scenario, to display the finger's position relative to the touch screen on the second display screen 21 of the AR glasses in real time.
Referring to fig. 2, an embodiment of the present invention further provides an electronic device combination comprising a first electronic device (such as a smartphone) with a first display screen 20, a second electronic device (such as AR glasses) with a second display screen 21, and the above-described apparatus for displaying, on the near-to-eye second display screen, the position of an object on the first display screen invisible to the user. The apparatus includes at least one light emitter 11, at least one light receiver 12, and a processing unit 13, which may include a processor (not shown) and a memory (not shown); the processing unit 13 is configured to acquire the position coordinates of the light emitter 11 and light receiver 12 relative to the first display screen 20. In a preferred embodiment, a plurality of light emitters 11 and light receivers 12 are distributed around the periphery of the first display screen 20 in a set regular arrangement, after which the processing unit 13 acquires the preset relative coordinate position, on the first display screen 20, of each light-beam propagation path between an emitter 11 and a receiver 12. When no object contacts or approaches the first display screen 20, the processing unit 13 controls the light emitters 11 to emit light beams sequentially in a preset order, each beam travelling through the air to its corresponding light receiver 12.
In one embodiment of the present invention, the apparatus 10 may further comprise at least one sensor 19 configured to detect the force with which an object contacts the first display screen 20 and to report it to the processing unit 13. When the processing unit 13 determines that the contact force of the object 2 exceeds a set value, the positioning event is identified as a press; since the processing unit 13 already knows the object's coordinate position relative to the first display screen 20, the user can select a position as required and then press, effectively delivering an interaction input to the first display screen 20. When the force is below the set value, the event is identified as a touch: the object's relative position is merely displayed on the display screen, for example as a mouse cursor, without delivering an input, which avoids misjudgment by the electronic device during human-computer interaction.
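The press-versus-touch discrimination above can be sketched as a simple threshold test. This is an illustrative sketch only: the function name, event labels, and force value are assumptions, not figures from the patent.

```python
# Hypothetical sketch of the press-vs-touch discrimination: a contact force
# above the set value counts as a "press" (an effective input), anything
# below it as a "touch" (cursor display only). The 0.5 N threshold is an
# illustrative assumption.

PRESS_FORCE_THRESHOLD = 0.5  # newtons; the "set value" in the text

def classify_positioning_event(force: float) -> str:
    """Classify an object positioning event from the sensor's contact force."""
    return "press" if force > PRESS_FORCE_THRESHOLD else "touch"
```

A "touch" would move the displayed cursor on the second display screen, while a "press" would additionally be forwarded to the first display screen as an input event.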
Because the mounting position of the device 10 has a preset coordinate relationship with the first display screen 20, an object contacting or approaching the first display screen 20 disturbs the original beam propagation path, changing the light received by the light receiver 12 and triggering at least one object positioning event. From this event and the preset relative coordinate positions, the processing unit 13 computes the coordinate position of the object relative to the first display screen 20 and transmits it to the electronic device 1 for display on the near-to-eye second display screen 21. The position of an object on the first display screen, invisible to the user, is thus shown on the near-to-eye second display screen, enabling real-time interaction even though the first display screen cannot be seen.
Referring to fig. 4, when at least one finger or stylus blocks one of the light propagation channels, the light receivers 12 in the row and column corresponding to the contact position either cannot receive the light emitted by the light emitters 11 normally, or receive a signal that changes greatly from the signal when no object contacts or approaches the first display screen 20. The changed beam signal is passed to the processing unit 13, which determines the object's coordinate position relative to the first display screen 20 from the preset positions of the light receivers 12 (for example, the object position corresponds to the intersection of the affected row and column channels). This enables accurate, real-time human-computer interaction even when the touch screen is separate from the display screen (for example, head-mounted AR glasses connected to a smartphone), at low manufacturing cost and with easy customization, installation, and adjustment.
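The row-column intersection logic can be sketched as follows. This is a minimal illustration under stated assumptions: the beam pitch, function name, and centroid averaging (to handle a finger blocking several adjacent channels) are not specified in the patent.

```python
# Illustrative sketch: recover the object's position on the first display
# screen as the intersection of the blocked row channel(s) and blocked
# column channel(s). The 5 mm beam pitch is an assumed value.

def locate_object(blocked_rows, blocked_cols, pitch_mm=5.0):
    """Return the (x, y) position in mm of the object, or None.

    blocked_rows / blocked_cols are indices of receivers whose signal
    changed; the position is taken as the centroid of the blocked channels.
    """
    if not blocked_rows or not blocked_cols:
        return None  # no positioning event triggered
    y = sum(blocked_rows) / len(blocked_rows) * pitch_mm
    x = sum(blocked_cols) / len(blocked_cols) * pitch_mm
    return (x, y)
```

With a single blocked row and column the result is simply that channel crossing; averaging over several blocked channels gives a sub-channel estimate for a wide fingertip.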
The first display screen 20 may be rectangular, typically with a 4:3 aspect ratio; for example, 100 mm long and 75 mm wide. Its thickness is small relative to its length and width, typically between 100 μm and 4 mm (for example 450 μm), and is also very small relative to the typical size of a touched area, since the area touched by a user's finger is roughly 1 cm in diameter. In this embodiment, the first display screen 20 is the touch screen of an intelligent electronic device (not shown) such as a smartphone. When a user's finger or another object such as a stylus approaches or touches it, the processing unit 13 of the apparatus 10 feeds back the coordinate position of the object 2 relative to the first display screen 20 (for example, as row and column coordinate values) to the electronic device over a wired or wireless link, for display on a second display screen 21 (for example, the micro-display of a head-mounted device) connected to it.
When no object 2 contacts or approaches the first display screen 20, the processing unit 13 controls the light emitters 11 to sequentially emit light beams according to a preset sequence and transmit the light beams to the corresponding light receivers 12 through an air medium; when the object 2 touches or approaches the first display screen 20, the object interferes with the original light beam propagation path to change the light beam received by the light receiver 12 to trigger at least one object positioning event; the processing unit 13 obtains the relative coordinate position of the object 2 and the first display screen 20 based on the object localization event analysis. The at least one object positioning event and the data acquired and/or used during operation of the apparatus 10 may be stored on a memory of the processing unit 13.
Referring to fig. 3, specifically, when no object contacts or approaches the first display screen 20, the light beam emitted by each light emitter 11 crosses at least part of the region above the first display screen 20 before being received by a light receiver 12. In one embodiment of the present invention, the light emitters 11 comprise a row along the upper side of the first display screen 20 and a column along its left side, while the light receivers 12 comprise a row along the lower side and a column along the right side. With no object 2 such as a finger or stylus near the first display screen 20, each emitter on the upper side is received by the receiver directly opposite it on the lower side, and each emitter on the left side by the receiver opposite it on the right side; the spacing between adjacent emitter-receiver pairs is chosen according to the required touch resolution.
The rows of light emitters 11 occupy two adjacent sides of the first display screen 20 and the rows of light receivers 12 the two opposite sides. The processing unit 13 drives the emitters 11 row by row or column by column, so that adjacent beams do not interfere with one another, and may also drive one row and one column simultaneously. When the device 10 is in normal operation, the beam from every emitter 11 is received by its corresponding receiver 12 within a scanning period of less than 3 milliseconds; each scanning period may be, for example, 1 or 2 milliseconds. When an object blocks a beam on its way from emitter 11 to receiver 12, that receiver's signal changes, and the processing unit 13 derives the coordinate position of the object 2 relative to the first display screen 20 from the preset coordinate positions of the affected receivers 12.
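The sequential scan described above can be sketched as a loop over channels. The hardware access is mocked as a callback, and the function names and the 0.5 detection threshold are illustrative assumptions.

```python
# Minimal sketch of the row-by-row / column-by-column scan: fire each
# emitter in a preset sequence and record which channels are blocked.
# `read_receiver(ch)` stands in for the real hardware read and returns
# the normalised intensity (1.0 = unobstructed) at the paired receiver.

def scan(emitters, read_receiver, threshold=0.5):
    """Return the channel ids whose received fraction fell below threshold."""
    blocked = []
    for ch in emitters:            # sequential firing avoids beam crosstalk
        level = read_receiver(ch)
        if level < threshold:      # beam interrupted by an object
            blocked.append(ch)
    return blocked
```

Running this once for the row channels and once for the column channels, within the sub-3 ms scanning period, yields the inputs for the intersection-based position calculation.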
In one embodiment of the present invention, the device 10 further comprises an installation adjustment assembly 30 used to guide the user in aligning each light emitter 11 of the device 10 with its corresponding light receiver 12. Specifically, the installation adjustment assembly 30 includes an indicator lamp 31, which the processing unit 13 lights when entering the installation adjustment mode. If, for example, an emitter 11 and its receiver 12 are poorly aligned, the indicator may flash or brighten to tell the user that the apparatus 10 needs adjustment; when the ratio of the beam parameter (such as light intensity) received by the receiver 12 to that emitted by the emitter 11 exceeds the set value, the processing unit 13 turns the indicator lamp 31 off to indicate that installation adjustment is complete.
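The indicator logic reduces to a single ratio test, sketched below. The 0.8 alignment ratio and the function name are assumed for illustration; the patent specifies only that a "set value" is exceeded.

```python
# Illustrative sketch of the installation-adjustment indicator: the lamp
# stays lit while the received/emitted intensity ratio is at or below the
# set value, and goes out once alignment is good enough. The 0.8 ratio is
# an assumed figure.

ALIGNMENT_RATIO = 0.8  # assumed "set value"

def indicator_on(received_intensity: float, emitted_intensity: float) -> bool:
    """Return True while the indicator lamp should stay lit (misaligned)."""
    return (received_intensity / emitted_intensity) <= ALIGNMENT_RATIO
```

In practice the check would be evaluated continuously while the user repositions the frame, so the lamp extinguishes the moment alignment crosses the threshold.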
If the alignment between an optical transmitter 11 and its receiver 12 is poor, the user can reposition the device 10. As the alignment improves during calibration, the indicator 31 dims or flashes more slowly, and once the set threshold is exceeded it goes out, telling the user that the device 10 is ready for normal use. In another embodiment of the present invention, the smartphone carries a driver for the apparatus 10, and the alignment and installation adjustment functions can be run from an app installed on the phone, which effectively improves the accuracy of human-computer interaction control when the user operates the phone.
Referring to fig. 5, in an embodiment of the present invention, the apparatus 10 further includes a first optical component 15 (such as a condenser lens) disposed on the side of the light emitter 11 from which the light beam exits; the first optical component 15 modulates the light beam emitted by the light emitter 11 into a parallel light beam. The parallel light beam has a set height in the direction perpendicular to the first display screen 20. When an object 2, such as a finger or a stylus, approaches the first display screen 20 within the set height, the processing unit 13 may obtain the size of the object 2 and its degree of proximity to the first display screen 20 from the change in the light beam signal received by the light receiver 12. Further, the first time the user positions the object 2 (such as a finger) with the apparatus 10, the processing unit 13 learns the user's usage characteristics from how much of the light beam the finger blocks, and saves these characteristics to the apparatus 10 or the intelligent electronic device. When the user positions the object 2 with the apparatus 10 again, the apparatus 10 or the intelligent electronic device can automatically identify the user from the saved usage characteristics, realizing an intelligent user identification function.
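Because the parallel beams have a set height above the screen, a deeper-intruding object attenuates the beams more. The following sketch is a hypothetical illustration of how size and proximity might be estimated from per-receiver intensities; the thresholds, pitch, and function name are all assumptions and not taken from the patent.

```python
# Illustrative sketch (not part of the patent disclosure).
# baseline / measured: per-receiver intensities without and with the
# object present; thresholds 0.1 and 0.95 are assumed example values.

def object_size_and_proximity(baseline, measured, pitch_mm=5.0):
    """Estimate object width (count of fully blocked beams) and a
    0..1 proximity figure (depth of intrusion into the beam height,
    inferred from partial attenuation)."""
    ratios = [m / b for m, b in zip(measured, baseline)]
    fully_blocked = sum(1 for r in ratios if r < 0.1)
    partial = [r for r in ratios if 0.1 <= r < 0.95]
    width_mm = fully_blocked * pitch_mm
    # Deeper intrusion -> stronger attenuation -> lower partial ratio.
    if partial:
        proximity = 1.0 - min(partial)
    else:
        proximity = 1.0 if fully_blocked else 0.0
    return width_mm, proximity

# One beam fully cut, a neighbour half attenuated:
print(object_size_and_proximity([1.0] * 4, [1.0, 0.5, 0.05, 1.0]))  # -> (5.0, 0.5)
```

A per-user profile of such width/attenuation signatures is one plausible form the saved "usage characteristics" could take.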
Referring to fig. 5, the apparatus 10 further includes a light-gathering component 16 and a second optical component 17 (such as a condenser lens), both disposed on the side of the light receiver 12 that receives the light beam; the second optical component 17 modulates the parallel light beam and transmits it to the light receiver 12, improving the efficiency with which the light receiver 12 receives the light beam emitted by the light emitter 11. The light-gathering component 16 has a light-shielding portion 18 extending along the light beam transmission direction; the light-shielding portion 18 may be a cylindrical structure. The parallel light beam formed by the first optical component 15 passes through the light-gathering component 16 and the second optical component 17 before being received by the light receiver 12, and the light-shielding portion 18 shields the light receiver 12 from ambient light interference, improving the anti-interference capability of the device 10 in the working state. The processing unit 13 of the device 10 may be powered by an intelligent electronic device, such as a cell phone, through a wired interface, or by a power source built into the device 10, such as a rechargeable battery. The light emitter 11 may be one or a combination of an infrared LED, a light emitting diode and a laser light source. The light receiver 12 may be one or a combination of a phototransistor, a photodiode and a photoresistor.
It should be understood that although the present description is organized by embodiments, each embodiment does not necessarily contain only a single technical solution; the description is presented in this way only for clarity. Those skilled in the art should read the description as a whole, and the technical solutions in the embodiments may be combined appropriately to form other embodiments understandable to those skilled in the art.
The detailed description above is only a specific description of possible embodiments of the present invention and is not intended to limit the scope of the present invention; equivalent embodiments or modifications made without departing from the technical spirit of the present invention shall be included in the scope of the present invention.

Claims (16)

1. A device for displaying a position of an object on a second display screen near an eye, wherein: an electronic device comprises a first display screen invisible to a user and a second display screen displayed near the eye; the device comprises at least one light emitter, at least one light receiver and a processing unit; the first display screen is a touch screen; the light emitter and the light receiver are distributed around the periphery of the touch screen in a set regular arrangement; the second display screen is a head-mounted glasses display screen; and the processing unit is configured to acquire a preset relative coordinate position, on the first display screen, of a light beam propagation path between the at least one light emitter and the at least one light receiver;
when no object contacts or approaches the first display screen, the processing unit controls the light emitters to sequentially emit light beams according to a preset sequence and transmits the light beams to the corresponding light receivers through air media;
when an object contacts or approaches the first display screen, the object interferes with the original light beam propagation path, causing the light beam received by the light receiver to change and triggering at least one object positioning event;
the processing unit analyzes the object positioning event together with the preset relative coordinate position to obtain the relative coordinate position of the object on the first display screen, and displays it correspondingly on the second display screen near the eye.
2. The device for displaying a position of an object on a second display screen near an eye according to claim 1, wherein: when no object contacts or approaches the first display screen, the light beam emitted by the light emitter passes through at least a partial area above the touch screen and is received by the light receiver.
3. The device for displaying a position of an object on a second display screen near an eye according to claim 2, wherein: the device further comprises a first optical component disposed on the side of the light emitter from which the light beam exits, the first optical component being used to modulate the light beam emitted by the light emitter into a parallel light beam.
4. The device for displaying a position of an object on a second display screen near an eye according to claim 3, wherein: the parallel light beam has a set height in a direction perpendicular to the touch screen.
5. The device for displaying a position of an object on a second display screen near an eye according to claim 4, wherein: the device further comprises a light-gathering component disposed on the side of the at least one light receiver that receives the light beam.
6. The device for displaying a position of an object on a second display screen near an eye according to claim 5, wherein: the device further comprises a second optical component disposed on the side of the light receiver that receives the light beam, the second optical component being used to modulate the parallel light beam and transmit it to the light receiver.
7. The device for displaying a position of an object on a second display screen near an eye according to claim 5, wherein: the light-gathering component has a light-shielding portion extending along the light beam transmission direction.
8. The device for displaying a position of an object on a second display screen near an eye according to claim 1, wherein: the device further comprises an installation adjustment assembly for guiding a user to align the light emitter with the corresponding light receiver.
9. The device for displaying a position of an object on a second display screen near an eye according to claim 8, wherein: the installation adjustment assembly comprises an indicator lamp, and the processing unit controls the indicator lamp to light when entering an installation adjustment mode; when the ratio of the light beam received by the light receiver to the light beam emitted by the light emitter is greater than a set value, the processing unit controls the indicator lamp to turn off.
10. The device for displaying a position of an object on a second display screen near an eye according to claim 2, wherein: the light emitters are arranged in rows and columns on two adjacent sides of the first display screen, the light receivers are arranged in rows and columns on the other two adjacent sides opposite the light emitters, and the processing unit controls the light emitters to emit light beams to the light receivers row by row, column by column, or row by row and column by column simultaneously.
11. The device for displaying a position of an object on a second display screen near an eye according to claim 1, wherein: when the object blocks a light beam emitted by the light emitter to the light receiver, the light beam signal received by the light receiver changes, and the processing unit acquires the relative coordinate position of the object on the first display screen according to the coordinate position of the light receiver whose signal changed.
12. The device for displaying a position of an object on a second display screen near an eye according to claim 4, wherein: the processing unit can acquire the size of the object and the degree of proximity of the object to the first display screen within the set height according to the change in the light beam signal received by the light receiver.
13. The device for displaying a position of an object on a second display screen near an eye according to claim 1, wherein: the processing unit is configured to output the relative coordinate position of the object on the first display screen to the electronic device through a wired or wireless connection for display on the second display screen.
14. The device for displaying a position of an object on a second display screen near an eye according to claim 1, wherein: the light emitter comprises one or a combination of an infrared LED, a light emitting diode and a laser light source.
15. The device for displaying a position of an object on a second display screen near an eye according to claim 1, wherein: the light receiver comprises one or a combination of a phototransistor, a photodiode and a photoresistor.
16. An electronic equipment combination, comprising a first electronic device and a second electronic device, the first electronic device comprising a first display screen and the second electronic device comprising a second display screen, characterized in that: the combination further comprises the device for displaying a position of an object on the second display screen near the eye according to any one of claims 1-15.
CN202210696448.6A 2022-06-20 2022-06-20 Device for displaying object position on second display screen near to eye and combination thereof Active CN114779969B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202210696448.6A CN114779969B (en) 2022-06-20 2022-06-20 Device for displaying object position on second display screen near to eye and combination thereof

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202210696448.6A CN114779969B (en) 2022-06-20 2022-06-20 Device for displaying object position on second display screen near to eye and combination thereof

Publications (2)

Publication Number Publication Date
CN114779969A CN114779969A (en) 2022-07-22
CN114779969B true CN114779969B (en) 2022-09-16

Family

ID=82421106

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202210696448.6A Active CN114779969B (en) 2022-06-20 2022-06-20 Device for displaying object position on second display screen near to eye and combination thereof

Country Status (1)

Country Link
CN (1) CN114779969B (en)

Family Cites Families (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US8902196B2 (en) * 2002-12-10 2014-12-02 Neonode Inc. Methods for determining a touch location on a touch screen
AU2011229745C1 (en) * 2010-03-24 2015-01-29 Neonode Inc. Lens arrangement for light-based touch screen
CN105183241B (en) * 2015-08-21 2018-10-09 惠州Tcl移动通信有限公司 Based on pressure sensing touch screen, display device and realize pressure sensing method
CN109597489A (en) * 2018-12-27 2019-04-09 武汉市天蝎科技有限公司 A kind of method and system of the eye movement tracking interaction of near-eye display device
CN110187855B (en) * 2019-05-28 2022-09-16 幻蝎科技(武汉)有限公司 Intelligent adjusting method for near-eye display equipment for avoiding blocking sight line by holographic image
CN111708431A (en) * 2020-05-12 2020-09-25 青岛小鸟看看科技有限公司 Human-computer interaction method and device, head-mounted display equipment and storage medium

Also Published As

Publication number Publication date
CN114779969A (en) 2022-07-22

Similar Documents

Publication Publication Date Title
EP1936478B1 (en) Position detecting device
US10895936B2 (en) Display apparatus
US6100538A (en) Optical digitizer and display means for providing display of indicated position
CN101630213B (en) Optical touch screen, optical touch screen light source and touch pen
US11003284B2 (en) Touch sensitive device with a camera
KR100782431B1 (en) Multi position detecting method and area detecting method in infrared rays type touch screen
US9292109B2 (en) Interactive input system and pen tool therefor
KR100619584B1 (en) Detail position detecing method and error correcting method in touch panel system
US20030222849A1 (en) Laser-based user input device for electronic projection displays
US20070165007A1 (en) Interactive input system
JP5366789B2 (en) Input indication tool, control method therefor, and coordinate input device
KR20110005737A (en) Interactive input system with optical bezel
WO2021000795A1 (en) Light stylus and projection system
KR20140038745A (en) Touch system comprising optical touch panel and touch pen, and method of controlling interference optical signal in touch system
US10429995B2 (en) Coordinate detecting apparatus
CN114779969B (en) Device for displaying object position on second display screen near to eye and combination thereof
CN101149658A (en) Photoelectric touch-screen
KR102499513B1 (en) Non-touch sensor module with improved recognition distance
CN103399671A (en) Double-camera touch screen and touch pen thereof
KR20130136313A (en) Touch screen system using touch pen and touch recognition metod thereof
JPH10228349A (en) Indication mark operation system for picture display device
KR20010016506A (en) Light mouse device
US20020097222A1 (en) Computer system with optical pointing device
CN114779970B (en) Device and method for displaying object position on second display screen near to eye
WO2021164455A1 (en) Liquid crystal module, electronic device and screen interaction system

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant