WO2014059731A1 - 一种光线遥控定位的方法、装置及系统 - Google Patents

一种光线遥控定位的方法、装置及系统

Info

Publication number
WO2014059731A1
Authority
WO
WIPO (PCT)
Prior art keywords
light
sensing film
end device
resistance value
point
Prior art date
Application number
PCT/CN2012/086007
Other languages
English (en)
French (fr)
Inventor
王晓晖
毛国红
文立夫
Original Assignee
深圳创维数字技术股份有限公司
深圳市创维软件有限公司
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 深圳创维数字技术股份有限公司, 深圳市创维软件有限公司 filed Critical 深圳创维数字技术股份有限公司
Priority to EP12886868.4A priority Critical patent/EP2908214A4/en
Priority to US14/420,922 priority patent/US20150222839A1/en
Publication of WO2014059731A1 publication Critical patent/WO2014059731A1/zh


Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/03Arrangements for converting the position or the displacement of a member into a coded form
    • G06F3/033Pointing devices displaced or positioned by the user, e.g. mice, trackballs, pens or joysticks; Accessories therefor
    • G06F3/038Control and interface arrangements therefor, e.g. drivers or device-embedded control circuitry
    • G06F3/0386Control and interface arrangements therefor, e.g. drivers or device-embedded control circuitry for light pen
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/03Arrangements for converting the position or the displacement of a member into a coded form
    • G06F3/0304Detection arrangements using opto-electronic means
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/03Arrangements for converting the position or the displacement of a member into a coded form
    • G06F3/033Pointing devices displaced or positioned by the user, e.g. mice, trackballs, pens or joysticks; Accessories therefor
    • G06F3/0354Pointing devices displaced or positioned by the user, e.g. mice, trackballs, pens or joysticks; Accessories therefor with detection of 2D relative movements between the device, or an operating part thereof, and a plane or surface, e.g. 2D mice, trackballs, pens or pucks
    • G06F3/03542Light pens for emitting or receiving light
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/03Arrangements for converting the position or the displacement of a member into a coded form
    • G06F3/041Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means
    • G06F3/042Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means by opto-electronic means
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/03Arrangements for converting the position or the displacement of a member into a coded form
    • G06F3/041Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means
    • G06F3/042Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means by opto-electronic means
    • G06F3/0421Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means by opto-electronic means by interrupting or reflecting a light beam, e.g. optical touch-screen
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0487Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
    • G06F3/0489Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using dedicated keyboard keys or combinations thereof
    • G06F3/04892Arrangements for controlling cursor position based on codes indicative of cursor displacements from one discrete location to another, e.g. using cursor control keys associated to different directions or using the tab key
    • HELECTRICITY
    • H01ELECTRIC ELEMENTS
    • H01LSEMICONDUCTOR DEVICES NOT COVERED BY CLASS H10
    • H01L31/00Semiconductor devices sensitive to infrared radiation, light, electromagnetic radiation of shorter wavelength or corpuscular radiation and specially adapted either for the conversion of the energy of such radiation into electrical energy or for the control of electrical energy by such radiation; Processes or apparatus specially adapted for the manufacture or treatment thereof or of parts thereof; Details thereof
    • H01L31/08Semiconductor devices sensitive to infrared radiation, light, electromagnetic radiation of shorter wavelength or corpuscular radiation and specially adapted either for the conversion of the energy of such radiation into electrical energy or for the control of electrical energy by such radiation; Processes or apparatus specially adapted for the manufacture or treatment thereof or of parts thereof; Details thereof in which radiation controls flow of current through the device, e.g. photoresistors
    • H01L31/09Devices sensitive to infrared, visible or ultraviolet radiation
    • HELECTRICITY
    • H01ELECTRIC ELEMENTS
    • H01LSEMICONDUCTOR DEVICES NOT COVERED BY CLASS H10
    • H01L31/00Semiconductor devices sensitive to infrared radiation, light, electromagnetic radiation of shorter wavelength or corpuscular radiation and specially adapted either for the conversion of the energy of such radiation into electrical energy or for the control of electrical energy by such radiation; Processes or apparatus specially adapted for the manufacture or treatment thereof or of parts thereof; Details thereof
    • H01L31/08Semiconductor devices sensitive to infrared radiation, light, electromagnetic radiation of shorter wavelength or corpuscular radiation and specially adapted either for the conversion of the energy of such radiation into electrical energy or for the control of electrical energy by such radiation; Processes or apparatus specially adapted for the manufacture or treatment thereof or of parts thereof; Details thereof in which radiation controls flow of current through the device, e.g. photoresistors
    • H01L31/10Semiconductor devices sensitive to infrared radiation, light, electromagnetic radiation of shorter wavelength or corpuscular radiation and specially adapted either for the conversion of the energy of such radiation into electrical energy or for the control of electrical energy by such radiation; Processes or apparatus specially adapted for the manufacture or treatment thereof or of parts thereof; Details thereof in which radiation controls flow of current through the device, e.g. photoresistors characterised by potential barriers, e.g. phototransistors
    • H01L31/101Devices sensitive to infrared, visible or ultraviolet radiation
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/40Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
    • H04N21/41Structure of client; Structure of client peripherals
    • H04N21/422Input-only peripherals, i.e. input devices connected to specially adapted client devices, e.g. global positioning system [GPS]
    • H04N21/42204User interfaces specially adapted for controlling a client device through a remote control device; Remote control devices therefor
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/40Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
    • H04N21/41Structure of client; Structure of client peripherals
    • H04N21/422Input-only peripherals, i.e. input devices connected to specially adapted client devices, e.g. global positioning system [GPS]
    • H04N21/42204User interfaces specially adapted for controlling a client device through a remote control device; Remote control devices therefor
    • H04N21/42206User interfaces specially adapted for controlling a client device through a remote control device; Remote control devices therefor characterized by hardware details

Definitions

  • Technical field: The present invention relates to the field of electronic technologies, and in particular to a method, device and system for light-based remote control positioning.
  • Background: A remote control is a device for controlling a machine from a distance. The traditional infrared remote control consists mainly of an integrated circuit board and buttons that generate different messages, and has the advantage of being simple and easy to use. As technology develops, however, some controlled devices require increasingly complex control functions: current televisions, for example, support web browsing and motion gaming, and their interfaces contain more and more elements, so the traditional remote control can no longer meet the new operational requirements. Moreover, a traditional infrared remote control provides no positioning capability of its own; its "positioning" is actually achieved by converting key presses into movement information that shifts the cursor position, and its position relative to the screen cannot be displayed. This makes operation inconvenient and degrades the user experience.
  • Summary of the invention: The technical problem to be solved by the embodiments of the present invention is to provide a method, device and system for light-based remote control positioning that can obtain the position coordinates of a beam spot through a light-sensing film, so as to realize remote control operation of interface elements.
  • To solve the above technical problem, an embodiment of the present invention provides a light remote control positioning method, including: sensing, by a light-sensing film covering the display screen of a receiving-end device, a beam spot formed on the light-sensing film by a light beam emitted from a transmitting-end device; and extracting the position parameters of the beam spot sensed by the light-sensing film on the light-sensing film, and calculating the position coordinates of the beam spot from the position parameters.
  • The step of extracting the position parameters of the beam spot on the light-sensing film and calculating the position coordinates of the beam spot from them includes: extracting the horizontal-axis resistance value and the vertical-axis resistance value of the beam spot on the light-sensing film; and calculating the position coordinates of the beam spot relative to the light-sensing film from the horizontal-axis resistance value and the vertical-axis resistance value of the beam spot.
  • The step of calculating the position coordinates of the beam spot relative to the display screen from the horizontal-axis and vertical-axis resistance values of the beam spot includes: presetting the full-screen horizontal-axis resistance value, the full-screen vertical-axis resistance value, the horizontal-axis length and the vertical-axis length of the light-sensing film; multiplying the ratio of the beam spot's horizontal-axis resistance value to the full-screen horizontal-axis resistance value by the horizontal-axis length of the light-sensing film to obtain the abscissa of the beam spot; and multiplying the ratio of the beam spot's vertical-axis resistance value to the full-screen vertical-axis resistance value by the vertical-axis length of the light-sensing film to obtain the ordinate of the beam spot.
  • After the extracting and calculating step, the method further includes: receiving key action information sent by the transmitting-end device; and performing, according to the key action information and the position coordinates, the operation of the beam spot at the position coordinates.
  • The light-sensing film includes a conductive layer, a resistive layer with uniform resistance, and a photoconductive layer with photosensitive properties. When irradiated by the light beam, the photoconductive layer connects the resistive layer to the conductive layer, so that the light-sensing film senses the position of the beam spot and the receiving-end device extracts the position parameters of the beam spot on the light-sensing film from the resistance of the resistive layer.
  • When the light beam emitted by the transmitting-end device is visible light, the light-sensing film directly senses the beam spot formed by the visible light on the light-sensing film.
  • When the light beam emitted by the transmitting-end device is invisible light, the display screen generates a cursor pattern at the position where the invisible light is projected onto the light-sensing film.
  • When the light beam emitted by the transmitting-end device is a mixed beam of visible and invisible light, the light-sensing film directly senses the beam spot formed by the mixed beam on the light-sensing film.
  • Correspondingly, an embodiment of the present invention further provides a receiving-end device for light remote control positioning, including: a sensing module, configured to sense, by a light-sensing film covering the display screen of the receiving-end device, a beam spot formed on the light-sensing film by a light beam emitted from a transmitting-end device; and an extraction and calculation module, configured to extract the position parameters of the beam spot sensed by the light-sensing film on the light-sensing film and to calculate the position coordinates of the beam spot from the position parameters.
  • The extraction and calculation module includes: an extraction unit, configured to extract the horizontal-axis resistance value and the vertical-axis resistance value of the beam spot on the light-sensing film; and a calculation unit, configured to calculate the position coordinates of the beam spot relative to the light-sensing film from the horizontal-axis and vertical-axis resistance values of the beam spot.
  • The calculation unit includes: a preset subunit, configured to preset the full-screen horizontal-axis resistance value, the full-screen vertical-axis resistance value, the horizontal-axis length and the vertical-axis length of the light-sensing film; an abscissa calculation subunit, configured to multiply the ratio of the beam spot's horizontal-axis resistance value to the full-screen horizontal-axis resistance value by the horizontal-axis length of the light-sensing film to obtain the abscissa of the beam spot; and an ordinate calculation subunit, configured to multiply the ratio of the beam spot's vertical-axis resistance value to the full-screen vertical-axis resistance value by the vertical-axis length of the light-sensing film to obtain the ordinate of the beam spot.
  • The receiving-end device further includes: a receiving module, configured to receive key action information sent by the transmitting-end device; and an execution module, configured to perform, according to the key action information and the position coordinates, the operation of the beam spot at the position coordinates.
  • The light-sensing film includes a conductive layer, a resistive layer with uniform resistance, and a photoconductive layer with photosensitive properties. When irradiated by the light beam, the photoconductive layer connects the resistive layer to the conductive layer, so that the light-sensing film senses the position of the beam spot and the receiving-end device extracts the position parameters of the beam spot on the light-sensing film from the resistance of the resistive layer.
  • When the light beam emitted by the transmitting-end device is visible light, the light-sensing film directly senses the beam spot formed by the visible light on the light-sensing film.
  • When the light beam emitted by the transmitting-end device is invisible light, the display screen generates a cursor pattern at the position where the invisible light is projected onto the light-sensing film.
  • When the light beam emitted by the transmitting-end device is a mixed beam of visible and invisible light, the light-sensing film directly senses the beam spot formed by the mixed beam on the light-sensing film.
  • An embodiment of the present invention further provides a light remote control positioning system, including a transmitting-end device and the above receiving-end device. The transmitting-end device is configured to emit a light beam that forms a beam spot on the display screen of the receiving-end device, and to send key action information to the receiving-end device when a key is operated.
  • In the embodiments of the present invention, the light-sensing film senses the beam spot formed on it by the light beam emitted from the transmitting-end device, the position parameters of the beam spot on the light-sensing film are extracted, and the position coordinates of the beam spot are calculated from the position parameters. Because the beam spot points at the interface element to be operated, the user can perform precise remote control operation on interface elements, which improves the user experience.
  • FIG. 1 is a schematic flowchart of a first embodiment of the light remote control positioning method according to an embodiment of the present invention;
  • FIG. 2 is a schematic flowchart of a second embodiment of the light remote control positioning method according to an embodiment of the present invention;
  • FIG. 3 is a schematic structural diagram of a light remote control positioning system according to an embodiment of the present invention;
  • FIG. 4 is a schematic structural diagram of a receiving-end device according to an embodiment of the present invention;
  • FIG. 5 is a schematic structural diagram of the extraction and calculation module of FIG. 4;
  • FIG. 6 is a schematic structural diagram of the calculation unit of FIG. 5.
  • Detailed description: Embodiments of the present invention provide a method, device and system for light remote control positioning that can acquire the position coordinates of a beam spot through a light-sensing film, so as to realize remote control operation of interface elements. Specific embodiments are described below.
  • FIG. 1 is a schematic flowchart of a first embodiment of the light remote control positioning method according to an embodiment of the present invention. As shown in FIG. 1, the method of this embodiment includes the following steps.
  • S101: a light-sensing film covering the display screen of the receiving-end device senses a beam spot formed on the light-sensing film by a light beam emitted from the transmitting-end device.
  • The light beam emitted by the transmitting-end device may be visible light, invisible light, or a mixed beam of visible and invisible light; it may also be light in a band that the display screen cannot emit, such as ultraviolet light.
  • When the light beam emitted by the transmitting-end device is visible light, the light-sensing film directly senses the position of the illuminated point. As the user moves the remote control, the light spot projected onto the display screen moves with the beam, and the user operates interface elements through the beam spot formed by the visible light beam.
  • When the light beam emitted by the transmitting-end device is invisible light of a certain intensity, such as infrared or ultraviolet light, the light-sensing film senses the position where the invisible beam is projected, and the display screen of the receiving-end device immediately displays a cursor-like pattern at that position. As the user moves the remote control, the projected spot moves, and the receiving-end device continuously refreshes the cursor pattern on the display screen according to the beam spot position sensed by the light-sensing film, to give the user an indication.
  • When the light beam emitted by the transmitting-end device is a mixed beam of visible and invisible light, the visible light is used to give the user an indication, while the invisible light is used to let the light-sensing film sense the coordinates of the illuminated point. As the user moves the remote control, the mixed light spot projected onto the display screen moves with the beam, and the user operates interface elements through the mixed light spot formed by the mixed beam.
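  • As an illustration of the cursor-feedback behaviour described above for an invisible beam, the following Python sketch shows one way a receiving-end device could poll the sensed beam-spot position and redraw an on-screen cursor. It is only a minimal sketch under assumed interfaces; the callbacks standing in for the film hardware and the display are hypothetical and are not specified in the patent.

```python
import time
from typing import Callable, Optional, Tuple

Position = Tuple[float, float]

def cursor_refresh_loop(read_sensed_position: Callable[[], Optional[Position]],
                        draw_cursor: Callable[[float, float], None],
                        poll_interval_s: float = 0.01,
                        max_polls: int = 1000) -> None:
    """Poll the beam-spot position sensed by the light-sensing film and redraw a
    cursor-like pattern on the display screen whenever the spot moves."""
    last_position: Optional[Position] = None
    for _ in range(max_polls):
        position = read_sensed_position()
        if position is not None and position != last_position:
            draw_cursor(*position)   # refresh the cursor at the new spot position
            last_position = position
        time.sleep(poll_interval_s)

# Illustrative usage with stand-ins for the film hardware and the display.
readings = iter([(100.0, 200.0), (100.0, 200.0), (120.0, 210.0)])
cursor_refresh_loop(lambda: next(readings, None),
                    lambda x, y: print(f"cursor drawn at ({x}, {y})"),
                    poll_interval_s=0.0, max_polls=3)
```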
  • S102: the position parameters of the beam spot sensed by the light-sensing film on the light-sensing film are extracted, and the position coordinates of the beam spot are calculated from the position parameters.
  • The light-sensing film may include a conductive layer, a resistive layer with uniform resistance, and a photoconductive layer with photosensitive properties. That is, the film may be divided into three layers: the first layer is a resistive layer of uniform resistance, whose resistance is constant and evenly distributed; the second layer is a photoconductive layer with photosensitive properties, whose resistance is very large when no beam illuminates it and drops rapidly to a very small value once a beam illuminates it; the third layer is a conductive layer whose resistance is so small that it is almost negligible. After being irradiated by the light beam, the photoconductive layer connects the resistive layer to the conductive layer, so that the light-sensing film senses the position of the beam spot, and the receiving-end device extracts the position parameters of the beam spot on the light-sensing film from the resistance of the resistive layer.
  • For example, when the beam enters at some point on the first layer, let A be the point where the beam strikes the first layer and B the point where it strikes the second layer. Since the resistance of the second layer drops rapidly to a very small value under illumination, the resistance at point B becomes very small, which is equivalent to point A of the first layer being conducted to the third layer through the now low-resistance point B of the second layer.
  • At this time, the receiving-end device extracts the resistance value at point A, that is, the position parameter of the beam spot on the light-sensing film, and calculates the position coordinates of the beam spot from this position parameter, so that the receiving-end device can locate the interface element according to the position coordinates.
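  • To make the resistance-based sensing mechanism concrete, the sketch below models the idealised behaviour just described: the uniformly resistive first layer contributes a resistance proportional to how far the illuminated point A lies from the measuring edge electrode, the illuminated point B of the photoconductive layer behaves like a closed switch, and the conductive third layer contributes nothing. The class name, the linear-resistance model and the numeric values are illustrative assumptions for this sketch, not figures taken from the patent.

```python
from dataclasses import dataclass

@dataclass
class LightSensingFilmModel:
    """Idealised three-layer light-sensing film.

    r1_full: full-screen horizontal-axis resistance of the resistive layer (ohms)
    r2_full: full-screen vertical-axis resistance of the resistive layer (ohms)
    width:   horizontal-axis length of the film
    height:  vertical-axis length of the film
    """
    r1_full: float
    r2_full: float
    width: float
    height: float

    def measured_resistances(self, x: float, y: float):
        """Resistances seen between the first-layer edge electrodes and the
        third-layer electrode when a beam illuminates the film at (x, y).

        Under the uniform-resistance assumption, the resistive layer between the
        measuring electrode and point A scales linearly with distance; the
        illuminated photoconductive point B and the conductive layer are treated
        as ideal conductors."""
        if not (0.0 <= x <= self.width and 0.0 <= y <= self.height):
            raise ValueError("beam spot lies outside the film")
        r3 = self.r1_full * (x / self.width)    # horizontal-axis resistance at the spot
        r4 = self.r2_full * (y / self.height)   # vertical-axis resistance at the spot
        return r3, r4

# Illustrative numbers only.
film = LightSensingFilmModel(r1_full=1000.0, r2_full=800.0, width=1920.0, height=1080.0)
print(film.measured_resistances(480.0, 540.0))  # -> (250.0, 400.0)
```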
  • In this embodiment, the light-sensing film senses the beam spot formed on it by the light beam emitted from the transmitting-end device, the position parameters of the beam spot on the light-sensing film are extracted, and the position coordinates of the beam spot are calculated from the position parameters. Because the beam spot points at the interface element to be operated, the user can perform precise remote control operation on interface elements, which improves the user experience.
  • FIG. 2 is a schematic flowchart of a second embodiment of the light remote control positioning method according to an embodiment of the present invention. As shown in FIG. 2, the method of this embodiment includes the following steps.
  • S201: a light-sensing film covering the display screen of the receiving-end device senses a beam spot formed on the light-sensing film by a light beam emitted from the transmitting-end device.
  • As in the first embodiment, the light beam emitted by the transmitting-end device may be visible light, invisible light, or a mixed beam of visible and invisible light, and may be light in a band that the display screen cannot emit, such as ultraviolet light.
  • When the beam is visible light, the light-sensing film directly senses the position of the illuminated point; as the user moves the remote control, the projected light spot moves, and the user operates interface elements through the beam spot formed by the visible beam.
  • When the beam is invisible light of a certain intensity, such as infrared or ultraviolet light, the light-sensing film senses the position where the invisible beam is projected, and the display screen of the receiving-end device immediately displays a cursor-like pattern at that position; as the user moves the remote control, the receiving-end device continuously refreshes the cursor pattern on the display screen according to the beam spot position sensed by the film, to give the user an indication.
  • When the beam is a mixed beam of visible and invisible light, the visible light gives the user an indication, the invisible light lets the light-sensing film sense the coordinates of the illuminated point, and the user operates interface elements through the mixed light spot as it moves with the remote control.
  • S202: the horizontal-axis resistance value and the vertical-axis resistance value of the beam spot on the light-sensing film are extracted.
  • As described above, the light-sensing film may consist of three layers: a resistive layer with uniform, evenly distributed resistance; a photoconductive layer whose resistance is very large without illumination and drops rapidly to a very small value once a beam illuminates it; and a conductive layer whose resistance is almost negligible. After being irradiated by the beam, the photoconductive layer connects the resistive layer to the conductive layer, so that the film senses the position of the beam spot and the receiving-end device extracts the position parameters of the beam spot from the resistance of the resistive layer. With A denoting the point where the beam strikes the first layer and B the point where it strikes the second layer, the resistance at B drops to a very small value under illumination, so point A is conducted to the third layer through point B. The receiving-end device then extracts the resistance value at point A, that is, the horizontal-axis resistance value and the vertical-axis resistance value of the beam spot.
  • S203: the full-screen horizontal-axis resistance value, the full-screen vertical-axis resistance value, the horizontal-axis length and the vertical-axis length of the light-sensing film are preset.
  • The first layer of the light-sensing film has an electrode on each of its four sides. The full-screen horizontal-axis resistance value is the resistance between the left and right electrodes of the first layer, denoted R1; the full-screen vertical-axis resistance value is the resistance between the top and bottom electrodes of the first layer, denoted R2. The horizontal-axis length of the light-sensing film is denoted X, and the vertical-axis length is denoted Y.
  • S204: the ratio of the beam spot's horizontal-axis resistance value to the full-screen horizontal-axis resistance value is multiplied by the horizontal-axis length of the light-sensing film to obtain the abscissa of the beam spot.
  • Let the beam spot be P. When the film is illuminated, the illuminated point P of the first layer is effectively connected to the electrode of the third layer, so the receiving-end device can measure, between a first-layer electrode and the third-layer electrode, the horizontal-axis resistance value and the vertical-axis resistance value of the beam spot at P. Let the horizontal-axis resistance value of the beam spot be R3; the abscissa Xp of the beam spot P is then given by formula 1:
  • Xp = X * (R3/R1)
  • S205: the ratio of the beam spot's vertical-axis resistance value to the full-screen vertical-axis resistance value is multiplied by the vertical-axis length of the light-sensing film to obtain the ordinate of the beam spot. Let the vertical-axis resistance value of the beam spot be R4; the ordinate Yp of the beam spot P is then given by formula 2:
  • Yp = Y * (R4/R2)
  • By calculating the position coordinates of the beam spot precisely, the interface element to be operated can be located precisely.
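  • The two formulas above translate directly into code. The following sketch is a minimal illustration rather than the patent's implementation; the parameter names simply mirror the symbols R1, R2, R3, R4, X and Y used in the text.

```python
from typing import Tuple

def beam_spot_coordinates(r3: float, r4: float,
                          r1_full: float, r2_full: float,
                          width: float, height: float) -> Tuple[float, float]:
    """Compute the beam-spot coordinates (Xp, Yp) on the light-sensing film.

    r3, r4           -- horizontal-/vertical-axis resistance measured at the spot
    r1_full, r2_full -- preset full-screen horizontal-/vertical-axis resistances
    width, height    -- preset horizontal-/vertical-axis lengths of the film
    """
    if r1_full <= 0 or r2_full <= 0:
        raise ValueError("full-screen resistances must be positive")
    xp = width * (r3 / r1_full)    # formula 1: Xp = X * (R3 / R1)
    yp = height * (r4 / r2_full)   # formula 2: Yp = Y * (R4 / R2)
    return xp, yp

# Reusing the illustrative numbers from the film model sketch above:
# R1 = 1000, R2 = 800, X = 1920, Y = 1080, measured R3 = 250, R4 = 400.
print(beam_spot_coordinates(250.0, 400.0, 1000.0, 800.0, 1920.0, 1080.0))  # -> (480.0, 540.0)
```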
  • S206: key action information sent by the transmitting-end device is received.
  • The receiving-end device may receive the key action information sent by the transmitting-end device by infrared or other wireless means.
  • S207: the operation of the beam spot at the position coordinates is performed according to the key action information and the position coordinates.
  • When the receiving-end device receives the key action information, it performs, according to the position coordinates of the beam spot, the series of operations corresponding to the interface element at those coordinates. For example, if the beam spot is moved onto an HTTP webpage link, the receiving-end device acquires the beam spot position through the light-sensing film, and after a key is operated it controls the terminal, such as a television, to open the webpage link at the beam spot position.
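  • As an illustration of step S207, the sketch below shows one plausible way a receiving-end device could map the received key action plus the computed coordinates onto an operation on the interface element under the beam spot. The element registry, the hit-test routine and the "OK" key name are assumptions made for the example, not details given in the patent.

```python
from dataclasses import dataclass
from typing import Callable, Dict, List

@dataclass
class InterfaceElement:
    """A rectangular on-screen element, e.g. a webpage link, with per-key actions."""
    name: str
    x: float
    y: float
    width: float
    height: float
    actions: Dict[str, Callable[[], None]]

    def contains(self, px: float, py: float) -> bool:
        return (self.x <= px <= self.x + self.width
                and self.y <= py <= self.y + self.height)

def handle_key_action(key: str, xp: float, yp: float,
                      elements: List[InterfaceElement]) -> None:
    """Perform the operation of the beam spot at (xp, yp) for the received key action."""
    for element in elements:
        if element.contains(xp, yp) and key in element.actions:
            element.actions[key]()   # e.g. open the webpage link under the spot
            return

# Illustrative usage: an "OK" key press while the beam spot rests on a link.
link = InterfaceElement("news link", 400, 500, 200, 50,
                        {"OK": lambda: print("opening webpage link")})
handle_key_action("OK", 480.0, 540.0, [link])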
  • In this embodiment, the light-sensing film senses the beam spot formed on it by the light beam emitted from the transmitting-end device, the position parameters of the beam spot on the light-sensing film are extracted, and the position coordinates of the beam spot are calculated from the position parameters. Because the beam spot points at the interface element to be operated, the user can perform precise remote control operation on interface elements, which improves the user experience.
  • FIG. 3 is a schematic structural diagram of a light remote control positioning system according to an embodiment of the present invention. The system includes a transmitting-end device 1 and a receiving-end device 2.
  • The transmitting-end device 1 is configured to emit a light beam that forms a beam spot on the display screen of the receiving-end device, and to send key action information to the receiving-end device when a key is operated.
  • The light beam emitted by the transmitting-end device 1 may be visible light, invisible light, or a mixed beam of visible and invisible light; it may also be light in a band that the display screen cannot emit, such as ultraviolet light. When the user operates a key, the transmitting-end device 1 may send the key action information to the receiving-end device 2 by infrared or other wireless means.
  • The receiving-end device 2 is configured to sense, by a light-sensing film covering its display screen, the beam spot formed on the light-sensing film by the light beam emitted from the transmitting-end device, to extract the position parameters of the beam spot sensed by the light-sensing film on the light-sensing film, and to calculate the position coordinates of the beam spot from the position parameters.
  • When the beam emitted by the transmitting-end device is visible light, the light-sensing film directly senses the position of the illuminated point; as the user moves the remote control, the projected light spot moves, and the user operates interface elements through the beam spot formed by the visible beam.
  • When the beam is invisible light of a certain intensity, such as infrared or ultraviolet light, the light-sensing film senses the position where the invisible beam is projected, and the display screen of the receiving-end device immediately displays a cursor-like pattern at that position; as the user moves the remote control, the receiving-end device continuously refreshes the cursor pattern on the display screen according to the beam spot position sensed by the film, to give the user an indication.
  • When the beam is a mixed beam of visible and invisible light, the visible light gives the user an indication, the invisible light lets the light-sensing film sense the coordinates of the illuminated point, and the user operates interface elements through the mixed light spot as it moves with the remote control.
  • The light-sensing film may consist of three layers: a resistive layer with uniform, evenly distributed resistance; a photoconductive layer whose resistance is very large without illumination and drops rapidly to a very small value once a beam illuminates it; and a conductive layer whose resistance is almost negligible. After being irradiated by the beam, the photoconductive layer connects the resistive layer to the conductive layer, so that the film senses the position of the beam spot and the receiving-end device extracts the position parameters of the beam spot on the light-sensing film from the resistance of the resistive layer. With A denoting the point where the beam strikes the first layer and B the point where it strikes the second layer, the resistance at B drops to a very small value under illumination, so point A of the first layer is conducted to the third layer through point B. The receiving-end device then extracts the resistance value at point A, that is, the position parameter of the beam spot on the light-sensing film, calculates the position coordinates of the beam spot from this parameter, and locates the interface element according to the position coordinates.
  • The receiving-end device 2 is further configured to receive the key action information sent by the transmitting-end device and to perform, according to the key action information and the position coordinates, the operation of the beam spot at the position coordinates.
  • Because the transmitting-end device 1 makes the light-sensing film respond by emitting light, and the display screen itself also emits light, interference between the two could defeat the light-sensing film. Three countermeasures can address this:
  • First, increase the light emission intensity of the transmitting-end device 1, so that the light-sensing film responds strongly when the beam strikes it.
  • Second, use light in a band that the display screen cannot emit, such as ultraviolet light.
  • Third, add a filter film between the light-sensing film and the display screen, like the ultraviolet filter coating on an eyeglass lens, to reduce the influence of the light emitted by the display screen on the light-sensing film.
  • The receiving-end device 2 of FIG. 3 is described in detail below.
  • FIG. 4 is a schematic structural diagram of a receiving-end device according to an embodiment of the present invention. The receiving-end device 2 includes a sensing module 10 and an extraction and calculation module 20.
  • The sensing module 10 is configured to sense, by a light-sensing film covering the display screen of the receiving-end device, a beam spot formed on the light-sensing film by a light beam emitted from the transmitting-end device. The light beam may be visible light, invisible light, or a mixed beam of visible and invisible light, and may be light in a band that the display screen cannot emit, such as ultraviolet light.
  • The extraction and calculation module 20 is configured to extract the position parameters of the beam spot sensed by the light-sensing film on the light-sensing film and to calculate the position coordinates of the beam spot from the position parameters.
  • As described above, the light-sensing film may consist of a resistive layer with uniform, evenly distributed resistance, a photoconductive layer whose resistance is very large without illumination and drops rapidly to a very small value once a beam illuminates it, and a conductive layer whose resistance is almost negligible; after being irradiated by the beam, the photoconductive layer connects the resistive layer to the conductive layer, so that the film senses the position of the beam spot. With A denoting the point where the beam strikes the first layer and B the point where it strikes the second layer, the resistance at B drops to a very small value under illumination, so point A is conducted to the third layer through point B. The extraction and calculation module 20 extracts the resistance value at point A, that is, the position parameter of the beam spot on the light-sensing film, and calculates the position coordinates of the beam spot from this parameter, so that the receiving-end device can locate the interface element according to the position coordinates.
  • The receiving-end device further includes a receiving module and an execution module.
  • The receiving module is configured to receive the key action information sent by the transmitting-end device, which it may receive by infrared or other wireless means.
  • The execution module is configured to perform, according to the key action information and the position coordinates, the operation of the beam spot at the position coordinates. When the receiving module receives the key action information, the execution module performs, according to the position coordinates of the beam spot, the series of operations corresponding to the interface element at those coordinates. For example, if the beam spot is moved onto an HTTP webpage link, the receiving-end device acquires the beam spot position through the light-sensing film, and after a key is operated it controls the terminal, such as a television, to open the webpage link at the beam spot position.
  • The extraction and calculation module 20 of FIG. 4 is described in detail below.
  • FIG. 5 is a schematic structural diagram of the extraction and calculation module of FIG. 4. The extraction and calculation module 20 includes an extraction unit 201 and a calculation unit 202.
  • The extraction unit 201 is configured to extract the horizontal-axis resistance value and the vertical-axis resistance value of the beam spot on the light-sensing film sensed by the light-sensing film. As described above, when the beam strikes the film, point A of the uniformly resistive first layer is conducted to the conductive third layer through the illuminated, now low-resistance point B of the photoconductive second layer; the extraction unit 201 extracts the resistance value at point A, that is, the horizontal-axis resistance value and the vertical-axis resistance value of the beam spot.
  • The calculation unit 202 is configured to calculate the position coordinates of the beam spot relative to the light-sensing film from the horizontal-axis resistance value and the vertical-axis resistance value of the beam spot.
  • The calculation unit 202 of FIG. 5 is described in detail below.
  • FIG. 6 is a schematic structural diagram of the calculation unit of FIG. 5. The calculation unit 202 includes a preset subunit 2021, an abscissa calculation subunit 2022 and an ordinate calculation subunit 2023.
  • The preset subunit 2021 is configured to preset the full-screen horizontal-axis resistance value, the full-screen vertical-axis resistance value, the horizontal-axis length and the vertical-axis length of the light-sensing film. The first layer of the light-sensing film has an electrode on each of its four sides; the full-screen horizontal-axis resistance value is the resistance between the left and right electrodes of the first layer, denoted R1, and the full-screen vertical-axis resistance value is the resistance between the top and bottom electrodes of the first layer, denoted R2. The horizontal-axis length of the light-sensing film is denoted X, and the vertical-axis length is denoted Y.
  • The abscissa calculation subunit 2022 is configured to multiply the ratio of the beam spot's horizontal-axis resistance value to the full-screen horizontal-axis resistance value by the horizontal-axis length of the light-sensing film to obtain the abscissa of the beam spot. Let the beam spot be P; when the film is illuminated, the illuminated point P of the first layer is effectively connected to the electrode of the third layer, so the receiving-end device can measure, between a first-layer electrode and the third-layer electrode, the horizontal-axis resistance value and the vertical-axis resistance value of the beam spot at P. Let the horizontal-axis resistance value of the beam spot be R3; the abscissa Xp of the beam spot P is then given by formula 1:
  • Xp = X * (R3/R1)
  • The ordinate calculation subunit 2023 is configured to multiply the ratio of the beam spot's vertical-axis resistance value to the full-screen vertical-axis resistance value by the vertical-axis length of the light-sensing film to obtain the ordinate of the beam spot. Let the vertical-axis resistance value of the beam spot be R4; the ordinate Yp of the beam spot P is then given by formula 2:
  • Yp = Y * (R4/R2)
  • By calculating the position coordinates of the beam spot precisely, the interface element to be operated can be located precisely.
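  • To summarise how the modules of FIGS. 4 to 6 fit together, the following Python sketch mirrors the decomposition into an extraction unit plus a calculation unit with preset, abscissa and ordinate subunits. It is a structural illustration under assumed interfaces, not the patent's implementation; in particular the raw-reading callback standing in for the film is hypothetical.

```python
from dataclasses import dataclass
from typing import Callable, Tuple

@dataclass
class PresetSubunit:
    """Preset subunit 2021: holds R1, R2, X and Y for the light-sensing film."""
    r1_full: float
    r2_full: float
    width: float
    height: float

class CalculationUnit:
    """Calculation unit 202: abscissa subunit 2022 and ordinate subunit 2023."""
    def __init__(self, preset: PresetSubunit):
        self.preset = preset

    def abscissa(self, r3: float) -> float:
        return self.preset.width * (r3 / self.preset.r1_full)    # Xp = X * (R3/R1)

    def ordinate(self, r4: float) -> float:
        return self.preset.height * (r4 / self.preset.r2_full)   # Yp = Y * (R4/R2)

class ExtractionCalculationModule:
    """Extraction and calculation module 20: extraction unit 201 + calculation unit 202."""
    def __init__(self, read_spot_resistances: Callable[[], Tuple[float, float]],
                 calculation_unit: CalculationUnit):
        self.read_spot_resistances = read_spot_resistances  # hypothetical film interface
        self.calculation_unit = calculation_unit

    def beam_spot_coordinates(self) -> Tuple[float, float]:
        r3, r4 = self.read_spot_resistances()                # extraction unit 201
        return (self.calculation_unit.abscissa(r3),
                self.calculation_unit.ordinate(r4))

# Illustrative usage with fixed readings standing in for the film hardware.
module = ExtractionCalculationModule(lambda: (250.0, 400.0),
                                     CalculationUnit(PresetSubunit(1000.0, 800.0, 1920.0, 1080.0)))
print(module.beam_spot_coordinates())  # -> (480.0, 540.0)
```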
  • In the embodiments of the present invention, the light-sensing film senses the beam spot formed on it by the light beam emitted from the transmitting-end device, the position parameters of the beam spot on the light-sensing film are extracted, and the position coordinates of the beam spot are calculated from the position parameters. Because the beam spot points at the interface element to be operated, the user can perform precise remote control operation on interface elements, which improves the user experience.
  • Those of ordinary skill in the art can understand that all or part of the flows of the above method embodiments can be implemented by a computer program instructing the relevant hardware; the program may be stored in a computer-readable storage medium and, when executed, may include the flows of the above method embodiments. The storage medium may be a magnetic disk, an optical disc, a read-only memory (ROM), or a random access memory (RAM).

Landscapes

  • Engineering & Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Human Computer Interaction (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Multimedia (AREA)
  • Signal Processing (AREA)
  • Condensed Matter Physics & Semiconductors (AREA)
  • Microelectronics & Electronic Packaging (AREA)
  • Power Engineering (AREA)
  • Computer Hardware Design (AREA)
  • Electromagnetism (AREA)
  • Position Input By Displaying (AREA)

Abstract

A light remote control positioning method, device and system, capable of obtaining the position coordinates of a beam spot through a light-sensing film so as to realize remote control operation of interface elements. The method includes the following steps: sensing, by a light-sensing film covering the display screen of a receiving-end device, a beam spot formed on the light-sensing film by a light beam emitted from a transmitting-end device (S101); and extracting the position parameters of the beam spot sensed by the light-sensing film on the light-sensing film, and calculating the position coordinates of the beam spot from the position parameters (S102).

Description

一种光线遥控定位的方法、 装置及系统 技术领域
本发明涉及电子技术领域, 尤其涉及一种光线遥控定位的方法、 装置及系 统。 背景技术
遥控器是一种用来远控机械的装置, 传统的红外遥控器主要是由集成电路 电板和用来产生不同讯息的按钮所组成, 具有简单易用的优点, 但随着技术的 发展、 有些控制设备要求控制的功能越来越复杂, 如目前的电视已具有网页浏 览、 玩动感游戏的功能, 界面元素越来越多, 因此传统的遥控器已经无法满足 新的操作需求, 而且传统的红外遥控器, 本身并不提供定位能力, 它的定位能 力实际上靠上面所带的按键的操作转化成移动的信息, 进而改变位置定位, 其 与屏幕的相对位置不能被显示出来, 用户操作时不够方便, 将会降低用户的体 验。 发明内容
本发明实施例所要解决的技术问题在于, 提供一种光线遥控定位的方法、 装置及系统,可通过光线感应膜获取光束点的位置坐标, 以实现对界面元素的遥 控操作。
为了解决上述技术问题, 本发明实施例提供了一种光线遥控定位的方法, 包括:
通过覆盖在接收端装置的显示屏幕上的光线感应膜感知发射端装置发射的 光束在所述光线感应膜上所形成的光束点;
提取所述光线感应膜感知的所述光束点在所述光线感应膜上的位置参数, 并通过所述位置参数计算出所述光束点的位置坐标。
其中, 所述提取所述光线感应膜感知的所述光束点在所述光线感应膜上的 位置参数, 并通过所述位置参数计算出所述光束点的位置坐标的步骤包括: 提取所述光线感应膜感知的所述光束点在所述光线感应膜上的光束点横轴 电阻值和光束点纵轴电阻值; 根据所述光束点横轴电阻值和所述光束点纵轴电阻值计算出所述光束点相 对于所述光线感应膜的位置坐标。
其中, 所述根据所述光束点横轴电阻值和所述光束点纵轴电阻值计算出所 述光束点相对于所述显示屏幕的位置坐标的步骤包括:
预设所述光线感应膜的全屏横轴电阻值、 全屏纵轴电阻值、 光线感应膜横 轴长度以及光线感应膜纵轴长度; 感应膜横轴长度得到所述光束点的横坐标; 感应膜纵轴长度得到所述光束点的纵坐标。
其中, 所述提取所述光线感应膜感知的所述光束点在所述光线感应膜上的 位置参数, 并通过所述位置参数计算出所述光束点的位置坐标的步骤之后, 还 包括:
接收所述发射端装置发送的按键动作信息;
根据所述按键动作信息和所述位置坐标执行所述光束点在所述位置坐标处 的操作。
其中, 所述光线感应膜包括导电层、 具有均匀阻值的电阻层以及具有光敏 特性的光导层, 所述光导层被所述光束照射后将连通所述电阻层与所述导电层, 以使所述光线感应膜感知所述光束点的位置, 并由所述接收端装置根据所述电 阻层的电阻值提取所述光束点在所述光线感应膜上的位置参数。
其中, 当所述发射端装置发射的光束为可见光时, 所述光线感应膜直接感 应所述可见光在所述光线感应膜上形成的光束点。
其中, 当所述发射端装置发射的光束为不可见光时, 所述显示屏幕根据所 述不可见光在所述光线感应膜的投射位置上生成一个光标图案。
其中, 当所述发射端装置发射的光束为可见光及不可见光的混合光束时, 所述光线感应膜直接感应所述混合光束在所述光线感应膜上形成的光束点。
相应地, 本发明实施例还提供了一种光线遥控定位的接收端装置, 包括: 感知模块, 用于通过覆盖在接收端装置的显示屏幕上的光线感应膜感知发 射端装置发射的光束在所述光线感应膜上所形成的光束点;
提取计算模块, 用于提取所述光线感应膜感知的所述光束点在所述光线感 应膜上的位置参数, 并通过所述位置参数计算出所述光束点的位置坐标。
其中, 所述提取计算模块包括:
提取单元, 用于提取所述光线感应膜感知的所述光束点在所述光线感应膜 上的光束点横轴电阻值和光束点纵轴电阻值;
计算单元, 用于根据所述光束点横轴电阻值和所述光束点纵轴电阻值计算 出所述光束点相对于所述光线感应膜的位置坐标。
其中, 所述计算单元包括:
预设子单元, 用于预设所述光线感应膜的全屏横轴电阻值、 全屏纵轴电阻 值、 光线感应膜横轴长度以及光线感应膜纵轴长度;
横坐标计算子单元, 用于计算所述光束点横轴电阻值占所述全屏横轴电阻 值的比例再乘以所述光线感应膜横轴长度得到所述光束点的横坐标;
纵坐标计算子单元, 用于计算所述光束点纵轴电阻值占所述全屏纵轴电阻 值的比例再乘以所述光线感应膜纵轴长度得到所述光束点的纵坐标。
其中, 还包括:
接收模块, 用于接收所述发射端装置发送的按键动作信息;
执行模块, 用于根据所述按键动作信息和所述位置坐标执行所述光束点在 所述位置坐标处的操作。
其中, 所述光线感应膜包括导电层、 具有均匀阻值的电阻层以及具有光敏 特性的光导层, 所述光导层被所述光束照射后将连通所述电阻层与所述导电层, 以使所述光线感应膜感知所述光束点的位置, 并由所述接收端装置根据所述电 阻层的电阻值提取所述光束点在所述光线感应膜上的位置参数。
其中, 当所述发射端装置发射的光束为可见光时, 所述光线感应膜直接感 应所述可见光在所述光线感应膜上形成的光束点。
其中, 当所述发射端装置发射的光束为不可见光时, 所述显示屏幕根据所 述不可见光在所述光线感应膜的投射位置上生成一个光标图案。
其中, 当所述发射端装置发射的光束为可见光及不可见光的混合光束时, 所述光线感应膜直接感应所述混合光束在所述光线感应膜上形成的光束点。
相应地, 本发明实施例还提供了一种光线遥控定位的系统, 包括发射端装 置和上述的接收端装置:
所述发射端装置, 用于发射光束到所述接收端装置的显示屏幕上形成光束 点, 并且当操作按键时, 发射按键动作信息到所述接收端装置。
实施本发明实施例, 具有如下有益效果:
本发明实施例通过光线感应膜感知发射端装置发射的光束在所述光线感应 膜上所形成的光束点, 并提取所述光线感应膜上的所述光束点的位置参数, 根 据所述位置参数计算出所述光束点的位置坐标, 所述光束点指向要操作的界面 元素, 让用户对界面元素实现精确的遥控操作, 提高了用户的体验。 附图说明
为了更清楚地说明本发明实施例或现有技术中的技术方案, 下面将对实施 例或现有技术描述中所需要使用的附图作简单地介绍, 显而易见地, 下面描述 中的附图仅仅是本发明的一些实施例, 对于本领域普通技术人员来讲, 在不付 出创造性劳动的前提下, 还可以根据这些附图获得其他的附图。
图 1 是本发明实施例提供的一种光线遥控定位的方法的第一实施例的流程 示意图;
图 2是本发明实施例提供的一种光线遥控定位的方法的第二实施例的流程 示意图;
图 3是本发明实施例提供的一种光线遥控定位的系统的结构示意图; 图 4是本发明实施例提供的一种接收端装置的结构示意图;
图 5是图 4中提取计算模块的结构示意图;
图 6是图 5中计算单元的结构示意图。 具体实施方式
下面将结合本发明实施例中的附图, 对本发明实施例中的技术方案进行清 楚、 完整地描述, 显然, 所描述的实施例仅仅是本发明一部分实施例, 而不是 全部的实施例。 基于本发明中的实施例, 本领域普通技术人员在没有作出创造 性劳动前提下所获得的所有其他实施例, 都属于本发明保护的范围。
本发明实施例提供了一种光线遥控定位的方法、 装置及系统, 能够通过光 线感应膜获取光束点的位置坐标, 以实现对界面元素的遥控操作。 下面通过具 体实施例进行说明。
请参阅图 1 ,为本发明实施例提供的一种光线遥控定位的方法的第一实施例 的流程示意图。 如图 1所示, 本发明实施例的所述方法包括以下步骤:
5101 , 通过覆盖在接收端装置的显示屏幕上的光线感应膜感知发射端装置 发射的光束在所述光线感应膜上所形成的光束点;
具体的, 所述发射端装置发射的光束可以是可见光, 也可以是不可见光, 还可以是可见光及不可见光的混合光束; 所述发射端装置发射的光束可以是显 示屏不能发射的光语段光线, 如紫外线。
当所述发射端装置发射的光束是可见光时, 所述光线感应膜直接感知光束 照射点的位置, 用户移动遥控器, 投射到所述显示屏幕上的光点及光束也跟着 移动, 用户通过可见光光束所形成的光束点对界面元素进行操作。
当所述发射端装置发射的光束是不可见光时, 所述不可见光具有一定强度, 如红外线或紫外线等等, 所述光线感应膜感知所述不可见光束投射的位置, 所 述接收端装置的显示屏幕立即在所述不可见光束投射的位置处显示一个光标类 似的图案。 用户移动遥控器, 投射到所述显示屏幕上的光点及光束也跟着移动, 同时所述接收端装置随时根据所述光线感应膜感知到的光束点的位置即时刷新 显示屏幕上的光标图案, 用以给用户指示。
当所述发射端装置发射的光束是可见光及不可见光的混合光束时, 可见光 用来给与用户指示, 不可见光用来让所述光线感应膜感知光束照射点的坐标位 置, 用户移动遥控器, 投射到所述显示屏幕上的混合光点及光束也跟着移动, 用户通过混合光束所形成的混合光点对界面元素进行操作。
5102, 提取所述光线感应膜感知的所述光束点在所述光线感应膜上的位置 参数, 并通过所述位置参数计算出所述光束点的位置坐标;
具体的, 所述光线感应膜可以包括导电层、 具有均匀阻值的电阻层以及具 有光敏特性的光导层, 即所述光线感应膜可以分为三层, 第一层为均勾阻值的 电阻层, 此层的电阻值是不变的且均匀分布的; 第二层为具有光敏特性的光导 层, 此层特性是没有光束照射时候电阻很大, 一旦有光束照射其阻值迅速下降 到很小; 第三层为电阻值可以忽略的导电层, 此层的电阻很小, 几乎可以忽略 不计; 所述光导层被所述光束照射后将连通所述电阻层与所述导电层, 以使所 述光线感应膜感知所述光束点的位置, 并由所述接收端装置根据所述电阻层的 电阻值提取所述光束点在所述光线感应膜上的位置参数。 例如, 当光束从第一 层的某个点射入后, 设所述光束照射到第一层的点为 A点, 设所述光束照射到 第二层的点为 B点, 由于第二层在光束照射后电阻会迅速下降到艮小, 所以 B 点电阻降到艮小, 也就相当于第一层的 A点通过第二层电阻 ^艮小的 B点与第三 层导通。 此时, 所述接收端装置提取所述 A点的电阻值, 即提取所述光束点在 所述光线感应膜上的位置参数, 所述接收端装置并根据所述位置参数计算出所 述光束点的位置坐标, 以使接收端装置根据所述位置坐标对界面元素进行定位。
本发明实施例通过光线感应膜感知发射端装置发射的光束在所述光线感应 膜上所形成的光束点, 并提取所述光线感应膜上的所述光束点的位置参数, 根 据所述位置参数计算出所述光束点的位置坐标, 所述光束点指向要操作的界面 元素, 让用户对界面元素实现精确的遥控操作, 提高了用户的体验。
请参阅图 2,为本发明实施例提供的一种光线遥控定位的方法的第二实施例 的流程示意图。 如图 2所示, 本发明实施例的所述方法包括以下步骤:
5201 , 通过覆盖在接收端装置的显示屏幕上的光线感应膜感知发射端装置 发射的光束在所述光线感应膜上所形成的光束点;
具体的, 所述发射端装置发射的光束可以是可见光, 也可以是不可见光, 还可以是可见光及不可见光的混合光束; 所述发射端装置发射的光束可以是显 示屏不能发射的光语段光线, 如紫外线。
当所述发射端装置发射的光束是可见光时, 所述光线感应膜直接感知光束 照射点的位置, 用户移动遥控器, 投射到所述显示屏幕上的光点及光束也跟着 移动, 用户通过可见光光束所形成的光束点对界面元素进行操作。
当所述发射端装置发射的光束是不可见光时, 所述不可见光具有一定强度, 如红外线或紫外线等等, 所述光线感应膜感知所述不可见光束投射的位置, 所 述接收端装置的显示屏幕立即在所述不可见光束投射的位置处显示一个光标类 似的图案。 用户移动遥控器, 投射到所述显示屏幕上的光点及光束也跟着移动, 同时所述接收端装置随时根据所述光线感应膜感知到的光束点的位置即时刷新 显示屏幕上的光标图案, 用以给用户指示。
当所述发射端装置发射的光束是可见光及不可见光的混合光束时, 可见光 用来给与用户指示, 不可见光用来让所述光线感应膜感知光束照射点的坐标位 置, 用户移动遥控器, 投射到所述显示屏幕上的混合光点及光束也跟着移动, 用户通过混合光束所形成的混合光点对界面元素进行操作。
5202, 提取所述光线感应膜感知的所述光束点在所述光线感应膜上的光束 点横轴电阻值和光束点纵轴电阻值;
具体的, 所述光线感应膜可以包括导电层、 具有均匀阻值的电阻层以及具 有光敏特性的光导层, 即所述光线感应膜可以分为三层, 第一层为均勾阻值的 电阻层, 此层的电阻值是不变的且均匀分布的; 第二层为具有光敏特性的光导 层, 此层特性是没有光束照射时候电阻很大, 一旦有光束照射其阻值迅速下降 到很小; 第三层为电阻值可以忽略的导电层, 此层的电阻很小, 几乎可以忽略 不计; 所述光导层被所述光束照射后将连通所述电阻层与所述导电层, 以使所 述光线感应膜感知所述光束点的位置, 并由所述接收端装置根据所述电阻层的 电阻值提取所述光束点在所述光线感应膜上的位置参数。 例如, 当光束从第一 层的某个点射入后, 设所述光束照射到第一层的点为 A点, 设所述光束照射到 第二层的点为 B点, 由于第二层在光束照射后电阻会迅速下降到艮小, 所以 B 点电阻降到艮小, 也就相当于第一层的 A点通过第二层电阻 ^艮小的 B点与第三 层导通。 此时, 所述接收端装置提取所述 A点的电阻值, 即提取该光束点的光 束点横轴电阻值和光束点纵轴电阻值。
5203 , 预设所述光线感应膜的全屏横轴电阻值、 全屏纵轴电阻值、 光线感 应膜横轴长度以及光线感应膜纵轴长度;
具体的, 所述光线感应膜第一层的四周分别都有一个电极, 所述全屏横轴 电阻值为所述光线感应膜第一层中的左右两侧电极之间的电阻值, 设所述全屏 横轴电阻值为 R1 , 所述全屏纵轴电阻值为所述光线感应膜第一层中的上下两端 电极之间的电阻值, 设所述全屏纵轴电阻值为 R2, 设所述光线感应膜横轴长度 为 X, 设所述光线感应膜纵轴长度为 Y 。
5204 , 计算所述光束点横轴电阻值占所述全屏横轴电阻值的比例再乘以所 述光线感应膜横轴长度得到所述光束点的横坐标;
具体的, 设所述光束点为 P, 在受到光照的时候, 第一层的光照点 P相当于 接到了第三层的电极上, 可由所述接收端装置测量第一层电极与第三层电极之 间 P点的光束点横轴电阻值和光束点纵轴电阻值, 设所述光束点横轴电阻值为 R3 , 计算所述光束点 P的横坐标 Xp, 请参照如下公式 1 :
Xp=X* ( R3/R1 ) 述光线感应膜纵轴长度得到所述光束点的纵坐标; 具体的, 设所述光束点纵轴电阻值为 R4, 计算所述光束点 P的纵坐标 Yp, 请参照如下公式 2:
Yp=Y*(R4/R2)
通过对所述光束点的位置坐标精确地计算, 可以精确地对需要操作的界面 元素进行定位。
5206, 接收所述发射端装置发送的按键动作信息;
具体的, 所述接收端装置可以通过红外或其他无线方式接收所述发射端装 置发送的所述按键动作信息。
5207, 根据所述按键动作信息和所述位置坐标执行所述光束点在所述位置 坐标处的操作;
具体的, 当所述接收端装置接收到所述按键动作信息时, 所述接收端装置 将根据光束点的位置坐标执行所述光束点在所述位置坐标处的界面元素对应的 操作等一系列操作。 例如, 光束的光束点移到一个 HTTP 网页链接, 所述接收 端装置将通过所述光线感应膜获取光束点位置, 当操作按键之后, 将控制电视 等终端执行打开光束点位置所在的网页链接的操作。
本发明实施例通过光线感应膜感知发射端装置发射的光束在所述光线感应 膜上所形成的光束点, 并提取所述光线感应膜上的所述光束点的位置参数, 根 据所述位置参数计算出所述光束点的位置坐标, 所述光束点指向要操作的界面 元素, 让用户对界面元素实现精确的遥控操作, 提高了用户的体验。
请参照图 3 ,图 3是本发明实施例提供的一种光线遥控定位的系统的结构示 意图, 所述光线遥控定位的系统包括: 发射端装置 1、 接收端装置 2。
所述发射端装置 1 ,用于发射光束到所述接收端装置的显示屏幕上形成光束 点, 并且当操作按键时, 发射按键动作信息到所述接收端装置;
具体的, 所述发射端装置 1发射的光束可以是可见光, 也可以是不可见光, 还可以是可见光及不可见光的混合光束; 所述发射端装置发射的光束可以是显 示屏不能发射的光语段光线, 如紫外线。 当用户操作按键时, 所述发射端装置 1 可以通过红外或其他无线方式发送所述按键动作信息到所述接收端装置 2。
所述接收端装置 2,用于通过覆盖在接收端装置的显示屏幕上的光线感应膜 感知发射端装置发射的光束在所述光线感应膜上所形成的光束点, 并提取所述 光线感应膜感知的所述光束点在所述光线感应膜上的位置参数, 并通过所述位 具体的, 当所述发射端装置发射的光束是可见光时, 所述光线感应膜直接 感知光束照射点的位置, 用户移动遥控器, 投射到所述显示屏幕上的光点及光 束也跟着移动, 用户通过可见光光束所形成的光束点对界面元素进行操作。
当所述发射端装置发射的光束是不可见光时, 所述不可见光具有一定强度, 如红外线或紫外线等等, 所述光线感应膜感知所述不可见光束投射的位置, 所 述接收端装置的显示屏幕立即在所述不可见光束投射的位置处显示一个光标类 似的图案。 用户移动遥控器, 投射到所述显示屏幕上的光点及光束也跟着移动, 同时所述接收端装置随时根据所述光线感应膜感知到的光束点的位置即时刷新 显示屏幕上的光标图案, 用以给用户指示。
当所述发射端装置发射的光束是可见光及不可见光的混合光束时, 可见光 用来给与用户指示, 不可见光用来让所述光线感应膜感知光束照射点的坐标位 置, 用户移动遥控器, 投射到所述显示屏幕上的混合光点及光束也跟着移动, 用户通过混合光束所形成的混合光点对界面元素进行操作。
所述光线感应膜可以包括导电层、 具有均匀阻值的电阻层以及具有光敏特 性的光导层, 即所述光线感应膜可以分为三层, 第一层为均勾阻值的电阻层, 此层的电阻值是不变的且均勾分布的; 第二层为具有光敏特性的光导层, 此层 特性是没有光束照射时候电阻很大, 一旦有光束照射其阻值迅速下降到很小; 第三层为电阻值可以忽略的导电层, 此层的电阻很小, 几乎可以忽略不计; 所 述光导层被所述光束照射后将连通所述电阻层与所述导电层, 以使所述光线感 应膜感知所述光束点的位置, 并由所述接收端装置根据所述电阻层的电阻值提 取所述光束点在所述光线感应膜上的位置参数。 例如, 当光束从第一层的某个 点射入后, 设所述光束照射到第一层的点为 A点, 设所述光束照射到第二层的 点为 B点, 由于第二层在光束照射后电阻会迅速下降到艮小, 所以 B点电阻降 到艮小, 也就相当于第一层的 A点通过第二层电阻 ^艮小的 B点与第三层导通。 此时, 所述接收端装置提取所述 A点的电阻值, 即提取所述光束点在所述光线 感应膜上的位置参数, 所述接收端装置并根据所述位置参数计算出所述光束点 的位置坐标, 以使接收端装置根据所述位置坐标对界面元素进行定位。
所述接收端装置 2, 还用于接收所述发射端装置发送的按键动作信息, 并根 据所述按键动作信息和所述位置坐标执行所述光束点在所述位置坐标处的操 作。
由于所述发射端装置 1 是通过发射光线来让所述光线感应膜进行感应的, 而所述显示屏幕同样也会发光。 为了避免这二者之间产生干扰使所述光线感应 膜失效, 可以有以下三种应对措施加以解决:
第一种, 提高所述发射端装置 1 的光线发射强度, 从而使光束照射到所述 光线感应膜之上的时候, 所述光线感应膜有较大的感知度。
第二种, 采用所述显示屏幕不能发射的光语段光线, 如紫外线等等。
第三种, 在所述光线感应膜和所述显示屏幕之间加一层过滤膜, 如同眼睛 片上的紫外线过滤膜一样, 降低所述显示屏幕发出光线对所述光线感应膜的影 响。
下面对图 3中的接收端装置 2进行详细描述。
具体的,请参阅图 4,为本发明实施例提供的一种接收端装置的结构示意图, 所述接收端装置 2包括: 感知模块 10、 提取计算模块 20。
其中, 所述感知模块 10, 用于通过覆盖在接收端装置的显示屏幕上的光线 感应膜感知发射端装置发射的光束在所述光线感应膜上所形成的光束点;
具体的, 所述感知模块 10感知所述发射端装置发射的光束在所述光线感应 膜上所形成的光束点, 所述发射端装置发射的光束可以是可见光, 也可以是不 可见光, 还可以是可见光及不可见光的混合光束; 所述发射端装置发射的光束 可以是显示屏不能发射的光语段光线, 如紫外线。
所述提取计算模块 20, 用于提取所述光线感应膜感知的所述光束点在所述 光线感应膜上的位置参数, 并通过所述位置参数计算出所述光束点的位置坐标; 具体的, 所述提取计算模块 20将提取所述光线感应膜感知的所述光束点在 所述光线感应膜上的位置参数, 所述光线感应膜可以包括导电层、 具有均匀阻 值的电阻层以及具有光敏特性的光导层, 即所述光线感应膜可以分为三层, 第 一层为均匀阻值的电阻层, 此层的电阻值是不变的且均匀分布的; 第二层为具 有光敏特性的光导层, 此层特性是没有光束照射时候电阻很大, 一旦有光束照 射其阻值迅速下降到很小; 第三层为电阻值可以忽略的导电层, 此层的电阻很 小, 几乎可以忽略不计; 所述光导层被所述光束照射后将连通所述电阻层与所 述导电层, 以使所述光线感应膜感知所述光束点的位置, 并由所述接收端装置 根据所述电阻层的电阻值提取所述光束点在所述光线感应膜上的位置参数。 例 如, 当光束从第一层的某个点射入后, 设所述光束照射到第一层的点为 A点, 设所述光束照射到第二层的点为 B点, 由于第二层在光束照射后电阻会迅速下 降到很小, 所以 B点电阻降到很小, 也就相当于第一层的 A点通过第二层电阻 很小的 B点与第三层导通。此时,所述提取计算模块 20提取所述 A点的电阻值 , 即提取所述光束点在所述光线感应膜上的位置参数, 所述提取计算模块 20并根 据所述位置参数计算出所述光束点的位置坐标, 以使接收端装置根据所述位置 坐标对界面元素进行定位。
The receiving-end device further includes a receiving module and an execution module.

The receiving module is configured to receive the key-action information sent by the transmitting-end device. Specifically, the receiving module may receive the key-action information sent by the transmitting-end device via infrared or another wireless channel.

The execution module is configured to perform, according to the key-action information and the position coordinates, the operation of the beam spot at the position coordinates.

Specifically, when the receiving module receives the key-action information, the execution module performs, according to the position coordinates of the beam spot, a series of operations such as the operation corresponding to the interface element at those coordinates. For example, when the beam spot is moved onto an HTTP web link, the receiving-end device obtains the beam-spot position through the light-sensing film; once the key is operated, it controls the terminal, such as a television, to open the web link located at the beam-spot position.
The extraction and calculation module 20 in FIG. 4 is described in detail below.

Specifically, referring to FIG. 5, which is a schematic structural diagram of the extraction and calculation module in FIG. 4, the extraction and calculation module 20 includes an extraction unit 201 and a calculation unit 202.

The extraction unit 201 is configured to extract the beam-spot horizontal-axis resistance value and the beam-spot vertical-axis resistance value of the beam spot sensed by the light-sensing film.

Specifically, the light-sensing film may be divided into the three layers described above: a resistive layer of uniform resistance, a photoconductive layer whose resistance drops to a very small value when illuminated, and a conductive layer of negligible resistance. When the beam strikes a point of the first layer, let the illuminated point on the first layer be point A and the illuminated point on the second layer be point B; because the resistance of the second layer drops rapidly once illuminated, point A is effectively conducted to the third layer through the low-resistance point B. At this moment, the extraction unit 201 extracts the resistance value at point A, that is, the beam-spot horizontal-axis resistance value and the beam-spot vertical-axis resistance value of the beam spot.

The calculation unit 202 is configured to calculate the position coordinates of the beam spot relative to the light-sensing film from the beam-spot horizontal-axis resistance value and the beam-spot vertical-axis resistance value.
The calculation unit 202 in FIG. 5 is described in detail below.

Specifically, referring to FIG. 6, which is a schematic structural diagram of the calculation unit in FIG. 5, the calculation unit 202 includes a preset subunit 2021, a horizontal-coordinate calculation subunit 2022, and a vertical-coordinate calculation subunit 2023.

The preset subunit 2021 is configured to preset the full-screen horizontal-axis resistance value of the light-sensing film, the full-screen vertical-axis resistance value, the horizontal-axis length of the light-sensing film, and the vertical-axis length of the light-sensing film.

Specifically, the preset subunit 2021 presets these four values. The first layer of the light-sensing film has an electrode on each of its four sides. The full-screen horizontal-axis resistance value is the resistance between the left and right electrodes of the first layer, denoted R1; the full-screen vertical-axis resistance value is the resistance between the top and bottom electrodes of the first layer, denoted R2. Let the horizontal-axis length of the light-sensing film be X and its vertical-axis length be Y.
The horizontal-coordinate calculation subunit 2022 is configured to compute the horizontal coordinate of the beam spot by multiplying the ratio of the beam-spot horizontal-axis resistance value to the full-screen horizontal-axis resistance value by the horizontal-axis length of the light-sensing film.

Specifically, let the beam spot be P. When illuminated, the lit point P on the first layer is effectively connected to the electrode of the third layer, so the receiving-end device can measure, between the first-layer electrodes and the third-layer electrode, the beam-spot horizontal-axis resistance value and the beam-spot vertical-axis resistance value at point P. Let the beam-spot horizontal-axis resistance value be R3; the horizontal-coordinate calculation subunit 2022 computes the horizontal coordinate Xp of the beam spot P according to Formula 1 below:

Xp = X × (R3 / R1)    (Formula 1)

The vertical-coordinate calculation subunit 2023 is configured to compute the vertical coordinate of the beam spot by multiplying the ratio of the beam-spot vertical-axis resistance value to the full-screen vertical-axis resistance value by the vertical-axis length of the light-sensing film.

Specifically, let the beam-spot vertical-axis resistance value be R4; the vertical-coordinate calculation subunit 2023 computes the vertical coordinate Yp of the beam spot P according to Formula 2 below:

Yp = Y × (R4 / R2)    (Formula 2)

By calculating the position coordinates of the beam spot accurately, the interface element to be operated can be located precisely.
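Mapped onto code, the division of labour among the preset subunit 2021 and the coordinate calculation subunits 2022 and 2023 might look like the Python sketch below. It simply restates Formulas 1 and 2 in the structure of FIG. 6; the class name, field names, and the numeric values in the usage lines are illustrative assumptions only.

```python
from dataclasses import dataclass

@dataclass
class CalculationUnit:
    """Sketch mirroring calculation unit 202: preset constants (subunit 2021)
    plus the horizontal/vertical coordinate subunits 2022 and 2023."""
    r1: float     # full-screen horizontal-axis resistance value
    r2: float     # full-screen vertical-axis resistance value
    x_len: float  # horizontal-axis length of the light-sensing film
    y_len: float  # vertical-axis length of the light-sensing film

    def x_coord(self, r3: float) -> float:
        # subunit 2022: Xp = X * (R3 / R1)
        return self.x_len * (r3 / self.r1)

    def y_coord(self, r4: float) -> float:
        # subunit 2023: Yp = Y * (R4 / R2)
        return self.y_len * (r4 / self.r2)

# Hypothetical presets and measurements, consistent with the earlier sketch.
unit = CalculationUnit(r1=1000.0, r2=800.0, x_len=1040.0, y_len=585.0)
print(unit.x_coord(250.0), unit.y_coord(400.0))  # -> 260.0 292.5
```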
In the embodiments of the present invention, the light-sensing film senses the beam spot formed on it by the beam emitted from the transmitting-end device, the position parameters of the beam spot on the light-sensing film are extracted, and the position coordinates of the beam spot are calculated from those parameters. Because the beam spot points at the interface element to be operated, the user can remotely operate interface elements with precision, which improves the user experience.

A person of ordinary skill in the art will understand that all or part of the processes of the methods in the above embodiments may be implemented by a computer program instructing the relevant hardware. The program may be stored in a computer-readable storage medium and, when executed, may include the processes of the above method embodiments. The storage medium may be a magnetic disk, an optical disc, a read-only memory (ROM), a random access memory (RAM), or the like.

What is disclosed above is merely preferred embodiments of the present invention and certainly cannot be used to limit the scope of the claims of the present invention; therefore, equivalent changes made according to the claims of the present invention still fall within the scope covered by the present invention.

Claims

1. A light remote-control positioning method, comprising:

sensing, through a light-sensing film covering a display screen of a receiving-end device, a beam spot formed on the light-sensing film by a beam emitted from a transmitting-end device; and

extracting position parameters, on the light-sensing film, of the beam spot sensed by the light-sensing film, and calculating position coordinates of the beam spot from the position parameters.

2. The method according to claim 1, wherein the step of extracting the position parameters, on the light-sensing film, of the beam spot sensed by the light-sensing film and calculating the position coordinates of the beam spot from the position parameters comprises:

extracting a beam-spot horizontal-axis resistance value and a beam-spot vertical-axis resistance value, on the light-sensing film, of the beam spot sensed by the light-sensing film; and

calculating the position coordinates of the beam spot relative to the light-sensing film from the beam-spot horizontal-axis resistance value and the beam-spot vertical-axis resistance value.
3. The method according to claim 2, wherein the step of calculating the position coordinates of the beam spot relative to the display screen from the beam-spot horizontal-axis resistance value and the beam-spot vertical-axis resistance value comprises:

presetting a full-screen horizontal-axis resistance value of the light-sensing film, a full-screen vertical-axis resistance value, a horizontal-axis length of the light-sensing film, and a vertical-axis length of the light-sensing film; computing the horizontal coordinate of the beam spot by multiplying the ratio of the beam-spot horizontal-axis resistance value to the full-screen horizontal-axis resistance value by the horizontal-axis length of the light-sensing film; and computing the vertical coordinate of the beam spot by multiplying the ratio of the beam-spot vertical-axis resistance value to the full-screen vertical-axis resistance value by the vertical-axis length of the light-sensing film.
4. The method according to claim 3, wherein, after the step of extracting the position parameters, on the light-sensing film, of the beam spot sensed by the light-sensing film and calculating the position coordinates of the beam spot from the position parameters, the method further comprises:

receiving key-action information sent by the transmitting-end device; and performing, according to the key-action information and the position coordinates, an operation of the beam spot at the position coordinates.
5. The method according to claim 4, wherein the light-sensing film comprises a conductive layer, a resistive layer of uniform resistance, and a photoconductive layer with photosensitive properties; the photoconductive layer, when illuminated by the beam, connects the resistive layer to the conductive layer, so that the light-sensing film senses the position of the beam spot and the receiving-end device extracts, from the resistance value of the resistive layer, the position parameters of the beam spot on the light-sensing film.
6. The method according to claim 5, wherein, when the beam emitted by the transmitting-end device is visible light, the light-sensing film directly senses the beam spot formed by the visible light on the light-sensing film.

7. The method according to claim 5, wherein, when the beam emitted by the transmitting-end device is invisible light, the display screen generates a cursor pattern at the position where the invisible light is projected onto the light-sensing film.

8. The method according to claim 5, wherein, when the beam emitted by the transmitting-end device is a mixed beam of visible and invisible light, the light-sensing film directly senses the beam spot formed by the mixed beam on the light-sensing film.
9. A light remote-control positioning receiving-end device, comprising:

a sensing module, configured to sense, through a light-sensing film covering a display screen of the receiving-end device, a beam spot formed on the light-sensing film by a beam emitted from a transmitting-end device; and

an extraction and calculation module, configured to extract position parameters, on the light-sensing film, of the beam spot sensed by the light-sensing film, and to calculate position coordinates of the beam spot from the position parameters.

10. The receiving-end device according to claim 9, wherein the extraction and calculation module comprises:

an extraction unit, configured to extract a beam-spot horizontal-axis resistance value and a beam-spot vertical-axis resistance value, on the light-sensing film, of the beam spot sensed by the light-sensing film; and

a calculation unit, configured to calculate the position coordinates of the beam spot relative to the light-sensing film from the beam-spot horizontal-axis resistance value and the beam-spot vertical-axis resistance value.

11. The receiving-end device according to claim 10, wherein the calculation unit comprises:

a preset subunit, configured to preset a full-screen horizontal-axis resistance value of the light-sensing film, a full-screen vertical-axis resistance value, a horizontal-axis length of the light-sensing film, and a vertical-axis length of the light-sensing film;

a horizontal-coordinate calculation subunit, configured to compute the horizontal coordinate of the beam spot by multiplying the ratio of the beam-spot horizontal-axis resistance value to the full-screen horizontal-axis resistance value by the horizontal-axis length of the light-sensing film; and

a vertical-coordinate calculation subunit, configured to compute the vertical coordinate of the beam spot by multiplying the ratio of the beam-spot vertical-axis resistance value to the full-screen vertical-axis resistance value by the vertical-axis length of the light-sensing film.

12. The receiving-end device according to claim 11, further comprising:

a receiving module, configured to receive key-action information sent by the transmitting-end device; and

an execution module, configured to perform, according to the key-action information and the position coordinates, an operation of the beam spot at the position coordinates.
13. The receiving-end device according to claim 12, wherein the light-sensing film comprises a conductive layer, a resistive layer of uniform resistance, and a photoconductive layer with photosensitive properties; the photoconductive layer, when illuminated by the beam, connects the resistive layer to the conductive layer, so that the light-sensing film senses the position of the beam spot and the receiving-end device extracts, from the resistance value of the resistive layer, the position parameters of the beam spot on the light-sensing film.
14. The receiving-end device according to claim 13, wherein, when the beam emitted by the transmitting-end device is visible light, the light-sensing film directly senses the beam spot formed by the visible light on the light-sensing film.

15. The receiving-end device according to claim 13, wherein, when the beam emitted by the transmitting-end device is invisible light, the display screen generates a cursor pattern at the position where the invisible light is projected onto the light-sensing film.

16. The receiving-end device according to claim 13, wherein, when the beam emitted by the transmitting-end device is a mixed beam of visible and invisible light, the light-sensing film directly senses the beam spot formed by the mixed beam on the light-sensing film.

17. A light remote-control positioning system, comprising a transmitting-end device and a receiving-end device, wherein:

the transmitting-end device is configured to emit a beam onto a display screen of the receiving-end device to form a beam spot, and to transmit key-action information to the receiving-end device when a key is operated; and

the receiving-end device comprises the receiving-end device according to any one of claims 9 to 16.
PCT/CN2012/086007 2012-10-15 2012-12-06 Method, device and system for light remote control positioning WO2014059731A1 (zh)

Priority Applications (2)

Application Number Priority Date Filing Date Title
EP12886868.4A EP2908214A4 (en) 2012-10-15 2012-12-06 METHOD, DEVICE AND SYSTEM FOR LIGHT POSITIONING BY REMOTE CONTROL
US14/420,922 US20150222839A1 (en) 2012-10-15 2012-12-06 Method, device and system for light remote control positioning

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
CN201210389018.6A CN102945075B (zh) 2012-10-15 2012-10-15 Method, device and system for light remote control positioning
CN201210389018.6 2012-10-15

Publications (1)

Publication Number Publication Date
WO2014059731A1 true WO2014059731A1 (zh) 2014-04-24

Family

ID=47728025

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CN2012/086007 WO2014059731A1 (zh) 2012-10-15 2012-12-06 Method, device and system for light remote control positioning

Country Status (4)

Country Link
US (1) US20150222839A1 (zh)
EP (1) EP2908214A4 (zh)
CN (1) CN102945075B (zh)
WO (1) WO2014059731A1 (zh)

Families Citing this family (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN104345987B (zh) * 2013-08-09 2018-06-01 联想(北京)有限公司 一种电子设备及信号处理方法
CN104461178B (zh) * 2014-12-26 2017-07-04 京东方科技集团股份有限公司 光触控结构、光触控显示基板及其制备方法

Citations (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2006011569A (ja) * 2004-06-23 2006-01-12 Citizen Seimitsu Co Ltd 光学式タッチパネル
CN101477427A (zh) * 2008-12-17 2009-07-08 卫明 接触或无接触式红外线激光多点触控装置
CN101593067A (zh) * 2008-05-30 2009-12-02 鸿富锦精密工业(深圳)有限公司 采用光信号进行控制的屏幕
CN101751188A (zh) * 2010-01-27 2010-06-23 汕头超声显示器(二厂)有限公司 内置感应的显示装置
CN101989151A (zh) * 2009-07-31 2011-03-23 智点科技(深圳)有限公司 一种光触控屏
CN102033665A (zh) * 2010-12-09 2011-04-27 邯郸市创讯计算机信息工程有限公司 墙面一体式交互电子白板
CN102279656A (zh) * 2010-06-10 2011-12-14 鼎亿数码科技(上海)有限公司 激光平面定位系统及其实现方法

Family Cites Families (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US4761547A (en) * 1985-03-18 1988-08-02 Kabushiki Kaisha Komatsu Seisakusho Semiconductor photoelectric conversion device for light incident position detection
US4794634A (en) * 1985-12-24 1988-12-27 Kabushiki Kaisha Komatsu Seisakusho Position-sensitive photodetector and light transmissive tablet and light-emitting pen
JPH0640023B2 (ja) * 1986-09-25 1994-05-25 株式会社神戸製鋼所 光入力の位置・分散検出方法および装置
JPH0749924B2 (ja) * 1989-09-26 1995-05-31 理化学研究所 多重出力電極型像位置検出器
US20070080940A1 (en) * 2005-10-07 2007-04-12 Sharp Kabushiki Kaisha Remote control system, and display device and electronic device using the remote control system
JP5128835B2 (ja) * 2007-03-20 2013-01-23 株式会社トプコン レーザ光受光位置検出センサ及びこれを用いたレベル装置
CN101436114B (zh) * 2007-11-16 2011-10-19 华硕电脑股份有限公司 触控显示装置及识别多个压触点的方法
AT506617B1 (de) * 2008-02-27 2011-03-15 Isiqiri Interface Tech Gmbh Anzeigefläche und damit kombinierte steuervorrichtung

Patent Citations (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2006011569A (ja) * 2004-06-23 2006-01-12 Citizen Seimitsu Co Ltd 光学式タッチパネル
CN101593067A (zh) * 2008-05-30 2009-12-02 鸿富锦精密工业(深圳)有限公司 采用光信号进行控制的屏幕
CN101477427A (zh) * 2008-12-17 2009-07-08 卫明 接触或无接触式红外线激光多点触控装置
CN101989151A (zh) * 2009-07-31 2011-03-23 智点科技(深圳)有限公司 一种光触控屏
CN101751188A (zh) * 2010-01-27 2010-06-23 汕头超声显示器(二厂)有限公司 内置感应的显示装置
CN102279656A (zh) * 2010-06-10 2011-12-14 鼎亿数码科技(上海)有限公司 激光平面定位系统及其实现方法
CN102033665A (zh) * 2010-12-09 2011-04-27 邯郸市创讯计算机信息工程有限公司 墙面一体式交互电子白板

Also Published As

Publication number Publication date
US20150222839A1 (en) 2015-08-06
EP2908214A4 (en) 2016-07-06
EP2908214A1 (en) 2015-08-19
CN102945075A (zh) 2013-02-27
CN102945075B (zh) 2016-12-21

Similar Documents

Publication Publication Date Title
US9753643B2 (en) Projection screen, remote control terminal, projection device, display device, projection system and remote control method for projection system
CN106973212B (zh) 一种摄像装置及移动终端
CN103049190B (zh) 一种移动通信终端及控制设备操作的方法
US20200142578A1 (en) Portable device and method for controlling brightness of the same
US20130159940A1 (en) Gesture-Controlled Interactive Information Board
CN105786225B (zh) 一种定位触控方法、装置及系统
US10379675B2 (en) Interactive projection apparatus and touch position determining method thereof
JP2012098959A (ja) 表示装置、位置補正方法およびプログラム
CN110780746A (zh) 一种按键结构、按键的控制方法及电子设备
TW201425974A (zh) 姿態感測裝置與方法
JP2013058185A (ja) 解像度が調整可能なマウス
KR20110025520A (ko) 휴대단말기의 제어 장치 및 방법
CN118411814B (zh) 基于投影仪摄像头的类触控遥控方法及系统
WO2014059731A1 (zh) 一种光线遥控定位的方法、装置及系统
CN104270664B (zh) 光笔遥控器、实现智能操作平台输入控制的系统及方法
WO2016034080A1 (zh) 开机模式控制方法和终端
US20130244730A1 (en) User terminal capable of sharing image and method for controlling the same
WO2014048031A1 (zh) 一种光线遥控定位的方法、装置及系统
WO2018065197A1 (en) System and method for positioning cursor on oled display devices
KR101179466B1 (ko) 터치기구의 접근 인지를 이용한 휴대단말 및 그의 객체 표시 방법
KR102373464B1 (ko) 소정의 공간 내의 단말기들을 연결하는 방법 및 이를 이용한 단말기
JP6105434B2 (ja) 画像表示装置及びその操作方法
CN102331868A (zh) 指示系统、控制装置以及控制方法
CN104793813A (zh) 一种显示基板、显示装置及遥控系统
JP2014120168A (ja) テレビ、制御装置及び制御方法

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 12886868

Country of ref document: EP

Kind code of ref document: A1

WWE Wipo information: entry into national phase

Ref document number: 2012886868

Country of ref document: EP

WWE Wipo information: entry into national phase

Ref document number: 14420922

Country of ref document: US

NENP Non-entry into the national phase

Ref country code: DE