CN104898956A - Information processing method, input device and electronic device - Google Patents

Information processing method, input device and electronic device

Info

Publication number
CN104898956A
Authority
CN
China
Prior art keywords
optical signal
electronic device
display unit
image data
input device
Prior art date
Legal status
Granted
Application number
CN201410082152.0A
Other languages
Chinese (zh)
Other versions
CN104898956B (en)
Inventor
彭祎帆
阳光
尚可
Current Assignee
Lenovo Beijing Ltd
Original Assignee
Lenovo Beijing Ltd
Priority date
Filing date
Publication date
Application filed by Lenovo Beijing Ltd
Priority to CN201410082152.0A
Publication of CN104898956A
Application granted
Publication of CN104898956B
Current legal status: Active
Anticipated expiration

Landscapes

  • Position Input By Displaying (AREA)

Abstract

The present invention discloses an information processing method, an input device and an electronic device. The method is applied to an input device that communicates with an electronic device; the electronic device comprises a display unit, and the input device is used to form trace points on the display unit so as to control the electronic device to perform a first operation. The method comprises: transmitting a first optical signal to the electronic device; receiving a second optical signal from the electronic device, the second optical signal being generated by the electronic device according to the first optical signal; and determining, according to the second optical signal, position information of the trace points formed on the display unit by the input device. Because the position information of the trace points is determined purely by transmitting and receiving optical signals between the input device and the electronic device, no additional electronic components need to be attached to the surface of the display unit, which improves the thinness and integration of the display unit.

Description

Information processing method, input device and electronic device
Technical field
The present invention relates to the field of computer technology, and more specifically to an information processing method, an input device and an electronic device.
Background technology
In recent years, electronic devices such as notebook computers, desktop computers, tablet computers (PADs), mobile phones, multimedia players and personal digital assistants (PDAs) have become increasingly common. Such electronic devices usually integrate a touch display unit (for example, a touch screen) that can receive user input so that the device can complete various operations according to the user's needs. In the simplest case, the user performs gesture operations directly by moving a finger on the touch screen. However, finger input is clearly limited in accuracy and controllability, so more and more electronic devices also support pen-based input.
Pen-based input devices, as a relatively mature interaction mode in the computing field, already exist in many commercial products and are widely used in areas such as drawing and design, business presentation and digital entertainment. In the prior art, in order to provide positioning and interaction on the surface of a touch display unit through a pen-based input device, most electronic devices adopt either a contact-type resistive/capacitive touch screen or a contactless electromagnetic-induction touch screen.
A resistive touch screen has a film-on-glass structure in which the facing surfaces of the film and the glass are both coated with indium tin oxide (ITO), which has good conductivity and transparency. During a touch operation, the ITO on the lower surface of the film touches the ITO on the upper surface of the glass; the corresponding electrical signal is picked up by a sensor, passed through a conversion circuit to a processor, and converted by computation into coordinate values on the screen.
Capacitive touch screen technology works by sensing the current of the human body. A capacitive touch screen is a four-layer composite glass panel: the inner surface of the glass screen and the interlayer are each coated with a layer of ITO, the outermost layer is a thin protective layer of silica glass, the interlayer ITO coating serves as the working surface with four electrodes led out at the four corners, and the inner ITO layer is a shielding layer that ensures a good working environment. When a finger touches the metal layer, the capacitance at the contact point changes, which changes the frequency of the oscillator connected to it; the touch position information can then be determined by measuring the frequency change.
The basic principle of an electromagnetic-induction touch screen is that an electromagnetic pen produces magnetic field changes over a detection board located under the touch display unit, from which the user's different input actions are distinguished. The electromagnetic pen is the signal transmitter and the detection board is the signal receiver; when the pen and the board come close enough to produce electromagnetic induction, the magnetic flux between them changes, and the location of the pen on the touch display unit can be determined by computation.
Although the above three touch input modes differ from one another, all of these interaction designs require additional electronic components to be attached outside the surface of the existing display unit. This undoubtedly complicates the structure of the display unit and is unfavorable to making the display unit thinner and more integrated.
Therefore, a novel information processing method, input device and electronic device are needed to solve the above problems.
Summary of the invention
In order to solve the above technical problems, according to one aspect of the present invention, an information processing method is provided. The method is applied to an input device that communicates with an electronic device; the electronic device comprises a display unit, and the input device is used to form trace points on the display unit so as to control the electronic device to perform a first operation. The method comprises: transmitting a first optical signal to the electronic device; receiving a second optical signal from the electronic device, the second optical signal being generated by the electronic device according to the first optical signal; and determining, according to the second optical signal, position information of the trace points formed on the display unit by the input device.
In addition, according to another aspect of the present invention, an information processing method is provided. The method is applied to an electronic device that communicates with an input device; the electronic device comprises a display unit, and the input device is used to form trace points on the display unit so as to control the electronic device to perform a first operation. The method comprises: receiving a first optical signal from the input device; generating a second optical signal according to the first optical signal, the second optical signal enabling the input device to determine position information of the trace points formed on the display unit; and transmitting the second optical signal to the input device.
According to another aspect of the present invention, an input device is provided. The input device communicates with an electronic device; the electronic device comprises a display unit, and the input device is used to form trace points on the display unit so as to control the electronic device to perform a first operation. The input device comprises: a first signal transmission unit for transmitting a first optical signal to the electronic device; a second signal receiving unit for receiving a second optical signal from the electronic device, the second optical signal being generated by the electronic device according to the first optical signal; and a position information determining unit for determining, according to the second optical signal, position information of the trace points formed on the display unit by the input device.
According to another aspect of the present invention, an electronic device is provided. The electronic device communicates with an input device; the electronic device comprises a display unit, and the input device is used to form trace points on the display unit so as to control the electronic device to perform a first operation. The electronic device comprises: a first signal receiving unit for receiving a first optical signal from the input device; a second signal generation unit for generating a second optical signal according to the first optical signal, the second optical signal enabling the input device to determine position information of the trace points formed on the display unit; and a second signal transmission unit for transmitting the second optical signal to the input device.
Compared with the prior art, the information processing method according to the present invention determines the position information of the trace points formed on the display unit by the input device purely through the transmission and reception of optical signals between the input device and the electronic device, without attaching additional electronic components outside the surface of the display unit, thereby improving the thinness and integration of the display unit.
Other features and advantages of the present invention will be set forth in the following description and will in part become apparent from the description or be understood by practicing the invention. The objects and other advantages of the present invention can be realized and obtained by the structures particularly pointed out in the description, the claims and the accompanying drawings.
Brief description of the drawings
The accompanying drawings are provided for a further understanding of the present invention and constitute a part of the description; together with the embodiments of the present invention they serve to explain the invention and are not to be construed as limiting it. In the drawings:
Fig. 1 illustrates an information processing method according to the present invention.
Fig. 2 illustrates an information processing method according to the present invention.
Fig. 3 illustrates an information processing method according to a first embodiment of the present invention.
Fig. 4A illustrates a side view of an interactive system according to a concrete example of an embodiment of the present invention.
Fig. 4B illustrates a front view of the interactive system according to the concrete example of the embodiment of the present invention.
Fig. 5 illustrates an information processing method according to a second embodiment of the present invention.
Fig. 6 illustrates an information processing method according to a third embodiment of the present invention.
Fig. 7 illustrates an information processing method according to a fourth embodiment of the present invention.
Fig. 8 illustrates an input device according to the present invention.
Fig. 9 illustrates an electronic device according to the present invention.
Fig. 10 illustrates an interactive system according to a first embodiment of the present invention.
Fig. 11 illustrates an interactive system according to a second embodiment of the present invention.
Fig. 12 illustrates an interactive system according to a third embodiment of the present invention.
Fig. 13 illustrates an interactive system according to a fourth embodiment of the present invention.
Detailed description of the embodiments
Each embodiment of the present invention will be described in detail below with reference to the accompanying drawings. Note that, in the drawings, components having substantially the same or similar structures and functions are given the same reference numerals, and repeated descriptions of them are omitted.
First, an application scenario of the information processing method according to the present invention will be described.
The information processing method according to the present invention can be applied to an interactive system, which may comprise an electronic device and an input device. The electronic device communicates with the input device.
The electronic device comprises a display unit for showing a graphical user interface to the user, and communicates with the input device in order to detect the interaction gestures the user performs through the input device. The electronic device can then perform a corresponding operation according to the input trajectory formed by the user's touch input.
For example, the electronic device may be a portable electronic device such as a personal computer, tablet computer, mobile phone, digital camera, personal digital assistant, portable computer or game console, or a large stationary electronic device such as an information kiosk or an automatic teller machine (ATM).
The input device is used to form an interaction gesture on the display unit of the electronic device according to the user's touch action, so as to control the electronic device to perform a first operation.
The interaction gesture may be a series of input trajectories comprising one or more trace points. For example, if the gesture is a single click, it contains only one trace point; if the gesture is a double click, it may contain one trace point or two trace points very close together; if the gesture is a drag or a flick, it may contain a string of consecutive trace points; and if the gesture is a pinch or a spread, it may contain two strings of consecutive trace points. A simple classification along these lines is sketched below.
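Purely as an illustrative sketch of this classification (the stroke representation, the helper name classify_gesture and the distance thresholds are assumptions introduced for illustration, not taken from this disclosure), the trace points of a captured gesture could be sorted into the categories listed above as follows:

    from math import hypot

    CLICK_RADIUS = 5       # assumed: max spread (in pixels) of a single click
    DOUBLE_CLICK_GAP = 20  # assumed: max distance between the two taps of a double click

    def classify_gesture(strokes):
        """strokes: list of strokes, each stroke a list of (x, y) trace points."""
        if len(strokes) == 2:
            # two strings of consecutive trace points -> pinch or spread
            return "pinch/spread"
        points = strokes[0]
        if len(points) == 1:
            return "click"
        spread = max(hypot(x - points[0][0], y - points[0][1]) for x, y in points)
        if spread <= CLICK_RADIUS:
            return "click"
        if len(points) == 2 and spread <= DOUBLE_CLICK_GAP:
            return "double click"
        return "drag/flick"

    print(classify_gesture([[(100, 100)]]))                 # click
    print(classify_gesture([[(0, 0), (40, 5), (80, 10)]]))  # drag/flick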
For example, the input device may be a pen-type input device such as a stylus or a pointing pen.
In one embodiment, the information processing method may sense, as the input trajectory, the track formed from the moment the input device approaches the electronic device to the moment the input device leaves the electronic device. For example, when Chinese characters are input, such an input trajectory may correspond to a single continuous stroke of a character.
In another embodiment, the information processing method first senses, as a first input trajectory, the track formed from the moment the input device approaches the electronic device to the moment it leaves. After sensing that the input device has left the electronic device, the method further senses whether the input device approaches the electronic device again within a predetermined time. If it does, the method continues to sense the track formed from this renewed approach until the input device leaves again, as a second input trajectory, and so on. The first input trajectory, the second input trajectory and so forth jointly constitute the input trajectory. For example, when Chinese characters are input, such an input trajectory may correspond to a whole character or even several characters; a sketch of this time-window grouping is given below.
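A minimal sketch of this grouping, assuming a hypothetical sense_stroke(timeout) routine supplied by the input device's sensing hardware and an arbitrary value for the predetermined time:

    STROKE_TIMEOUT = 0.8  # assumed value for the "predetermined time", in seconds

    def collect_trajectory(sense_stroke):
        """Group successive strokes into one input trajectory.

        sense_stroke(timeout) is assumed to return the list of trace points recorded
        between the input device approaching and leaving the display unit, or None
        if no new approach is sensed before the timeout expires.
        """
        trajectory = []
        stroke = sense_stroke(timeout=None)            # wait for the first approach
        while stroke is not None:
            trajectory.append(stroke)                  # first / second / ... input trajectory
            stroke = sense_stroke(timeout=STROKE_TIMEOUT)
        return trajectory                              # the strokes jointly form the input trajectory

Under these assumptions, a multi-stroke character arrives as a single trajectory as long as each new stroke begins within STROKE_TIMEOUT of the previous one.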
Fig. 1 illustrates an information processing method according to the present invention.
The information processing method illustrated in Fig. 1 can be applied to the input device of an interactive system. As mentioned above, the interactive system may comprise an electronic device and an input device. The input device communicates with the electronic device, the electronic device comprises a display unit, and the input device is used to form trace points on the display unit so as to control the electronic device to perform a first operation.
As illustrated in Fig. 1, the information processing method comprises:
In step S110, a first optical signal is transmitted to the electronic device.
In step S120, a second optical signal is received from the electronic device, the second optical signal being generated by the electronic device according to the first optical signal.
In step S130, position information of the trace points formed on the display unit by the input device is determined according to the second optical signal.
Fig. 2 illustrates an information processing method according to the present invention.
The information processing method illustrated in Fig. 2 can be applied to the electronic device of an interactive system. As mentioned above, the interactive system may comprise an electronic device and an input device. The electronic device communicates with the input device, the electronic device comprises a display unit, and the input device is used to form trace points on the display unit so as to control the electronic device to perform a first operation.
As illustrated in Fig. 2, the information processing method comprises:
In step S210, a first optical signal is received from the input device.
In step S220, a second optical signal is generated according to the first optical signal, the second optical signal enabling the input device to determine position information of the trace points formed on the display unit.
In step S230, the second optical signal is transmitted to the input device.
It can be seen that, with the information processing method according to the present invention, the position information of the trace points formed on the display unit by the input device can be determined through the transmission and reception of optical signals between the input device and the electronic device, without attaching additional electronic components to the surface of the display unit, thereby improving the thinness and integration of the display unit. A minimal sketch pairing the two sides of this exchange is given below.
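The following sketch only mirrors the message flow of steps S110-S130 and S210-S230; the class and method names, and the dictionary used to stand in for the optical signals, are assumptions made for illustration.

    class ElectronicDevice:
        def on_first_optical_signal(self, excitation):
            # S210: receive the first optical signal (excitation light)
            # S220: generate the second optical signal (response light) that
            #       encodes where the excitation landed on the display unit
            response = {"region": self.locate_region(excitation)}
            # S230: "transmit" the second optical signal back to the input device
            return response

        def locate_region(self, excitation):
            return excitation["target_region"]   # stand-in for the fluorescent film

    class InputDevice:
        def __init__(self, device):
            self.device = device

        def point_at(self, target_region):
            # S110: transmit the first optical signal toward the display unit
            excitation = {"target_region": target_region}
            # S120: receive the second optical signal generated from it
            response = self.device.on_first_optical_signal(excitation)
            # S130: determine the trace-point position from the second optical signal
            return response["region"]

    stylus = InputDevice(ElectronicDevice())
    print(stylus.point_at("lower-left"))   # -> 'lower-left'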
Fig. 3 illustrates an information processing method according to a first embodiment of the present invention.
The information processing method illustrated in Fig. 3 can be applied to an interactive system comprising an electronic device and an input device. In the embodiments of the present invention, the interactive operation between the electronic device and the input device can be realized by a transparent excited-emission display technique, so that the user can use the input device to perform the required operations on the electronic device without contact.
Transparent excited-emission display techniques have long been a research hotspot, and academia has investigated several technologies with market prospects. Transparent display technology based on excited fluorescence is currently well developed.
In the embodiments of the present invention, a transparent material containing fluorescence-excitation particles can be used to prepare a film serving as a positioning patterned film, which is attached to the surface of an existing display unit. Corresponding invisible light (for example, ultraviolet light) or visible light (for example, blue-violet light) is used to excite visible light (or invisible light) of a corresponding band, which is used to locate the user's pointing operation on the display unit. Since the luminescence of such excited fluorescence has been verified in principle (for related patents on fluorescent laser materials, see US6986581B2 and US7090355B2), applying this technology to the field of computing-device interaction design and building a new interaction mode around the market demand for richer interaction has broad market prospects.
Below, the information processing method according to the first embodiment of the present invention will be described with a concrete example of an interactive system.
Fig. 4A illustrates a side view, and Fig. 4B a front view, of the interactive system according to the concrete example of the embodiment of the present invention.
As illustrated in Figs. 4A and 4B, in the concrete example of the embodiment of the present invention, the interactive system 300 comprises an electronic device 200 and an input device 100, the electronic device 200 being a notebook computer and the input device 100 being a stylus.
Specifically, the electronic device 200 comprises at least a display unit 10. The display unit 10 comprises a display screen for showing a graphical user interface to the user.
A fluorescence excitation film 20 is attached to the front of the display screen of the display unit 10; the fluorescence excitation film 20 may be fully transparent or translucent. For example, the transparent fluorescence excitation film 20 may be a transparent film prepared from a material containing a fluorescence-excitation substance. Furthermore, the fluorescence excitation film 20 can receive excitation light having a first wavelength and a first energy from a light source, absorb the received excitation light, and emit response light having a second wavelength and a second energy.
For example, the fluorescence excitation film 20 may be excited by laser. Laser excitation of a fluorescent material is one form of optical excitation, in which one or more laser beams with enough energy cause the fluorescent material to emit light. In other implementations, the optical excitation of the fluorescence excitation film 20 may also be produced by a non-laser light source having enough energy to excite the fluorescent material used in the display unit 10. Examples of non-laser excitation light sources include various light-emitting diodes (LEDs), modulated light sources and other such sources whose emitted wavelength or spectrum can excite the fluorescent material so that higher-energy light is converted into lower-energy light.
Specifically, in an exemplary configuration, the display unit 10 may comprise, from the outside inward, the following layers: a cover glass, the fluorescence excitation film, a color filter, a display element and so on. In this way, the fluorescence excitation film 20 is protected from wear by the cover glass, and its transparency ensures that the light emitted by the display element is not blocked.
The stylus 100 comprises at least a pen body 40, an image acquisition unit 50 and an excitation light source 60.
The pen body 40 supports and fixes the other units in the stylus 100.
The image acquisition unit 50 may be, for example, a camera or an optical sensor, embedded at one end of the stylus 100 or at another position, and is used to acquire the response light emitted by the fluorescence excitation film 20 in response to the excitation of the excitation light source 60.
The excitation light source 60 is used to emit excitation light toward the fluorescence excitation film 20. It may be a single light source or multiple light sources emitting light of one wavelength, or a single light source emitting light of multiple different wavelengths (that is, a single source emitting light with a relatively wide spectrum), or multiple light sources each emitting light of a different wavelength.
In this concrete example, the user can hold the stylus 100 and perform touch actions on the surface of the display unit of the notebook computer 200 to form an interaction gesture, so that the notebook computer performs a first operation according to the interaction gesture.
It should be noted that the present invention is not limited thereto. Embodiments of the present invention can also be applied to any interaction process between one or more other input devices (for example, a laser pointer) and an electronic device (for example, a personal computer, tablet computer, mobile phone, multimedia player, personal digital assistant, ultrabook, smart television, etc.).
As illustrated in Fig. 3, the information processing method comprises:
In step S310, the input device transmits a first optical signal to the electronic device.
When the user wishes to use the input device (in this concrete example, the stylus) 100 to operate the electronic device (in this concrete example, the notebook computer) 200, the user first powers up both devices.
After power-up, the notebook computer 200 can display a graphical user interface to the user through the display unit 10. Although the fluorescence excitation film 20 is integrated at the front of the display unit 10 (that is, on the side near the user), its transparency means that it does not prevent the user from viewing the graphical user interface normally.
Next, the user can operate the notebook computer 200 through the graphical user interface to complete the desired operations. Specifically, the user can hold the stylus 100 and perform a touch action on the surface of the display unit 10 of the notebook computer 200 to form an interaction gesture. Preferably, in embodiments of the present invention, the touch action does not require touching the surface of the display unit 10. Of course, even if the stylus 100 touches the surface of the display unit 10 while the user performs the touch action, the normal operation of the method is not affected.
As mentioned above, the interaction gesture the user performs on the surface of the display unit 10 may be a series of input trajectories comprising one or more trace points. For convenience of description, the interaction gesture is assumed below to be a single-click operation.
For example, when the user wishes to perform a single-click operation on an operable object (for example, an application icon) displayed on the display unit 10 of the notebook computer 200, in order to select or open that object, the user can hold the stylus 100 and move it toward the display unit 10 with the part of the stylus 100 integrating the excitation light source 60 aimed at the object.
While the stylus 100 moves toward the surface of the display unit 10, the stylus 100 can transmit, through the excitation light source 60, a first optical signal to the display unit 10 of the notebook computer 200 at the position of the operable object. The first optical signal has a first wavelength and a first energy.
As illustrated in Fig. 4A, the excitation light source 60 in the stylus 100 may be two surface-mounted light-emitting diodes (LEDs) attached to the tip of the stylus 100 so that the user can aim easily. The first optical signal emitted by the excitation light source 60 may be visible light or invisible light. For example, the visible light may have a wavelength range of 400-700 nanometers (nm), and the invisible light may be ultraviolet light with a wavelength range of 10-400 nm or infrared light with a wavelength range of 770-2000 nm.
However, the present invention is not limited thereto. As mentioned above, the excitation light source 60 may be a single light source or multiple light sources emitting light of one wavelength, or a single light source emitting light of multiple different wavelengths, or multiple light sources each emitting light of a different wavelength.
In step S320, the electronic device generates a second optical signal according to the first optical signal.
Correspondingly, in the notebook computer 200, the display unit 10 receives the first optical signal emitted by the excitation light source 60. Specifically, before the first optical signal reaches the display element located at the rear of the display unit 10 (that is, on the side away from the user), the fluorescence excitation film 20 absorbs the first optical signal as excitation light and then emits a second optical signal as response light. The second optical signal has a second wavelength and a second energy.
In one example, the excitation light emitted by the excitation light source 60 may be ultraviolet light, preferably with a wavelength of 365 nm. In this case a down-conversion phenomenon occurs: for example, the fluorescence excitation film 20 emits visible light as response light in response to the ultraviolet light.
Specifically, compared with visible light, ultraviolet light has a shorter wavelength and higher energy. Accordingly, when the luminescent substance (that is, the fluorescence excitation film 20) absorbs ultraviolet light and emits lower-energy visible light, the ultraviolet light is down-converted into visible light, because the photon energy decreases when ultraviolet light is converted into visible light.
In another example, the excitation light emitted by the excitation light source 60 may be infrared light, preferably with a wavelength of 880 nm. In this case an up-conversion phenomenon occurs: for example, the fluorescence excitation film 20 emits visible light as response light in response to the infrared light.
Specifically, compared with visible light, infrared light has a longer wavelength and lower energy. Accordingly, when the luminescent substance (that is, the fluorescence excitation film 20) absorbs infrared light and emits higher-energy visible light, the infrared light is up-converted into visible light, because the photon energy rises when infrared light is converted into visible light. Because the energy rises, more than one infrared photon is needed to emit one visible photon in the up-conversion process, as the photon-energy comparison below illustrates.
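The energy relation behind the down- and up-conversion examples can be checked with the photon energy formula E = hc/lambda. The short calculation below uses the 365 nm and 880 nm excitation wavelengths named above, with 550 nm assumed here as a representative visible response wavelength.

    # Photon energy E = h*c / wavelength, expressed in electron-volts
    H = 6.626e-34   # Planck constant, J*s
    C = 2.998e8     # speed of light, m/s
    EV = 1.602e-19  # joules per electron-volt

    def photon_energy_ev(wavelength_nm):
        return H * C / (wavelength_nm * 1e-9) / EV

    print(photon_energy_ev(365))   # UV excitation  ~3.40 eV (down-conversion source)
    print(photon_energy_ev(550))   # green response ~2.25 eV (lower energy than UV)
    print(photon_energy_ev(880))   # IR excitation  ~1.41 eV (up-conversion needs >1 photon)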
Although the conversion from excitation light to response light has been described through two examples above, the present invention is not limited thereto. For example, besides invisible light, the excitation light emitted by the excitation light source 60 may also be visible light, for example a blue-violet band of light without obvious visibility, preferably with a wavelength of 405 nm. In addition, besides visible light, the response light emitted by the excited fluorescence excitation film 20 may also be invisible light (for example, infrared light or ultraviolet light).
While the fluorescence excitation film 20 emits the response light (the second optical signal), in order to obtain the position coordinates at which the stylus 100 (that is, the excitation light source 60) points on the display unit 10 (that is, the fluorescence excitation film 20), position information of the corresponding trace point is added to the response light that the fluorescence excitation film 20 emits in response to the excitation light, so that the stylus 100 can determine the position information of the trace point formed on the display unit 10.
To this end, according to the required recognition precision, the fluorescence excitation material (that is, the fluorescence excitation film 20) can be divided into multiple position regions, and the triggered position region can transmit feedback information to the stylus 100. Because the fluorescence excitation film 20 covers at least the whole area of the display unit 10, obtaining the position information of the stylus 100 on the fluorescence excitation film 20 also yields the coordinate information of the stylus 100 on the display unit 10.
In a first example, the fluorescence excitation film 20 can be designed so that the multiple position regions it contains emit response light of different wavelengths (that is, colors) under the same excitation light emitted by the stylus 100.
Specifically, when the recognition precision of the whole display unit is 4 (for example, the display unit is simply divided into upper-left, lower-left, upper-right and lower-right parts), the upper-left part of the fluorescence excitation film 20 can be made to emit red light under the excitation of the ultraviolet light emitted by the excitation light source 60; the upper-right part to emit yellow light; the lower-left part to emit blue light; and the lower-right part to emit green light.
In this way, for example, when the image acquisition unit 50 (for example, an optical sensor) installed in the stylus 100 detects that the response light fed back by the fluorescence excitation film 20 is blue, the stylus 100 can determine that the position currently pointed to by the user is the lower-left part of the display unit.
Above, a recognition precision of 4 is used only for illustration. Obviously, for a higher recognition precision (say N), the fluorescence excitation film 20 only needs to be divided correspondingly into N parts, the N parts made to emit light of N different wavelengths, and the image acquisition unit 50 guaranteed to recognize these N kinds of light accurately. A lookup along these lines is sketched below.
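As an illustrative sketch of the wavelength-based scheme only, assuming the quadrant-to-color assignment described above and typical wavelength bands for the four colors (the exact band boundaries are assumptions, not values from this disclosure):

    # Assumed mapping from the peak wavelength of the detected response light (nm)
    # to the quadrant of the display unit that emitted it.
    WAVELENGTH_TO_REGION = [
        ((620, 700), "upper-left"),   # red
        ((570, 590), "upper-right"),  # yellow
        ((450, 495), "lower-left"),   # blue
        ((495, 570), "lower-right"),  # green
    ]

    def region_from_wavelength(peak_nm):
        for (low, high), region in WAVELENGTH_TO_REGION:
            if low <= peak_nm < high:
                return region
        return None  # response light not recognized

    print(region_from_wavelength(470))  # -> 'lower-left'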
In a second example, the fluorescence excitation film 20 can be designed so that the multiple position regions it contains emit response light forming different positioning pattern distributions under the same excitation light emitted by the stylus 100. For example, the positioning patterns may be as shown in the upper part of Fig. 4B.
For example, in the process of forming the fluorescence excitation film 20, a patterning process can be applied to it so that, under the excitation light, each position region emits a different coded data image (for example, a one-dimensional barcode, a two-dimensional barcode, etc.) or positioning image.
The description below again takes the case where the recognition precision of the whole display unit is 4 (for example, the display unit is simply divided into upper-left, lower-left, upper-right and lower-right parts).
Specifically, when barcodes are used, two-bit coded data can simply identify the four parts of the fluorescence excitation film 20. For example, the coded data 00 can identify the upper-left part of the fluorescence excitation film 20; 01 the upper-right part; 10 the lower-left part; and 11 the lower-right part. Further, when the fluorescence excitation film 20 is formed, its four position regions are made to emit, under the excitation light, one-dimensional or two-dimensional barcodes containing these four pieces of coded data respectively.
In this way, for example, when the image acquisition unit 50 (for example, an optical sensor) installed in the stylus 100 detects that the response light fed back by the fluorescence excitation film 20 forms a barcode image containing the coded data 10, the stylus 100 can determine that the position currently pointed to by the user is the lower-left part of the display unit.
In addition, specifically, when positioning images are used, four different positioning images can simply identify the four parts of the fluorescence excitation film 20. For example, a first positioning image can identify the upper-left part of the fluorescence excitation film 20; a second positioning image the upper-right part; a third positioning image the lower-left part; and a fourth positioning image the lower-right part. Further, when the fluorescence excitation film 20 is formed, its four position regions are made to emit these four positioning images respectively under the excitation light.
In this way, for example, when the image acquisition unit 50 (for example, an optical sensor) installed in the stylus 100 detects that the response light fed back by the fluorescence excitation film 20 forms the third positioning image, the stylus 100 can determine that the position currently pointed to by the user is the lower-left part of the display unit.
Similarly, a recognition precision of 4 is used above only for illustration. Obviously, for a higher recognition precision (say N), the fluorescence excitation film 20 only needs to be divided correspondingly into N parts, the N parts made to emit N different positioning patterns, and the image acquisition unit 50 guaranteed to recognize these N positioning patterns accurately. A sketch of the two-bit coded-data variant is given below.
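The coded-data variant reduces to a table lookup once the barcode has been decoded; the sketch below assumes the two-bit assignment from the text, and how the bits are physically carried in a barcode symbology is outside its scope. With N regions, the table simply grows to ceil(log2 N) bits per code.

    # Assumed two-bit coded data -> quadrant of the display unit (see the text above).
    CODE_TO_REGION = {
        "00": "upper-left",
        "01": "upper-right",
        "10": "lower-left",
        "11": "lower-right",
    }

    def region_from_code(bits):
        """bits: the two-bit string decoded from the barcode in the response light."""
        return CODE_TO_REGION.get(bits)  # None if the barcode could not be decoded

    print(region_from_code("10"))  # -> 'lower-left'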
In addition, preferably, a dedicated fluorescence excitation film 20 can be customized for each notebook computer 200, so that the fluorescence excitation film 20 of a first notebook computer 200 produces response light only under the excitation light emitted by a first stylus 100 dedicated to that first notebook computer 200. In this way, the fluorescence excitation film 20 is prevented from being disturbed by ambient light or by excitation light sources dedicated to other electronic devices (for example, a second notebook computer 200), which would confuse the user's operation of the notebook computer 200.
In step S330, the electronic device transmits the second optical signal to the input device.
After the display unit 10 generates, in response to the excitation of the stylus 100, the response light containing the position information of the trace point, the display unit 10 can send the response light to the stylus 100 along the expected path.
Specifically, after the transparent fluorescence excitation film 20 attached to the front of the display unit 10 is illuminated by the excitation light source 60 of the stylus 100, it emits, as response light, the pre-designed light of a specific wavelength or the pre-designed positioning pattern, which serves as the position information fed back to the stylus 100.
Correspondingly, the stylus 100 can receive the response light from the electronic device through the image acquisition unit 50, so as to later determine, according to the response light, the position information of the trace point formed on the display unit 10 by the stylus 100.
Specifically, the image acquisition unit 50 captures the light of the specific wavelength or the positioning pattern lit up on the surface of the display unit 10.
For example, when the response light is visible light, the image acquisition unit 50 may be a visible-light camera or a visible-light optical sensor. In this case, so that the user's normal viewing of the graphical user interface on the display unit 10 is not disturbed by the response light, the area occupied by the response light can be made very small; when the user performs touch control with the stylus 100, the tip of the stylus 100 then blocks the response light from entering the user's eyes. In addition, to prevent the small area of the response light from causing errors in the subsequent recognition operation, a high-magnification lens can preferably be added in front of the image acquisition unit 50.
When the response light is invisible light, the image acquisition unit 50 may be an invisible-light camera, an invisible-light optical sensor or the like.
In step S340, the input device obtains, according to the second optical signal, the position information of the trace point formed on the display unit by the input device.
In the first example, in which the position information is determined through response light of different wavelengths, the image acquisition unit 50 can recognize the wavelength, or color, of the response light. After the wavelength of the response light is recognized, a mapping table between wavelengths and position coordinates can be obtained from the memory of the stylus 100, and the position coordinates on the display unit 10 corresponding to that specific wavelength are then determined by querying this mapping table.
As mentioned above, when the image acquisition unit 50 (for example, an optical sensor) installed in the stylus 100 detects that the response light fed back by the fluorescence excitation film 20 is red, the stylus 100 can determine that the position currently pointed to by the user is the upper-left part of the display unit 10.
In the second example, in which the position information is determined through response light carrying a positioning pattern distribution, the image acquisition unit 50 can recognize and extract the positioning pattern distribution to obtain the position information of the trace point.
For example, when the positioning pattern distribution is a coded data image (for example, a barcode image), the image acquisition unit 50 can extract the coded data from the second optical signal serving as the response light, obtain the coordinate information of the trace point by decoding the coded data, and determine the coordinate information as the position information.
Specifically, after capturing a picture of the barcode image formed by the response light, the image acquisition unit 50 extracts the barcode image from the picture, decodes the barcode to determine the coded data it indicates, and then determines the position coordinates on the display unit 10 corresponding to that coded data by querying a mapping table between coded data and positions.
As mentioned above, when the image acquisition unit 50 (for example, an optical sensor) installed in the stylus 100 detects that the response light fed back by the fluorescence excitation film 20 forms a barcode image containing the coded data 00, the stylus 100 can determine that the position currently pointed to by the user is the upper-left part of the display unit.
As another example, when the positioning pattern distribution is a positioning image, the image acquisition unit 50 can extract first image data from the second optical signal serving as the response light and obtain an overview image model of the display unit, the overview image model comprising multiple pieces of image data that respectively indicate the coordinate information of multiple location points on the display unit. The image acquisition unit 50 can then compare the first image data with the overview image model; when the first image data matches a part of the overview image model, the coordinate information of the location point associated with the matching part is determined as the coordinate information of the trace point, and that coordinate information is determined as the position information.
Specifically, after capturing a picture of the positioning image formed by the response light, the image acquisition unit 50 extracts the positioning image from the picture and then extracts a pre-built overview image model from a storage unit (not shown) of the stylus 100. The overview image model may be a 2D feature model reconstructed in advance from the specific fluorescence excitation film 20 by a two-dimensional (2D) reconstruction technique. Because 2D reconstruction has high precision, a very high recognition rate and positioning precision can be obtained in the subsequent fine positioning process. For example, the overview image model may comprise a mapping table between positioning images and position coordinates.
The extracted positioning image is then compared with each model image in the overview model to find whether there is a match. If exactly one match exists, the coordinate information corresponding to the matching model image is determined as the position coordinates of the stylus 100 on the display unit 10. Alternatively, if there are multiple matches, the matching degrees of the multiple matching results are quantified, and the coordinate information corresponding to the model image with the highest matching value is determined as the position coordinates of the stylus 100 on the display unit 10.
As mentioned above, when the image acquisition unit 50 (for example, an optical sensor) installed in the stylus 100 detects that the response light fed back by the fluorescence excitation film 20 forms the first positioning image, the stylus 100 can determine that the position currently pointed to by the user is the upper-left part of the display unit. A sketch of this matching step is given below.
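Purely as an illustration of the matching step (the similarity measure, the threshold and the model layout are assumptions, not prescribed here), the captured positioning image can be compared against every model image and the best-scoring location kept:

    import numpy as np

    def best_match(captured, overview_model, min_score=0.6):
        """captured: 2D numpy array of the positioning image extracted from the response light.
        overview_model: list of (model_image, (x, y)) pairs built in advance for the film;
        model images are assumed to have the same shape as the captured image.
        Returns the (x, y) coordinates of the best-matching model image, or None.
        """
        best_xy, best_score = None, min_score
        for model_image, xy in overview_model:
            a = (captured - captured.mean()).ravel()
            b = (model_image - model_image.mean()).ravel()
            denom = np.linalg.norm(a) * np.linalg.norm(b)
            score = float(a @ b / denom) if denom else 0.0   # normalized cross-correlation
            if score > best_score:                           # keep the highest matching value
                best_xy, best_score = xy, score
        return best_xy

    pattern = np.eye(8)
    model = [(pattern, (0, 0)), (np.flipud(pattern), (1, 0))]
    print(best_match(pattern + 0.05, model))   # -> (0, 0)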
In step S350, the input device transmits the position information of the trace point to the electronic device.
Preferably, after the stylus 100 determines the position coordinates on the display unit 10 of the input trajectory of the user's interaction gesture (that is, of the excitation light source 60), the stylus 100 further sends these position coordinates to the notebook computer 200 in a wireless or wired manner.
For example, the stylus 100 may transmit the position coordinate information via a wireless local area network (for example, Wi-Fi), radio-frequency identification (RFID) or near-field communication (NFC) technology.
Alternatively, the stylus 100 may be connected to the notebook computer 200 through a dedicated data cable and transmit the position coordinate information over that cable using a protocol such as Ethernet or Universal Serial Bus (USB).
Correspondingly, the notebook computer 200 can receive the position coordinates using the communication technology corresponding to the stylus 100. A minimal sketch of such a coordinate report is given below.
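Purely for illustration, trace-point coordinates could be reported over a local network as a small message like the one below; the message format, host address and port are assumptions made for this example, since the embodiment only names the transport technologies (Wi-Fi, RFID, NFC, Ethernet, USB).

    import json
    import socket

    def report_trace_point(x, y, host="192.168.1.50", port=9500):
        """Send one trace-point coordinate pair to the electronic device."""
        message = json.dumps({"type": "trace_point", "x": x, "y": y}).encode()
        with socket.create_connection((host, port), timeout=1.0) as conn:
            conn.sendall(message)

    # Example: report the coordinates determined in step S340.
    # report_trace_point(120, 480)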
In step S360, the electronic device performs the first operation.
After the notebook computer 200 receives the position coordinates of the input trajectory from the stylus 100, the notebook computer 200 can perform a corresponding operation according to this one position coordinate or this series of position coordinates, for example selecting a document icon, opening an application, playing a multimedia file, or writing characters according to the user's handwriting.
It can be seen that the information processing method according to the first embodiment of the present invention provides a pen-based interaction device and method with positioning integrated into the display screen. It alleviates problems of current computing-device interaction hardware, such as complicated hardware structure, limited positioning precision and the inability to perform display and positioning synchronously, and therefore has great practical value.
In the first embodiment of the present invention, the fluorescence excitation material can be integrated on the surface of the display unit, and its transparency and thinness do not affect the display effect of the display unit. The excitation light source at the front end of the pen-based interaction device excites a local positioning pattern on the fluorescence excitation material; the pattern is captured by the image acquisition device, and the instantaneous position coordinates of the pen-based interaction device are then calculated. Contactless, fast positioning and interaction can therefore be realized, providing the user with a more vivid and novel interaction mode.
In addition, the pen-based interaction method with display-integrated positioning of the present invention is simple and effective, has low hardware requirements, can be configured flexibly, and to a large extent integrates the display and the positioning/interaction modules; it therefore has great practicality and a broad range of applications, and is of great practical significance.
In the first embodiment of the present invention, by distinguishing wavelengths and/or positioning patterns, the position coordinates at which the input device 100 (that is, the excitation light source 60) points on the display unit 10 (that is, the fluorescence excitation film 20) of the electronic device 200 can be determined accurately.
However, in practical operation, in order to prevent the user from causing the electronic device to misoperate by carelessly moving the input device at a position far away from the display unit 10, it is preferable in the second embodiment of the present invention that the input device determines the position information of the input trajectory only after receiving a second optical signal exceeding a first predetermined intensity.
Fig. 5 illustrates an information processing method according to a second embodiment of the present invention.
The information processing method illustrated in Fig. 5 can be applied to an interactive system, which may comprise an electronic device and an input device. The information processing method according to the second embodiment of the present invention is described below with continued reference to the concrete example of the interactive system illustrated in Figs. 4A and 4B.
As illustrated in Fig. 5, the information processing method comprises:
In step S410, the input device transmits a first optical signal to the electronic device.
In step S420, the electronic device generates a second optical signal according to the first optical signal.
In step S430, the electronic device transmits the second optical signal to the input device.
Steps S410 to S430 in Fig. 5 are respectively identical to steps S310 to S330 in Fig. 3, so their repeated description is omitted.
In step S440, the input device judges whether the intensity of the second optical signal exceeds a first predetermined intensity.
After the image acquisition unit 50 of the stylus 100 receives the second optical signal (the response light) emitted by the fluorescence excitation film 20, the stylus 100 first detects whether the intensity of the response light exceeds the first predetermined intensity.
For example, the first predetermined intensity may be set to the intensity of the response light emitted by the fluorescence excitation film 20 that the image acquisition unit 50 receives when the stylus 100 touches the outer surface (that is, the display screen) of the display unit 10. Alternatively, the predetermined intensity may be set to the intensity of the response light emitted by the fluorescence excitation film 20 that the image acquisition unit 50 receives when the stylus 100 is less than a certain distance (for example, 5 millimeters (mm)) from the outer surface of the display unit 10.
Alternatively, because in the notebook computer 200 the intensity of the response light emitted by the fluorescence excitation film 20 is proportional to the intensity of the excitation light it absorbs, the first predetermined intensity may be set directly to the intensity of the response light released by the fluorescence excitation film 20 when it receives the excitation light emitted by the stylus 100 in contact with the display unit 10.
When the image acquisition unit 50 judges that the intensity of the response light is greater than or equal to the first predetermined intensity, the stylus 100 considers that the user wishes to operate the notebook computer 200, and the method proceeds to step S450; when the image acquisition unit 50 judges that the intensity of the response light is less than the first predetermined intensity, the stylus 100 treats this as a misoperation, and the method ends. A sketch of this check is given below.
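The gate in step S440 amounts to a simple threshold comparison; the sketch below is illustrative only, with the threshold value and the sensor interface assumed rather than taken from this disclosure.

    FIRST_PREDETERMINED_INTENSITY = 120.0  # assumed calibration value (arbitrary units)

    def should_process(response_intensity, threshold=FIRST_PREDETERMINED_INTENSITY):
        """Step S440: proceed to position determination only for a strong enough response."""
        return response_intensity >= threshold

    # The stylus is close to (or touching) the display: proceed to S450.
    print(should_process(150.0))   # True
    # The stylus is waved far from the display: treat as misoperation, stop.
    print(should_process(30.0))    # False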
In step S450, the input device obtains, according to the second optical signal, the position information of the trace point formed on the display unit by the input device.
In step S460, the input device transmits the position information of the trace point to the electronic device.
In step S470, the electronic device performs the first operation.
Steps S450 to S470 in Fig. 5 are respectively identical to steps S340 to S360 in Fig. 3, so their repeated description is omitted.
It can be seen that the information processing method according to the second embodiment of the present invention can judge whether an instruction operation of the input device (for example, the stylus) on the electronic device (for example, the notebook computer) is an operation the user intends to perform. Only when the input device is close enough to the display unit of the electronic device is the user considered to wish to manipulate the electronic device with the input device; when the user handles the input device in a meaningless way (for example, twirling the stylus), the input device is relatively far from the electronic device, so the electronic device need not respond to the input. This prevents careless movement of the input device from causing the electronic device to misoperate, saves power in the input device and extends its standby time.
In addition, because the area of the response light emitted by the excited fluorescence excitation film 20 is relatively small, when the stylus is far from the notebook computer 200 the image acquisition unit 50 cannot recognize the response light, which also effectively prevents misoperation.
In the second embodiment of the present invention, by performing the maloperation judging to avoid user to produce at input equipment place.Preferably, in the third embodiment of the present invention, this judgement operation can be advanceed in electronic equipment and perform.Such as, this fluorescence excitation film 20 can be made only after receiving the first optical signalling more than the second predetermined strength, just to launch the second optical signalling.
Fig. 6 illustrates information processing method according to a third embodiment of the present invention.
Information processing method illustrated in Fig. 6 can be applicable to interactive system.Described interactive system can comprise electronic equipment and input equipment.Below, continue, with reference to the interactive system instantiation illustrated in Fig. 4 A and 4B, information processing method is according to a third embodiment of the present invention described.
As illustrated in FIG. 6, described information processing method comprises:
In step S510, input equipment transmits the first optical signalling to electronic equipment.
Step S310 in step S510 and Fig. 3 in Fig. 6 is identical, and therefore, will omit its repeated description.
In step S520, the electronic equipment judges whether the intensity of the first optical signal exceeds a second predetermined intensity.
The display unit 10 of the notebook 200 receives the first optical signal (excitation light) emitted by the excitation light source 60 of the pointer 100. The display unit 10 can then further detect whether the intensity of this excitation light exceeds the second predetermined intensity.
For example, the second predetermined intensity can be set to the light intensity detected by the fluorescence excitation film 20 when the pointer 100 contacts the outer surface (that is, the display screen) of the display unit 10. Alternatively, it can be set to the light intensity detected by the fluorescence excitation film 20 when the pointer 100 is within a certain distance (for example, 5 millimeters (mm)) of the outer surface of the display unit 10.
When the display unit 10 determines that the intensity of the excitation light is greater than or equal to the second predetermined intensity, the notebook 200 concludes that the user wishes to operate it, and the method proceeds to step S530; when the display unit 10 determines that the intensity of the excitation light is less than the second predetermined intensity, the notebook 200 concludes that the pointer 100 is currently being operated inadvertently, and the method ends.
In step S530, the electronic equipment generates a second optical signal according to the first optical signal.
In step S540, the electronic equipment transmits the second optical signal to the input equipment.
In step S550, the input equipment obtains, according to the second optical signal, the position information of the tracing point formed by the input equipment on the display unit.
In step S560, the input equipment transmits the position information of the tracing point to the electronic equipment.
In step S570, the electronic equipment performs the first operation.
Steps S530 to S570 in Fig. 6 are respectively identical to steps S320 to S360 in Fig. 3, and their repeated description is therefore omitted.
As can be seen, with the information processing method according to the third embodiment of the present invention, whether an interaction of the input equipment is an erroneous operation can be detected before the electronic equipment generates a response to it, which saves power in both the input equipment and the electronic equipment and extends the standby time of both.
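A minimal sketch of the notebook-side gating of steps S520 to S530 follows, assuming the excitation intensity is available as a single number and that the threshold has been calibrated to the touch or roughly-5 mm hover condition described above; the function name and the example values are illustrative only.

    def maybe_generate_second_signal(excitation_intensity, second_predetermined_intensity):
        """Steps S520-S530 of the third embodiment, sketched on the notebook side.
        `excitation_intensity` stands for the intensity of the first optical signal
        measured at the fluorescence excitation film."""
        if excitation_intensity < second_predetermined_intensity:
            # The pointer is too far away: treat the input as inadvertent and emit
            # no response light, so the pointer never starts its positioning work.
            return False
        # Otherwise the film is allowed to emit the second optical signal
        # (the down-converted response light carrying the positioning pattern).
        return True

    # Usage sketch with a threshold assumed to correspond to a ~5 mm hover distance.
    assert maybe_generate_second_signal(0.9, second_predetermined_intensity=0.6)
    assert not maybe_generate_second_signal(0.2, second_predetermined_intensity=0.6)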
In the first embodiment of the present invention, the position coordinates at which the input equipment 100 (that is, its excitation light source 60) points within the display unit 10 (that is, the fluorescence excitation film 20) of the electronic equipment 200 are determined by recognizing, in the input equipment 100, the positioning pattern distribution contained in the response light; this requires a large amount of recognition, decoding, model storage and comparison to be performed in the input equipment 100.
In practice, however, the storage space and processing power of the input equipment 100 are often very limited, and the input equipment 100 usually relies on battery power, so its operating time is not long. Therefore, preferably, in the fourth embodiment of the present invention, the step of determining the position information from the positioning pattern distribution is transferred to the electronic equipment.
Fig. 7 illustrates an information processing method according to a fourth embodiment of the present invention.
The information processing method illustrated in Fig. 7 can be applied to an interactive system. The interactive system can comprise an electronic equipment and an input equipment. Below, the information processing method according to the fourth embodiment of the present invention is described with continued reference to the interactive system example illustrated in Figs. 4A and 4B.
As illustrated in Fig. 7, the information processing method comprises:
In step S610, the input equipment transmits a first optical signal to the electronic equipment.
In step S620, the electronic equipment generates a second optical signal according to the first optical signal.
In step S630, the electronic equipment transmits the second optical signal to the input equipment.
Steps S610 to S630 in Fig. 7 are respectively identical to steps S310 to S330 in Fig. 3, and their repeated description is therefore omitted.
In step S640, the input equipment extracts the positioning pattern distribution from the second optical signal.
After the pointer 100 receives, through the image acquisition unit 50, the response light emitted by the transparent fluorescence excitation film 20 of the electronic equipment, the image acquisition unit 50 can identify and extract the positioning pattern distribution (for example, a coded data image or a positioning image) from it.
In step S650, the input equipment transmits the positioning pattern distribution to the electronic equipment.
Because decoding the coded data image and comparing the positioning image consume a large amount of the pointer's processing resources, in the present embodiment the position coordinates are not determined from the positioning pattern distribution directly in the pointer 100; instead, the positioning pattern distribution is sent to the electronic equipment, which has ample processing resources, for processing.
On the notebook 200 side, the positioning pattern distribution transmitted from the pointer 100 is received accordingly in a wired or wireless manner.
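Purely as an illustration of step S640, the following sketch isolates the excited positioning pattern as the bright patch of a captured frame. The threshold value and the bounding-box cropping are assumptions chosen for simplicity, not the extraction method of the disclosure.

    def extract_pattern_region(frame, threshold=0.5):
        """Sketch of step S640: isolate the positioning pattern (the bright
        response-light patch) from a captured frame before sending it on.
        `frame` is a 2-D list of brightness values in [0, 1]."""
        bright = [(r, c) for r, row in enumerate(frame)
                         for c, v in enumerate(row) if v >= threshold]
        if not bright:
            return None  # no response light detected in this frame
        rows = [r for r, _ in bright]
        cols = [c for _, c in bright]
        r0, r1, c0, c1 = min(rows), max(rows), min(cols), max(cols)
        # Crop the bounding box containing the excited pattern.
        return [row[c0:c1 + 1] for row in frame[r0:r1 + 1]]

    frame = [
        [0.0, 0.0, 0.0, 0.0],
        [0.0, 0.9, 0.1, 0.0],
        [0.0, 0.8, 0.9, 0.0],
        [0.0, 0.0, 0.0, 0.0],
    ]
    assert extract_pattern_region(frame) == [[0.9, 0.1], [0.8, 0.9]]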
In step S660, the electronic equipment obtains, according to the positioning pattern distribution, the position information of the tracing point formed by the input equipment on the display unit.
Except for the difference in the executing entity, the operation in step S660 in Fig. 7 is identical to that in step S340 in Fig. 3, and its detailed description is therefore omitted.
Preferably, after the notebook 200 has determined the position coordinates of the interaction gesture performed by the user on the display unit 10 through the pointer 100, the notebook 200 can, when needed, further send these position coordinates back to the pointer 100, so that the pointer can track and accumulate the user's input trajectory.
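As a hedged illustration of how the pointer might accumulate the coordinates sent back by the notebook, the small class below stores the returned points and sums the length of the traced path. The class, its fields and the statistic computed are assumptions for this sketch and are not specified in the disclosure.

    class TrajectoryTracker:
        """Accumulates (x, y) coordinates returned to the pointer so that the
        user's input trajectory can be tracked and summarised."""

        def __init__(self):
            self.points = []          # coordinates received so far

        def add(self, x, y):
            self.points.append((x, y))

        def path_length(self):
            # Total length of the polyline traced by the pointer on the screen.
            return sum(
                ((x1 - x0) ** 2 + (y1 - y0) ** 2) ** 0.5
                for (x0, y0), (x1, y1) in zip(self.points, self.points[1:])
            )

    tracker = TrajectoryTracker()
    for point in [(10, 10), (13, 14), (16, 18)]:   # coordinates sent back by the notebook
        tracker.add(*point)
    assert round(tracker.path_length(), 1) == 10.0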
In step S670, the electronic equipment performs the first operation.
Step S670 in Fig. 7 is identical to step S360 in Fig. 3, and its repeated description is therefore omitted.
As can be seen, with the information processing method according to the fourth embodiment of the present invention, operations that consume large amounts of processing and storage resources can be transferred from the input equipment to the electronic equipment, ensuring that only minimal processing resources are consumed on the input equipment side and extending the standby time of the input equipment.
Fig. 8 illustrates an input equipment according to the present invention.
The information processing method according to the present invention illustrated in Fig. 1 can be implemented by the input equipment 100 illustrated in Fig. 8. The input equipment 100 communicates with an electronic equipment, the electronic equipment comprises a display unit, and the input equipment 100 is used to form tracing points on the display unit so as to control the electronic equipment to perform a first operation. As illustrated in Fig. 8, the input equipment 100 comprises a first signal transmission unit 110, a second signal receiving unit 120 and a position information determining unit 130.
The first signal transmission unit 110 is configured to transmit a first optical signal to the electronic equipment.
The second signal receiving unit 120 is configured to receive a second optical signal from the electronic equipment, the second optical signal being generated by the electronic equipment according to the first optical signal.
The position information determining unit 130 is configured to determine, according to the second optical signal, the position information of the tracing point formed by the input equipment on the display unit.
Fig. 9 illustrates an electronic equipment according to the present invention.
The information processing method according to the present invention illustrated in Fig. 2 can be implemented by the electronic equipment 200 illustrated in Fig. 9. The electronic equipment 200 communicates with an input equipment, the electronic equipment comprises a display unit, and the input equipment is used to form tracing points on the display unit so as to control the electronic equipment 200 to perform a first operation. As illustrated in Fig. 9, the electronic equipment 200 comprises a first signal receiving unit 210, a second signal generation unit 220 and a second signal delivery unit 230.
The first signal receiving unit 210 is configured to receive a first optical signal from the input equipment.
The second signal generation unit 220 is configured to generate a second optical signal according to the first optical signal, the second optical signal enabling the input equipment to determine the position information of the tracing point formed on the display unit.
The second signal delivery unit 230 is configured to transmit the second optical signal to the input equipment.
As can be seen, with the input equipment and the electronic equipment according to the present invention, the position information of the tracing point formed by the input equipment on the display unit can be determined by transmitting and receiving optical signals between the input equipment and the electronic equipment, without attaching additional electronic components to the surface of the display unit, thereby improving the thinness and integration of the display unit.
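To make the division of labour between the units of Figs. 8 and 9 concrete, the sketch below wires both sides into one round trip. The class and method names simply mirror the unit names used above; the dictionary that stands in for the optical signals, and the idea that it carries the aimed-at coordinate directly, are assumptions made only so the sketch is self-contained.

    class ElectronicEquipment:
        """Units 210/220/230 of Fig. 9, reduced to placeholders."""
        def receive_first_signal(self, excitation):
            return excitation                                  # unit 210
        def generate_second_signal(self, excitation):
            # Unit 220: in the real device the fluorescence excitation film
            # down-converts the excitation light into patterned response light.
            return {"pattern_at": excitation["aimed_at"]}
        def deliver_second_signal(self, response):
            return response                                    # unit 230

    class InputEquipment:
        """Units 110/120/130 of Fig. 8, reduced to placeholders."""
        def __init__(self, host):
            self.host = host
        def locate(self, aimed_at):
            excitation = {"aimed_at": aimed_at}                # unit 110
            received = self.host.receive_first_signal(excitation)
            response = self.host.deliver_second_signal(        # unit 120
                self.host.generate_second_signal(received))
            return response["pattern_at"]                      # unit 130

    pen = InputEquipment(ElectronicEquipment())
    assert pen.locate((120, 45)) == (120, 45)   # the tracing-point position comes back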
Figure 10 illustrates an interactive system according to a first embodiment of the present invention.
The information processing method according to the first embodiment of the present invention illustrated in Fig. 3 can be implemented by the interactive system 300 illustrated in Figure 10. As illustrated in Figure 10, the interactive system 300 comprises an input equipment 100 and an electronic equipment 200. The input equipment 100 and the electronic equipment 200 can be connected by a wireless or wired network, and transmit the data related to the positioning operation in an agreed data format.
The input equipment 100 can comprise a first signal transmission unit 110, a second signal receiving unit 120 and a position information determining unit 130. In addition, preferably, the input equipment 100 can further comprise a position information delivery unit 140.
The electronic equipment 200 can comprise a first signal receiving unit 210, a second signal generation unit 220 and a second signal delivery unit 230. In addition, preferably, the electronic equipment 200 can further comprise a position information receiving unit 240.
On the input equipment 100 side, the first signal transmission unit 110 is configured to transmit a first optical signal to the electronic equipment.
On the electronic equipment 200 side, the first signal receiving unit 210 is configured to receive the first optical signal from the input equipment, and the second signal generation unit 220 is configured to generate a second optical signal according to the first optical signal.
In one example, the second signal generation unit 220 can convert the first optical signal into the second optical signal by down-conversion, and include first coded data in the second optical signal, the first coded data indicating the coordinate information of the tracing point.
In another example, the second signal generation unit 220 can convert the first optical signal into the second optical signal by down-conversion, and include first image data in the second optical signal, the first image data indicating the coordinate information of the tracing point.
In either case, the wavelength of the first optical signal is less than the wavelength of the second optical signal, and the energy of the first optical signal is greater than the energy of the second optical signal.
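The wavelength/energy relation stated above follows from the photon energy formula E = hc/λ, so the shorter-wavelength excitation light necessarily carries more energy per photon than the down-converted response light. The specific wavelengths in the check below (405 nm and 532 nm) are illustrative assumptions, not values given in the disclosure.

    # Photon energy E = h * c / wavelength.
    h = 6.626e-34   # Planck constant, J*s
    c = 2.998e8     # speed of light, m/s

    def photon_energy(wavelength_m):
        return h * c / wavelength_m

    e_first = photon_energy(405e-9)    # e.g. violet excitation light (assumed)
    e_second = photon_energy(532e-9)   # e.g. green response light (assumed)
    assert e_first > e_second          # consistent with the relation stated above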
The second signal delivery unit 230 is configured to transmit the second optical signal to the input equipment 100.
On the input equipment 100 side, the second signal receiving unit 120 is configured to receive the second optical signal from the electronic equipment 200, and the position information determining unit 130 is configured to determine, according to the second optical signal, the position information of the tracing point formed by the input equipment 100 on the display unit.
When the second optical signal contains first coded data indicating the coordinate information of the tracing point, the position information determining unit 130 can extract the first coded data from the second optical signal, obtain the coordinate information of the tracing point by decoding the first coded data, and determine the coordinate information as the position information.
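The encoding scheme of the first coded data is not detailed here, so the decoder below is only a sketch under an assumed format: a bit string whose first half encodes the x coordinate and whose second half encodes the y coordinate of the tracing point as unsigned binary integers. The function name and the example code word are hypothetical.

    def decode_first_coded_data(bits):
        """Illustrative decoder for the first coded data under the assumed
        half-and-half bit layout described above."""
        half = len(bits) // 2
        x = int(bits[:half], 2)
        y = int(bits[half:], 2)
        return x, y

    # A 16-bit code hypothetically extracted from the second optical signal.
    assert decode_first_coded_data("0000001101111011") == (3, 123)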
When the second optical signal contains first image data indicating the coordinate information of the tracing point, the position information determining unit 130 can extract the first image data from the second optical signal, obtain an overview image model of the display unit, the overview image model comprising a plurality of image data respectively indicating the coordinate information of a plurality of position points in the display unit, compare the first image data with the overview image model, and, when the first image data matches a part of the overview image model, determine the coordinate information of the position point associated with that matching part as the coordinate information of the tracing point and determine this coordinate information as the position information.
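A minimal sketch of the compare-and-match step follows. It assumes the overview image model can be represented as a mapping from position points to the local pattern patch expected around each point, and it uses exact patch equality as the match test purely for simplicity; a real implementation would tolerate noise, for example by correlation. All names and example data are illustrative.

    def match_against_overview_model(first_image_data, overview_image_model):
        """Return the coordinate information of the position point whose stored
        patch matches the first image data, or None if nothing matches."""
        for position_point, stored_patch in overview_image_model.items():
            if stored_patch == first_image_data:
                return position_point   # coordinate information of the tracing point
        return None

    overview_image_model = {
        (0, 0):   [[1, 0], [0, 1]],
        (64, 32): [[1, 1], [0, 1]],
        (64, 64): [[0, 1], [1, 1]],
    }
    first_image_data = [[1, 1], [0, 1]]
    assert match_against_overview_model(first_image_data, overview_image_model) == (64, 32)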
Then, the position information delivery unit 140 is configured to transmit the position information of the tracing point to the electronic equipment 200, so as to control the electronic equipment to perform the first operation.
Correspondingly, on the electronic equipment 200 side, the position information receiving unit 240 is configured to receive the position information of the tracing point from the input equipment, so as to control the electronic equipment to perform the first operation.
The specific configuration and operation of each unit in the interactive system 300 according to the first embodiment of the present invention have been introduced in detail in the information processing method described above with reference to Fig. 3, and their repeated description is therefore omitted.
As can be seen, with the interactive system according to the first embodiment of the present invention, the fluorescence excitation material can be integrated on the surface of the display unit, and its transparency and thinness do not affect the display effect of the display unit. The local positioning pattern formed on this fluorescence excitation material when excited by the excitation light source at the front end of the pen-based interaction device is then captured by the image acquisition device, and the instantaneous position coordinates of the pen-based interaction device are calculated from it. Contactless, fast positioning and interaction can thus be realized, providing the user with a more novel and engaging interaction mode.
Figure 11 illustrates an interactive system according to a second embodiment of the present invention.
The information processing method according to the second embodiment of the present invention illustrated in Fig. 5 can be implemented by the interactive system 300 illustrated in Figure 11. As illustrated in Figure 11, the interactive system 300 comprises an input equipment 100 and an electronic equipment 200.
The input equipment 100 can comprise a first signal transmission unit 110, a second signal receiving unit 120, a position information determining unit 130 and a position information delivery unit 140. In addition, preferably, the input equipment 100 can further comprise a second light intensity judging unit 150.
The electronic equipment 200 can comprise a first signal receiving unit 210, a second signal generation unit 220, a second signal delivery unit 230 and a position information receiving unit 240.
Before the position information determining unit 130 determines, according to the second optical signal, the position information of the tracing point formed by the input equipment on the display unit, the second light intensity judging unit 150 judges whether the intensity of the second optical signal exceeds the first predetermined intensity. Only when the intensity of the second optical signal exceeds the first predetermined intensity is the position information determining unit 130 notified to perform the determination of the position information.
The specific configuration and operation of each unit in the interactive system 300 according to the second embodiment of the present invention have been introduced in detail in the information processing method described above with reference to Fig. 5, and their repeated description is therefore omitted.
As can be seen, with the interactive system according to the second embodiment of the present invention, it can be judged whether a pointing action of the input equipment (for example, a pointer) toward the electronic equipment (for example, a notebook) is an operation the user intends to perform. Only when the input equipment is close to the display unit of the electronic equipment is it assumed that the user wishes to manipulate the electronic equipment with the input equipment; when the user handles the input equipment in a meaningless way (for example, twirling the pen), the input equipment is relatively far from the electronic equipment, so the electronic equipment need not respond to the input. Erroneous operations of the electronic equipment caused by careless movement of the input equipment are thereby avoided, which saves power in the input equipment and extends its standby time.
Figure 12 illustrates an interactive system according to a third embodiment of the present invention.
The information processing method according to the third embodiment of the present invention illustrated in Fig. 6 can be implemented by the interactive system 300 illustrated in Figure 12. As illustrated in Figure 12, the interactive system 300 comprises an input equipment 100 and an electronic equipment 200.
The input equipment 100 can comprise a first signal transmission unit 110, a second signal receiving unit 120, a position information determining unit 130 and a position information delivery unit 140.
The electronic equipment 200 can comprise a first signal receiving unit 210, a second signal generation unit 220, a second signal delivery unit 230 and a position information receiving unit 240. In addition, preferably, the electronic equipment 200 can further comprise a first light intensity judging unit 250.
Before the second signal generation unit 220 generates the second optical signal according to the first optical signal, the first light intensity judging unit 250 judges whether the intensity of the first optical signal exceeds the second predetermined intensity. Only when the intensity of the first optical signal is judged to exceed the second predetermined intensity is the second signal generation unit 220 notified to generate the second optical signal.
The specific configuration and operation of each unit in the interactive system 300 according to the third embodiment of the present invention have been introduced in detail in the information processing method described above with reference to Fig. 6, and their repeated description is therefore omitted.
As can be seen, with the interactive system according to the third embodiment of the present invention, whether an interaction of the input equipment is an erroneous operation can be detected before the electronic equipment generates a response to it, which saves power in both the input equipment and the electronic equipment and extends the standby time of both.
Figure 13 illustrates an interactive system according to a fourth embodiment of the present invention.
The information processing method according to the fourth embodiment of the present invention illustrated in Fig. 7 can be implemented by the interactive system 300 illustrated in Figure 13. As illustrated in Figure 13, the interactive system 300 comprises an input equipment 100 and an electronic equipment 200.
The input equipment 100 can comprise a first signal transmission unit 110, a second signal receiving unit 120 and a position information determining unit 130.
The electronic equipment 200 can comprise a first signal receiving unit 210, a second signal generation unit 220 and a second signal delivery unit 230. In addition, preferably, the electronic equipment 200 can further comprise a data receiving unit 260, a position information determining unit 270 and a position information delivery unit 280.
On the input equipment 100 side, the first signal transmission unit 110 is configured to transmit a first optical signal to the electronic equipment.
On the electronic equipment 200 side, the first signal receiving unit 210 is configured to receive the first optical signal from the input equipment, and the second signal generation unit 220 is configured to generate a second optical signal according to the first optical signal.
The second signal delivery unit 230 is configured to transmit the second optical signal to the input equipment.
On the input equipment 100 side, the second signal receiving unit 120 is configured to receive the second optical signal from the electronic equipment, and the position information determining unit 130 is configured to determine, according to the second optical signal, the position information of the tracing point formed by the input equipment on the display unit.
When the second optical signal contains first image data indicating the coordinate information of the tracing point, the position information determining unit 130 can extract the first image data from the second optical signal and transmit the first image data to the electronic equipment, so that the electronic equipment determines the position information of the tracing point according to the first image data.
On the electronic equipment 200 side, the data receiving unit 260 can be used to receive the first image data from the input equipment, the first image data indicating the coordinate information of the tracing point.
The position information determining unit 270 can be used to obtain the overview image model of the display unit, the overview image model comprising a plurality of image data respectively indicating the coordinate information of a plurality of position points in the display unit, to compare the first image data with the overview image model, and, when the first image data matches a part of the overview image model, to determine the coordinate information of the position point associated with that matching part as the coordinate information of the tracing point and determine this coordinate information as the position information.
The position information delivery unit 280 can be used to transmit the position information to the input equipment.
Finally, when needed, the position information determining unit 130 receives the position information from the electronic equipment.
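To illustrate how the fourth-embodiment work split between unit 130 on the pointer and units 260/270/280 on the notebook might look, the sketch below abstracts the wired or wireless transport to a plain function call. The function names, the dictionary model and the example patches are assumptions, not the disclosed implementation.

    def pointer_side(first_image_data, send_to_notebook):
        """Unit 130 in the fourth embodiment: hand the extracted first image data
        to the notebook instead of matching it locally, and return the position
        information the notebook sends back."""
        return send_to_notebook(first_image_data)

    def notebook_side(first_image_data, overview_image_model):
        """Units 260/270/280: receive the image data, match it against the
        overview image model, and return the resulting position information."""
        for position_point, stored_patch in overview_image_model.items():
            if stored_patch == first_image_data:
                return position_point
        return None

    model = {(10, 20): [[0, 1], [1, 0]], (30, 40): [[1, 1], [1, 0]]}
    position = pointer_side([[1, 1], [1, 0]],
                            lambda data: notebook_side(data, model))
    assert position == (30, 40)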
Obviously, the above first image data can also be the coded data in the positioning pattern.
The specific configuration and operation of each unit in the interactive system 300 according to the fourth embodiment of the present invention have been introduced in detail in the information processing method described above with reference to Fig. 7, and their repeated description is therefore omitted.
As can be seen, with the interactive system according to the fourth embodiment of the present invention, operations that consume large amounts of processing and storage resources can be transferred from the input equipment to the electronic equipment, ensuring that only minimal processing resources are consumed on the input equipment side and extending the standby time of the input equipment.
In addition, although the above units are described herein as the entities executing the respective steps of the embodiments of the present invention, those skilled in the art will understand that the present invention is not limited thereto. Each step can also be executed by one or more other units, devices or modules.
For example, the steps performed by the first signal transmission unit 110, the second signal receiving unit 120, the position information determining unit 130, the position information delivery unit 140 and the second light intensity judging unit 150 can be uniformly implemented by a central processing unit (CPU) in the input equipment. Likewise, the steps performed by the first signal receiving unit 210, the second signal generation unit 220, the second signal delivery unit 230, the position information receiving unit 240, the first light intensity judging unit 250, the data receiving unit 260, the position information determining unit 270 and the position information delivery unit 280 can be uniformly implemented by a central processing unit (CPU) in the electronic equipment.
From the above description of the embodiments, those skilled in the art can clearly understand that the present invention can be realized by software together with the necessary hardware platform, and can of course also be implemented entirely in software or entirely in hardware. Based on this understanding, the technical solution of the present invention, in whole or in the part that contributes over the background art, can be embodied in the form of a software product. The computer software product can be stored in a storage medium, such as a ROM/RAM, a magnetic disk or an optical disc, and comprises instructions for causing a computer device (which may be a personal computer, a server, a network device, or the like) to perform the methods described in the embodiments, or in certain parts of the embodiments, of the present invention.
The embodiments of the present invention have been described in detail above. However, it should be appreciated by those skilled in the art that various modifications, combinations or sub-combinations can be made to these embodiments without departing from the principles and spirit of the present invention, and such modifications should fall within the scope of the present invention.

Claims (24)

1. An information processing method applied to an input equipment, characterized in that the input equipment communicates with an electronic equipment, the electronic equipment comprises a display unit, and the input equipment is used to form a tracing point on the display unit so as to control the electronic equipment to perform a first operation, the method comprising:
transmitting a first optical signal to the electronic equipment;
receiving a second optical signal from the electronic equipment, the second optical signal being generated by the electronic equipment according to the first optical signal; and
determining, according to the second optical signal, the position information of the tracing point formed by the input equipment on the display unit.
2. The method according to claim 1, characterized in that the second optical signal comprises first coded data indicating the coordinate information of the tracing point, and the step of determining, according to the second optical signal, the position information of the tracing point formed by the input equipment on the display unit comprises:
extracting the first coded data from the second optical signal;
obtaining the coordinate information of the tracing point by decoding the first coded data; and
determining the coordinate information as the position information.
3. The method according to claim 1, characterized in that the second optical signal comprises first image data indicating the coordinate information of the tracing point, and the step of determining, according to the second optical signal, the position information of the tracing point formed by the input equipment on the display unit comprises:
extracting the first image data from the second optical signal;
obtaining an overview image model of the display unit, the overview image model comprising a plurality of image data respectively indicating the coordinate information of a plurality of position points in the display unit;
comparing the first image data with the overview image model;
when the first image data matches a part of the overview image model, determining the coordinate information of the position point associated with the matching part as the coordinate information of the tracing point; and
determining the coordinate information as the position information.
4. The method according to claim 1, characterized in that the second optical signal comprises first image data indicating the coordinate information of the tracing point, and the step of determining, according to the second optical signal, the position information of the tracing point formed by the input equipment on the display unit comprises:
extracting the first image data from the second optical signal;
transmitting the first image data to the electronic equipment, so that the electronic equipment determines the position information of the tracing point according to the first image data; and
receiving the position information from the electronic equipment.
5. The method according to claim 1, characterized in that the wavelength of the first optical signal is less than the wavelength of the second optical signal, and the energy of the first optical signal is greater than the energy of the second optical signal.
6. The method according to claim 1, characterized in that the method further comprises:
transmitting the position information of the tracing point to the electronic equipment, so as to control the electronic equipment to perform the first operation.
7. An information processing method applied to an electronic equipment, characterized in that the electronic equipment communicates with an input equipment, the electronic equipment comprises a display unit, and the input equipment is used to form a tracing point on the display unit so as to control the electronic equipment to perform a first operation, the method comprising:
receiving a first optical signal from the input equipment;
generating a second optical signal according to the first optical signal, the second optical signal enabling the input equipment to determine the position information of the tracing point formed on the display unit; and
transmitting the second optical signal to the input equipment.
8. The method according to claim 7, characterized in that the step of generating the second optical signal according to the first optical signal comprises:
converting the first optical signal into the second optical signal by down-conversion; and
including first coded data in the second optical signal, the first coded data indicating the coordinate information of the tracing point.
9. The method according to claim 7, characterized in that the step of generating the second optical signal according to the first optical signal comprises:
converting the first optical signal into the second optical signal by down-conversion; and
including first image data in the second optical signal, the first image data indicating the coordinate information of the tracing point.
10. The method according to claim 7, characterized in that the method further comprises:
receiving first image data from the input equipment, the first image data indicating the coordinate information of the tracing point;
obtaining an overview image model of the display unit, the overview image model comprising a plurality of image data respectively indicating the coordinate information of a plurality of position points in the display unit;
comparing the first image data with the overview image model;
when the first image data matches a part of the overview image model, determining the coordinate information of the position point associated with the matching part as the coordinate information of the tracing point;
determining the coordinate information as the position information; and
transmitting the position information to the input equipment.
11. The method according to claim 7, characterized in that the wavelength of the first optical signal is less than the wavelength of the second optical signal, and the energy of the first optical signal is greater than the energy of the second optical signal.
12. The method according to claim 7, characterized in that the method further comprises:
receiving the position information of the tracing point from the input equipment, so as to control the electronic equipment to perform the first operation.
13. An input equipment, characterized in that the input equipment communicates with an electronic equipment, the electronic equipment comprises a display unit, and the input equipment is used to form a tracing point on the display unit so as to control the electronic equipment to perform a first operation, the input equipment comprising:
a first signal transmission unit, configured to transmit a first optical signal to the electronic equipment;
a second signal receiving unit, configured to receive a second optical signal from the electronic equipment, the second optical signal being generated by the electronic equipment according to the first optical signal; and
a position information determining unit, configured to determine, according to the second optical signal, the position information of the tracing point formed by the input equipment on the display unit.
14. The input equipment according to claim 13, characterized in that the second optical signal comprises first coded data indicating the coordinate information of the tracing point, and the position information determining unit extracts the first coded data from the second optical signal, obtains the coordinate information of the tracing point by decoding the first coded data, and determines the coordinate information as the position information.
15. The input equipment according to claim 13, characterized in that the second optical signal comprises first image data indicating the coordinate information of the tracing point, and the position information determining unit extracts the first image data from the second optical signal, obtains an overview image model of the display unit, the overview image model comprising a plurality of image data respectively indicating the coordinate information of a plurality of position points in the display unit, compares the first image data with the overview image model, determines, when the first image data matches a part of the overview image model, the coordinate information of the position point associated with the matching part as the coordinate information of the tracing point, and determines the coordinate information as the position information.
16. The input equipment according to claim 13, characterized in that the second optical signal comprises first image data indicating the coordinate information of the tracing point, and the position information determining unit extracts the first image data from the second optical signal, transmits the first image data to the electronic equipment so that the electronic equipment determines the position information of the tracing point according to the first image data, and receives the position information from the electronic equipment.
17. The input equipment according to claim 13, characterized in that the wavelength of the first optical signal is less than the wavelength of the second optical signal, and the energy of the first optical signal is greater than the energy of the second optical signal.
18. The input equipment according to claim 13, characterized in that the input equipment further comprises:
a position information delivery unit, configured to transmit the position information of the tracing point to the electronic equipment, so as to control the electronic equipment to perform the first operation.
19. An electronic equipment, characterized in that the electronic equipment communicates with an input equipment, the electronic equipment comprises a display unit, and the input equipment is used to form a tracing point on the display unit so as to control the electronic equipment to perform a first operation, the electronic equipment comprising:
a first signal receiving unit, configured to receive a first optical signal from the input equipment;
a second signal generation unit, configured to generate a second optical signal according to the first optical signal, the second optical signal enabling the input equipment to determine the position information of the tracing point formed on the display unit; and
a second signal delivery unit, configured to transmit the second optical signal to the input equipment.
20. The electronic equipment according to claim 19, characterized in that the second signal generation unit converts the first optical signal into the second optical signal by down-conversion and includes first coded data in the second optical signal, the first coded data indicating the coordinate information of the tracing point.
21. The electronic equipment according to claim 19, characterized in that the second signal generation unit converts the first optical signal into the second optical signal by down-conversion and includes first image data in the second optical signal, the first image data indicating the coordinate information of the tracing point.
22. The electronic equipment according to claim 19, characterized in that the electronic equipment further comprises:
a data receiving unit, configured to receive first image data from the input equipment, the first image data indicating the coordinate information of the tracing point;
a position information determining unit, configured to obtain an overview image model of the display unit, the overview image model comprising a plurality of image data respectively indicating the coordinate information of a plurality of position points in the display unit, to compare the first image data with the overview image model, and, when the first image data matches a part of the overview image model, to determine the coordinate information of the position point associated with the matching part as the coordinate information of the tracing point and determine the coordinate information as the position information; and
a position information delivery unit, configured to transmit the position information to the input equipment.
23. The electronic equipment according to claim 19, characterized in that the wavelength of the first optical signal is less than the wavelength of the second optical signal, and the energy of the first optical signal is greater than the energy of the second optical signal.
24. The electronic equipment according to claim 19, characterized in that the electronic equipment further comprises:
a position information receiving unit, configured to receive the position information of the tracing point from the input equipment, so as to control the electronic equipment to perform the first operation.
CN201410082152.0A 2014-03-07 2014-03-07 Information processing method, input equipment and electronic equipment Active CN104898956B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201410082152.0A CN104898956B (en) 2014-03-07 2014-03-07 Information processing method, input equipment and electronic equipment

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201410082152.0A CN104898956B (en) 2014-03-07 2014-03-07 Information processing method, input equipment and electronic equipment

Publications (2)

Publication Number Publication Date
CN104898956A true CN104898956A (en) 2015-09-09
CN104898956B CN104898956B (en) 2019-10-29

Family

ID=54031643

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201410082152.0A Active CN104898956B (en) 2014-03-07 2014-03-07 Information processing method, input equipment and electronic equipment

Country Status (1)

Country Link
CN (1) CN104898956B (en)


Patent Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN1875317A (en) * 2003-11-03 2006-12-06 超级影像股份有限公司 Light emitting material integrated into a substantially transparent substrate
CN101673159A (en) * 2008-09-09 2010-03-17 三星电机株式会社 Opto-touch screen
CN102270298A (en) * 2010-06-04 2011-12-07 株式会社理光 Method and device for detecting laser point/area
CN102253737A (en) * 2011-05-31 2011-11-23 王石 Screen-vision mouse system and realizing method thereof
CN102243704A (en) * 2011-07-21 2011-11-16 我查查信息技术(上海)有限公司 Positioning system for two-dimension bar code and two-dimension bar code identification method and equipment
CN102981618A (en) * 2012-11-16 2013-03-20 Tcl集团股份有限公司 Display terminal and touch control method and touch control system

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN113261031A (en) * 2018-09-21 2021-08-13 迪奥塔索夫特公司 Device, system and method for positioning an acquisition module relative to a device to be monitored by means of a processing module
CN111078095A (en) * 2018-10-18 2020-04-28 深圳市鹰硕技术有限公司 Intelligent writing device and method based on RFID technology

Also Published As

Publication number Publication date
CN104898956B (en) 2019-10-29

Similar Documents

Publication Publication Date Title
CN101963840B (en) System and method for remote, virtual on screen input
CN106055127A (en) Active pen with bidirectional communication
US9261987B2 (en) Method of supporting multiple selections and interactive input system employing same
US9563231B2 (en) Electronic device capable of input from a detachably attached digitizer panel and input method and system therefor
CN105700733A (en) low latency inking
CN106155409A (en) Capacitive character tolerance for patterns of change processes
CN102819315A (en) 3D (three-dimension) man-machine interaction method and system
CA2425746A1 (en) Collaborative input system
CN105683994A (en) Wet ink texture engine for reduced lag digital inking
JP2012230677A (en) External touch keyboard
CN105892776A (en) Capacitive Image Sensing System
WO2014163373A1 (en) Method and apparatus for inputting text in electronic device having touchscreen
US20200371680A1 (en) Method and system for touch screen erasing
CN105183236A (en) Touch screen input device and method
CN104732124B (en) Mobile device and computing device and signature verification method
KR101622466B1 (en) System and method for providing content
CN104732193A (en) Gesture recognition system and gesture recognition method thereof
US20160041635A1 (en) Active stylus pen, electronic device and data input system
KR200477008Y1 (en) Smart phone with mouse module
CN210353736U (en) Intelligent interactive object recognition table and system
CN104898956A (en) Information processing method, input equipment and electronic equipment
CN105807955A (en) Palm rejection visualization for passive stylus
CN106775003A (en) Interactive device, color control method and device
CN101354623A (en) Method and system for implementing two-dimensional and three-dimensional pointer needle utilizing infrared image sensing
CN201556194U (en) Interaction operation input system

Legal Events

Date Code Title Description
C06 Publication
PB01 Publication
C10 Entry into substantive examination
SE01 Entry into force of request for substantive examination
GR01 Patent grant