CN110750202A - Information processing method and electronic equipment

Info

Publication number
CN110750202A
Authority
CN
China
Prior art keywords
display area
target image
electronic device
graph
target
Prior art date
Legal status
Granted
Application number
CN201910939393.5A
Other languages
Chinese (zh)
Other versions
CN110750202B (en)
Inventor
顾书露
张建辉
王元成
Current Assignee
Lenovo Beijing Ltd
Original Assignee
Lenovo Beijing Ltd
Priority date
Filing date
Publication date
Application filed by Lenovo Beijing Ltd filed Critical Lenovo Beijing Ltd
Priority to CN201910939393.5A priority Critical patent/CN110750202B/en
Publication of CN110750202A publication Critical patent/CN110750202A/en
Application granted granted Critical
Publication of CN110750202B publication Critical patent/CN110750202B/en
Legal status: Active

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F 3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F 3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F 3/048 Interaction techniques based on graphical user interfaces [GUI]
    • G06F 3/0487 Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
    • G06F 3/0488 Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures
    • G06F 3/04883 Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures for inputting data by handwriting, e.g. gesture or text
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06V IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V 30/00 Character recognition; Recognising digital ink; Document-oriented image-based pattern recognition
    • G06V 30/10 Character recognition
    • G06V 30/32 Digital ink
    • G06V 30/333 Preprocessing; Feature extraction
    • G06V 30/347 Sampling; Contour coding; Stroke extraction

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Human Computer Interaction (AREA)
  • Multimedia (AREA)
  • User Interface Of Digital Computer (AREA)

Abstract

The embodiments of the present application disclose an information processing method and an electronic device. The method includes: if a target event occurs, obtaining a target image captured by a second electronic device, the target image being an image of a display area of a first electronic device; determining the attributes of the graphics in the target image; determining the position of the target image in the display area according to the attributes of the graphics in the target image and a second association relationship, where the second association relationship represents the association between the attributes of the graphics at different positions of the display area and the positions of those graphics; and performing information processing based on the position of the target image in the display area. The target event includes at least one of the following: the second electronic device is in contact with the display area of the first electronic device; and, within a target time period, the variation in the number of graphics captured by the second electronic device is within a threshold range.

Description

Information processing method and electronic equipment
Technical Field
The embodiments of the present application relate to electronic technology, and relate to, but are not limited to, an information processing method and an electronic device.
Background
With the development of information technology, information is increasingly generated, transmitted and processed by computers rather than with traditional pen and paper. Handwriting on paper cannot be digitized directly, and it is neither convenient nor environmentally friendly, so it no longer meets users' needs.
At present, information interaction and processing are mostly performed on electronic devices with touch screens. For an electronic device without a touch screen, i.e., a device whose screen has only a display function, input can only be provided through a mouse or a keyboard. This limits the usage scenarios and degrades the user experience.
In particular, in office scenarios where content is projected to an extended screen, some screens, such as a TV (television) or a monitor, have no touch function, which is very inconvenient when the user needs to write on the screen while presenting.
Disclosure of Invention
In view of this, embodiments of the present application provide an information processing method and an electronic device.
The technical solutions of the embodiments of the present application are implemented as follows:
In a first aspect, an embodiment of the present application provides an information processing method applied to a first electronic device. The method includes: if a target event occurs, obtaining a target image captured by a second electronic device, where the target image is an image of a display area of the first electronic device;
determining the attributes of the graphics in the target image;
determining the position of the target image in the display area according to the attributes of the graphics in the target image and a second association relationship, where the second association relationship represents the association between the attributes of the graphics at different positions of the display area and the positions of those graphics;
performing information processing based on the position of the target image in the display area;
where the target event includes at least one of the following:
the second electronic device is in contact with the display area of the first electronic device;
and, within a target time period, the variation in the number of graphics captured by the second electronic device is within a threshold range.
In an embodiment of the present application, before obtaining the target image captured by the second electronic device, the method further includes: encoding the attributes of the graphics at different positions of the display area and the positions of those graphics according to an encoding rule, to obtain the second association relationship.
In an embodiment of the present application, before obtaining the target image captured by the second electronic device, the method further includes: determining a first association relationship according to an encoding rule corresponding to a graphic template, where the first association relationship represents the association between the attributes of the graphics at different positions of the graphic template and the positions of those graphics;
and determining the second association relationship according to the size of the graphic template, the size of the display area and the first association relationship.
In an embodiment of the present application, the method further includes: loading the graphic template into the display area, where the loaded graphic template has the same size as the display area;
correspondingly, performing information processing based on the position of the target image in the display area includes: processing the display content corresponding to the position of the target image in the display area.
In an embodiment of the present application, the method further includes: loading the graphic template into a sub-area of the display area, where the size of the sub-area is smaller than the size of the display area;
correspondingly, determining the position of the target image in the display area according to the attributes of the graphics in the target image and the second association relationship includes: determining the position of the target image in the sub-area according to the attributes of the graphics in the target image and the second association relationship;
correspondingly, performing information processing based on the position of the target image in the display area includes: processing the display content corresponding to the position of the target image in the sub-area.
In an embodiment of the present application, performing information processing based on the position of the target image in the display area includes: displaying trajectory information based on the position of the target image in the display area;
or erasing trajectory information based on the position of the target image in the display area;
or executing the operation corresponding to the function key located at the position of the target image in the display area.
In an embodiment of the present application, if the second electronic device continuously captures images of the display area, the method includes: obtaining a plurality of target images captured by the second electronic device;
determining the capture time of each of the plurality of target images and the position of each target image in the display area;
and performing information processing based on the capture time and the position of each target image in the display area, to display or erase continuous trajectory information.
In a second aspect, an embodiment of the present application provides an information processing method applied to a second electronic device. The method includes: if a target event occurs, capturing an image of a display area of a first electronic device;
sending the captured target image to the first electronic device, so that the first electronic device determines the position of the target image in the display area according to the attributes of the graphics in the target image and a second association relationship, and performs information processing based on the position of the target image in the display area, where the second association relationship represents the association between the attributes of the graphics at different positions of the display area and the positions of those graphics;
where the target event includes at least one of the following:
the second electronic device is in contact with the display area of the first electronic device;
and, within a target time period, the variation in the number of graphics captured by the second electronic device is within a threshold range.
In a third aspect, an embodiment of the present application provides an information processing method applied to a second electronic device. The method includes: if a target event occurs, capturing an image of a display area of a first electronic device;
determining the position of the captured target image in the display area according to the attributes of the graphics in the target image and a second association relationship, where the second association relationship represents the association between the attributes of the graphics at different positions of the display area and the positions of those graphics;
sending the position of the target image in the display area to the first electronic device, so that the first electronic device performs information processing based on the position of the target image in the display area;
where the target event includes at least one of the following:
the second electronic device is in contact with the display area of the first electronic device;
and, within a target time period, the variation in the number of graphics captured by the second electronic device is within a threshold range.
In a fourth aspect, an embodiment of the present application provides an electronic device, including a memory and a processor, where the memory stores a computer program executable on the processor, and the processor implements the steps of the information processing method when executing the program.
The embodiments of the present application provide an information processing method and an electronic device. If a target event occurs, a target image captured by a second electronic device is obtained, the target image being an image of a display area of a first electronic device; the attributes of the graphics in the target image are determined; the position of the target image in the display area is determined according to the attributes of the graphics in the target image and a second association relationship, where the second association relationship represents the association between the attributes of the graphics at different positions of the display area and the positions of those graphics; and information processing is performed based on the position of the target image in the display area. The target event includes at least one of the following: the second electronic device is in contact with the display area of the first electronic device; within a target time period, the variation in the number of graphics captured by the second electronic device is within a threshold range. In this way, a conventional display device can be given a touch-like function without being modified (i.e., without adding hardware), while erroneous touch operations are avoided.
Drawings
Fig. 1A is a first schematic flow chart of an implementation of an information processing method according to an embodiment of the present application;
Fig. 1B is a schematic diagram of the external structure of a touch pen according to an embodiment of the present application;
Fig. 1C is a first schematic diagram of a scene corresponding to an information processing method according to an embodiment of the present application;
Fig. 1D is a second schematic diagram of a scene corresponding to an information processing method according to an embodiment of the present application;
Fig. 2A is a second schematic flow chart of an implementation of an information processing method according to an embodiment of the present application;
Fig. 2B is a third schematic flow chart of an implementation of an information processing method according to an embodiment of the present application;
Fig. 3 is a first schematic structural diagram of an information processing apparatus according to an embodiment of the present application;
Fig. 4 is a second schematic structural diagram of an information processing apparatus according to an embodiment of the present application;
Fig. 5 is a third schematic structural diagram of an information processing apparatus according to an embodiment of the present application;
Fig. 6 is a schematic diagram of a hardware entity of an electronic device according to an embodiment of the present application.
Detailed Description
In order to make the objects, technical solutions and advantages of the embodiments of the present application clearer, the following will describe the specific technical solutions of the present application in further detail with reference to the accompanying drawings in the embodiments of the present application. The following examples are intended to illustrate the present application only and are not intended to limit the scope of the present application.
In the following description, suffixes such as "module", "component" or "unit" used to denote elements are used only for convenience of description and have no specific meaning by themselves; thus, "module", "component" and "unit" may be used interchangeably.
The embodiments of the present application provide an information processing method applied to a first electronic device. The functions implemented by the method may be implemented by a processor calling program code, and the program code may be stored in a storage medium of the first electronic device; thus, the first electronic device includes at least the processor and the storage medium. Fig. 1A is a first schematic flow chart of an implementation of the information processing method according to an embodiment of the present application. As shown in Fig. 1A, the method includes:
step S101, if a target event occurs, acquiring a target image acquired by second electronic equipment, wherein the target image is an image of a display area of the first electronic equipment; wherein the target event comprises at least one of: the second electronic device is in contact with a display area of the first electronic device; and in a target time period, the variation of the number of the graphs collected by the second electronic equipment is within a threshold range.
In the embodiment of the application, when the variation of the number of the graphics collected by the second electronic device is within a certain threshold range, or the second electronic device is in contact with the display area of the first electronic device, the target image collected by the second electronic device is obtained, and further information processing is performed based on the collected target object.
Here, the first electronic device may be various types of devices having information processing capability, such as a mobile phone, a Personal Digital Assistant (PDA), a navigator, a Digital phone, a video phone, a smart watch, a smart bracelet, a wearable device, a tablet computer, a notebook computer, and the like, and a computing device having information processing capability such as a Personal computer and a server cluster, and the like.
In this embodiment, the first electronic device may be an electronic device including a display device, and the second electronic device is an electronic device including a collection device. For example, the first electronic device may be a mobile phone, a tablet computer, a television, and the like, and the second electronic device may be a touch pen, a touch pen eraser, and the like. Fig. 1B is a schematic structural diagram of an appearance of a touch pen according to an embodiment of the present application, and as shown in fig. 1B, the touch pen includes a pen tip 11, a pen middle 12, and a pen tail 13. The pen tip 11 is mainly used for assisting a user in writing and positioning. The pen 12 has a camera therein, which can capture video information, collect graphics, and collect images within a certain area including the pen tip. The pen tail 13 is mainly used for assisting a user to hold the touch pen.
In this embodiment of the application, the target image acquired by the second electronic device is an image in the display area of the first electronic device. For example, when a touch pen writes in a display area, in the writing process, a camera in the touch pen collects a picture of the display area through which a pen point of the touch pen passes, wherein the pen point of the touch pen can be in contact with the display area, and the pen point of the touch pen can also be at a certain distance from the display area.
Step S102: determine the attributes of the graphics in the target image.
For example, a graphic template may be loaded into the display area of the first electronic device. The graphic template may be a dot-matrix graphic template in which the dots form graphics such as triangles, diamonds, squares, rectangles and circles, and the distances, directions and so on between the graphics differ; that is, the graphics and/or their arrangements differ, and different arrangements represent different positions. For example, the dot-matrix graphic template may consist of circular graphics whose arrangement encodes coordinate information: different distances between the graphics and different arrangement directions express different coordinate values. Of course, the dot-matrix graphic template may also include other graphics, as long as different coordinate information can be expressed; the embodiments of the present application do not limit the type of graphics, the number of graphics, other attributes of the graphics, or the encoding rule corresponding to those attributes. Alternatively, identical graphics, e.g. all circles, may carry different identifiers, e.g. identifier 1 on the first circle, identifier 2 on the second circle, and so on. On this basis, the attributes of the graphics in the target image of the display area captured by the second electronic device can be determined, i.e., attribute information such as the shape of the graphics in the target image, the distance between the graphics, the arrangement direction and the identifiers.
That is, the attributes of the graphics in the embodiments of the present application include at least one of the following: the distance between the graphics, the shape of the graphics, the size of the graphics, the number of graphics, the arrangement direction of the graphics and the identifiers of the graphics.
Step S103: determine the position of the target image in the display area according to the attributes of the graphics in the target image and a second association relationship, where the second association relationship represents the association between the attributes of the graphics at different positions of the display area and the positions of those graphics.
Here, the second association relationship represents the association between the attributes of the graphics at different positions of the display area and the positions of those graphics. The attributes of the graphics differ at different positions of the display area; different attributes correspond to different coordinate positions, and each coordinate position is the position of the graphics having those attributes.
Step S104: perform information processing based on the position of the target image in the display area.
Here, after the position of the target image in the display area is determined, information processing may be performed for that position. For example, the content displayed at the position may be displayed or hidden, or other processing may be applied to the content displayed at that position in the display area, and so on.
In some embodiments, step S104 of performing information processing based on the position of the target image in the display area includes: displaying trajectory information based on the position of the target image in the display area; or erasing trajectory information based on the position of the target image in the display area; or executing the operation corresponding to the function key located at the position of the target image in the display area.
In the embodiments of the present application, if a target event occurs, a target image captured by the second electronic device is obtained, the target image being an image of the display area of the first electronic device; the attributes of the graphics in the target image are determined; the position of the target image in the display area is determined according to the attributes of the graphics in the target image and the second association relationship, where the second association relationship represents the association between the attributes of the graphics at different positions of the display area and the positions of those graphics; and information processing is performed based on the position of the target image in the display area. The target event includes at least one of the following: the second electronic device is in contact with the display area of the first electronic device; within a target time period, the variation in the number of graphics captured by the second electronic device is within a threshold range. In this way, a conventional display device can be given a touch-like function without being modified (i.e., without adding hardware), while erroneous touch operations are avoided.
Based on the foregoing embodiments, an embodiment of the present application further provides an information processing method applied to a first electronic device. The method includes:
Step S111: encode the attributes of the graphics at different positions of the display area and the positions of those graphics according to an encoding rule, to obtain a second association relationship.
Here, the attributes of the graphics at different positions of the display area and the positions of those graphics may be encoded according to a certain encoding rule to obtain the second association relationship. That is to say, the attributes of the graphics corresponding to different positions of the display area are different, and there is a one-to-one correspondence between attributes and positions.
Step S112: if a target event occurs, obtain a target image captured by a second electronic device, where the target image is an image of the display area of the first electronic device; the target event includes at least one of the following: the second electronic device is in contact with the display area of the first electronic device; within a target time period, the variation in the number of graphics captured by the second electronic device is within a threshold range.
Step S113: determine the attributes of the graphics in the target image.
Step S114: determine the position of the target image in the display area according to the attributes of the graphics in the target image and the second association relationship, where the second association relationship represents the association between the attributes of the graphics at different positions of the display area and the positions of those graphics.
Step S115: perform information processing based on the position of the target image in the display area.
Based on the foregoing embodiments, an embodiment of the present application further provides an information processing method applied to a first electronic device. The method includes:
Step S121: determine a first association relationship according to the encoding rule corresponding to a graphic template, where the first association relationship represents the association between the attributes of the graphics at different positions of the graphic template and the positions of those graphics.
Here, the graphic template may be a preset graphic template, and the correspondence between the attributes of the graphics at different positions of the graphic template and the positions of those graphics can be determined by a preset encoding rule. Different graphic templates may have different encoding rules.
Step S122: determine the second association relationship according to the size of the graphic template, the size of the display area and the first association relationship.
Here, the graphic template is matched to the display area of the first electronic device, so as to determine the correspondence between the attributes of the graphics at different positions of the display area and the positions of those graphics.
In this embodiment, the second electronic device may be a touch pen, which has a visual capability (its built-in camera) and can recognize coordinates by means of the graphic template on the screen. For screens of different sizes, the number of template graphics that appear in the camera's field of view differs. For example, if the graphic template is a dot-matrix graphic template and the same template is applied to display screens of different sizes, the larger the screen, the sparser the dots on it, and the smaller the screen, the denser the dots on it. If the distance between the camera in the touch pen and the screen is the same, the size of the captured area is the same; therefore, on a large screen there are fewer dots per unit area and the located position (coordinate) is coarser, while on a small screen there are more dots per unit area and the located position (coordinate) is more precise. That is, the factors that affect the accuracy of locating the pen-tip position (coordinate) are the distance from the pen tip to the screen and the number of dots per unit area.
In some embodiments, step S122 of determining the second association relationship according to the size of the graphic template, the size of the display area and the first association relationship may be implemented as follows:
Step S1221: determine the ratio between the size of the graphic template and the size of the display area.
Generally, there are many types of electronic devices, such as mobile phones, tablet computers, all-in-one computers and televisions, and the sizes of their display areas differ considerably; for example, the display area of a mobile phone is much smaller than that of a computer. Even for electronic devices of the same type, the display areas may differ in size because of differences between manufacturers or between production batches of the same manufacturer; for example, different models of mobile phones have display areas of different sizes. However, the size of the graphic template is generally fixed, so the graphic template needs to be matched to the display area of the first electronic device. The larger the display area, the sparser the graphics per unit area and the lower the positioning accuracy; the smaller the display area, the denser the graphics per unit area and the higher the positioning accuracy.
Step S1222: determine the second association relationship according to the ratio and the first association relationship.
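The derivation of the second association relationship from the first one can be illustrated by scaling each template position by the width and height ratios between the display area and the graphic template. The following Python sketch assumes uniform scaling and illustrative attribute keys; it is not a definitive implementation of this step.

# Minimal sketch: derive the display-area association (second) from the
# template association (first) by scaling positions with the size ratios.
from typing import Dict, Tuple

Attrs = Tuple[str, int, int]      # e.g. (shape, spacing, direction)
Point = Tuple[float, float]

def scale_association(first: Dict[Attrs, Point],
                      template_size: Tuple[float, float],
                      display_size: Tuple[float, float]) -> Dict[Attrs, Point]:
    """Map each template position to the corresponding display-area position."""
    rx = display_size[0] / template_size[0]   # width ratio
    ry = display_size[1] / template_size[1]   # height ratio
    return {attrs: (x * rx, y * ry) for attrs, (x, y) in first.items()}

if __name__ == "__main__":
    first = {("circle", 10, 0): (50.0, 50.0), ("circle", 12, 0): (100.0, 50.0)}
    second = scale_association(first, template_size=(200, 100), display_size=(1920, 1080))
    print(second[("circle", 12, 0)])  # (960.0, 540.0)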
Here, the graphic template may be loaded into the whole display area or into part of the display area, depending on actual needs. Loading the graphic template into the display area includes: loading the graphic template into the window of an application program of the first electronic device, loading it into part of the display area of an application program of the first electronic device, loading it into the background of the entire screen of the first electronic device, and so on.
Step S123: if a target event occurs, obtain a target image captured by the second electronic device, where the target image is an image of the display area of the first electronic device; the target event includes at least one of the following: the second electronic device is in contact with the display area of the first electronic device; within a target time period, the variation in the number of graphics captured by the second electronic device is within a threshold range.
Here, saying that the target image is an image of the display area means that the subject of the target image is the display area, i.e., the captured content is content in the display area. Generally, the content of the target image is a portion of the content of the display area.
Here, the display area may be the entire display area corresponding to the display device of the first electronic device, or a partial area of that entire display area.
Step S124: determine the attributes of the graphics in the target image.
Here, the attributes of the graphics include at least one of the following: the distance between the graphics, the shape of the graphics, the size of the graphics, the number of graphics, the arrangement direction of the graphics, the identifiers of the graphics, and so on.
Step S125: determine the position of the target image in the display area according to the attributes of the graphics in the target image and the second association relationship, where the second association relationship represents the association between the attributes of the graphics at different positions of the display area and the positions of those graphics.
Step S126: perform information processing based on the position of the target image in the display area.
Here, after the position of the target image in the display area is determined, information processing may be performed based on that position. Thus, although the display device of the first electronic device has no touch-screen function, after an input is made with the touch pen in the display area, the input trajectory can be displayed based on the input position, or hidden based on the input position; that is, the display device gains a touch-writing function.
Based on the foregoing embodiments, an embodiment of the present application further provides an information processing method applied to a first electronic device. The method includes:
Step S131: determine a first association relationship according to the encoding rule corresponding to a graphic template, where the first association relationship represents the association between the attributes of the graphics at different positions of the graphic template and the positions of those graphics.
Step S132: load the graphic template into the display area, where the loaded graphic template has the same size as the display area.
Here, loading the graphic template into the display area means loading the graphic template into the entire display area of the first electronic device, or into the entire display area of an APP (application program) of the first electronic device.
Step S133: determine the second association relationship according to the size of the graphic template, the size of the display area and the first association relationship.
Step S134: if a target event occurs, obtain a target image captured by the second electronic device, where the target image is an image of the display area of the first electronic device; the target event includes at least one of the following: the second electronic device is in contact with the display area of the first electronic device; within a target time period, the variation in the number of graphics captured by the second electronic device is within a threshold range.
Here, if the graphic template is loaded into the entire display area of the first electronic device, any captured image of the display area is a valid target image. If the graphic template is loaded into the entire display area of an APP of the first electronic device, captured images within the display area of that APP are valid target images, while captured images outside the display area of that APP are invalid target images.
Step S135: determine the attributes of the graphics in the target image.
Step S136: determine the position of the target image in the display area according to the attributes of the graphics in the target image and the second association relationship, where the second association relationship represents the association between the attributes of the graphics at different positions of the display area and the positions of those graphics.
For example, suppose the graphic template is loaded into the entire display area of the first electronic device and the user is viewing a file. If the position of the target image, identified from the captured target image, corresponds to the close button of the file, then the display position corresponding to the position of the target image in the display area is the position of the file's close button; in this case the file can be closed, which is equivalent to the user clicking the close button. As another example, if the graphic template is loaded into the entire display area of a music APP of the first electronic device and the user is listening to music, and the position of the target image identified from the captured target image corresponds to the fast-forward key of the music APP, then a fast-forward operation can be performed on the song being played.
Step S137: process the display content corresponding to the position of the target image in the display area.
In some embodiments, if the second electronic device continuously captures images of the display area, the method includes:
Step S11: obtain a plurality of target images captured by the second electronic device.
Step S12: determine the capture time of each of the plurality of target images and the position of each target image in the display area.
Step S13: perform information processing based on the capture time and the position of each target image in the display area, to display or erase continuous trajectory information.
Fig. 1C is a first schematic diagram of a scene corresponding to the information processing method according to an embodiment of the present application. As shown in Fig. 1C, the graphic template may be loaded into the background of drawing software. A plurality of target images captured by the second electronic device are received, the capture time of each target image and the position of each target image in the drawing software are determined, all the corresponding positions are then connected in the chronological order of the captured images, and the connected content is displayed. The drawing software thus presents the notes 11 entered by the user. Here, as the pen slides over the screen, coordinate values can be calculated and connected to form different characters; the pen has Bluetooth and connects to the computer through it. Suppose the camera in the pen can capture 30 pictures per second (1 s) and transmit them to the computer, and the pen slides 2 cm in 1 s: 30 pictures are taken (each one different), each containing 60 by 60 dots, and the shapes, numbers, distances and directions of the graphics formed by the dots all differ and represent different coordinates, i.e. 30 coordinates. The computer feeds the coordinate information back to the drawing software, which directly generates coordinate points from the coordinate values and then connects the points into a line. The Bluetooth module built into the pen is used to transmit the captured target images. Of course, the pen may also be provided with a processing unit that determines the coordinates of the target image on the screen from the captured image and then sends the processed coordinate information to the computer.
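As a sketch of the stroke reconstruction described above (e.g. 30 captures per second connected in chronological order), the code below sorts decoded coordinates by capture time and joins consecutive points into line segments that drawing software could render; the data layout is an assumption.

# Sketch: turn timestamped decoded coordinates into a continuous stroke by
# connecting consecutive points in capture-time order (assumed data layout).
from typing import List, Tuple

Sample = Tuple[float, float, float]         # (capture_time_s, x, y)
Segment = Tuple[Tuple[float, float], Tuple[float, float]]

def build_stroke(samples: List[Sample]) -> List[Segment]:
    ordered = sorted(samples, key=lambda s: s[0])          # chronological order
    points = [(x, y) for _, x, y in ordered]
    return list(zip(points, points[1:]))                   # consecutive point pairs

if __name__ == "__main__":
    # e.g. 30 samples captured in one second while the pen slides across the template
    samples = [(t / 30.0, 100.0 + 2.0 * t, 200.0) for t in range(30)]
    for seg in build_stroke(samples)[:3]:
        print(seg)   # the drawing software would render each segment as a line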
Based on the foregoing embodiments, an embodiment of the present application further provides an information processing method applied to a first electronic device. The method includes:
Step S141: determine a first association relationship according to the encoding rule corresponding to a graphic template, where the first association relationship represents the association between the attributes of the graphics at different positions of the graphic template and the positions of those graphics.
Step S142: load the graphic template into a sub-area of the display area, where the size of the sub-area is smaller than the size of the display area.
Here, loading the graphic template into a sub-area of the display area means loading the graphic template into a sub-area of the entire display area of the first electronic device, or into a sub-area of the display area of a certain APP of the first electronic device.
Step S143: determine the second association relationship according to the size of the graphic template, the size of the display area and the first association relationship.
Here, determining the second association relationship according to the size of the graphic template, the size of the display area and the first association relationship includes: determining the second association relationship according to the size of the graphic template, the size of the sub-area of the display area and the first association relationship.
Step S144: if a target event occurs, obtain a target image captured by the second electronic device, where the target image is an image of the display area of the first electronic device; the target event includes at least one of the following: the second electronic device is in contact with the display area of the first electronic device; within a target time period, the variation in the number of graphics captured by the second electronic device is within a threshold range.
Step S145: determine the attributes of the graphics in the target image.
Step S146: determine the position of the target image in the sub-area according to the attributes of the graphics in the target image and the second association relationship, where the second association relationship represents the association between the attributes of the graphics at different positions of the display area and the positions of those graphics.
Step S147: process the display content corresponding to the position of the target image in the sub-area.
Fig. 1D is a second schematic diagram of a scene corresponding to the information processing method according to an embodiment of the present application. As shown in Fig. 1D, the graphic template is loaded into a sub-area 12 of the display area of the movie APP 11 (the sub-area 12 is an operation area; no graphic template is loaded in the playing area of the display area of the movie APP 11, so the template does not visually affect the video being played). The user then performs an image-capture operation on the sub-area 12. The position of the target image in the sub-area 12 is determined according to the attributes of the captured target image and the second association relationship, and the display position corresponding to the position of the target image is the position of the pause/play button in the movie APP 11. If the video was playing before the target image was captured, processing the display content at that display position includes: pausing the video. If the video was paused before the target image was captured, processing the display content at that display position includes: playing the video.
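A minimal sketch of the Fig. 1D scenario, assuming the operation sub-area exposes a pause/play button rectangle and a playback state; the class and method names are hypothetical and not part of the patent.

# Hypothetical sketch of the Fig. 1D scenario: the decoded position inside the
# operation sub-area is hit-tested against the pause/play button and toggles playback.
from typing import Dict, Tuple

Rect = Tuple[int, int, int, int]  # (x, y, width, height), relative to the sub-area

class MoviePlayer:
    def __init__(self) -> None:
        self.playing = True
        self.buttons: Dict[str, Rect] = {"pause_play": (10, 10, 60, 60)}  # assumed layout

    def handle_position(self, pos: Tuple[int, int]) -> None:
        x, y = pos
        bx, by, bw, bh = self.buttons["pause_play"]
        if bx <= x < bx + bw and by <= y < by + bh:
            # Playing before capture -> pause; paused before capture -> play.
            self.playing = not self.playing

if __name__ == "__main__":
    player = MoviePlayer()
    player.handle_position((30, 30))
    print("paused" if not player.playing else "playing")   # paused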
Based on the foregoing embodiments, an embodiment of the present application further provides an information processing method applied to a second electronic device. The functions implemented by the method may be implemented by a processor calling program code, and the program code may be stored in a storage medium of the second electronic device; thus, the second electronic device includes at least the processor and the storage medium. Fig. 2A is a second schematic flow chart of an implementation of an information processing method according to an embodiment of the present application. As shown in Fig. 2A, the method includes:
Step S201: if a target event occurs, capture an image of the display area of the first electronic device; the target event includes at least one of the following: the second electronic device is in contact with the display area of the first electronic device; within a target time period, the variation in the number of graphics captured by the second electronic device is within a threshold range.
Here, when the second electronic device touches the display area of the first electronic device, i.e., the pen tip is in contact with the screen, the front end of the pen is considered to have touched the screen.
For example, if within a certain period of time the number of graphics seen by the pen's camera stays unchanged, or fluctuates only slightly, with the fluctuation within 10%, the front end of the pen is considered to have touched the screen. This avoids erroneous operations on a high-definition screen caused by a pen that has not actually touched the screen and is not writing.
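A sketch of this contact criterion: over a short time window, the pen tip is treated as touching the screen only if the number of graphics visible to the pen's camera fluctuates by no more than roughly 10%. The window length, counting method and function name are assumptions.

# Sketch of the contact test: within a time window, the pen tip is considered to be
# touching the screen if the visible graphic count varies within a threshold (e.g. 10%).
from typing import List

def tip_in_contact(graphic_counts: List[int], threshold: float = 0.10) -> bool:
    """graphic_counts: graphics counted in each frame of the target time period."""
    if not graphic_counts:
        return False
    lo, hi = min(graphic_counts), max(graphic_counts)
    if hi == 0:
        return False
    return (hi - lo) / hi <= threshold    # variation stays within the threshold range

if __name__ == "__main__":
    print(tip_in_contact([60, 59, 61, 60]))   # True  -> treat as contact, accept input
    print(tip_in_contact([60, 45, 80, 20]))   # False -> pen still moving toward the screen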
Step S202: send the captured target image to the first electronic device, so that the first electronic device determines the position of the target image in the display area according to the attributes of the graphics in the target image and the second association relationship, and performs information processing based on the position of the target image in the display area; the second association relationship represents the association between the attributes of the graphics at different positions of the display area and the positions of those graphics.
Here, the second electronic device has an image-capture function; therefore, the captured target image can be sent to the first electronic device, so that the first electronic device determines the position of the target image in the display area according to the attributes of the graphics in the target image and the second association relationship, and performs information processing based on the position of the target image in the display area.
In the embodiments of the present application, if a target event occurs, an image of the display area of the first electronic device is captured, and the captured target image is sent to the first electronic device, so that the first electronic device determines the position of the target image in the display area according to the attributes of the graphics in the target image and the second association relationship, and performs information processing based on the position of the target image in the display area, where the second association relationship represents the association between the attributes of the graphics at different positions of the display area and the positions of those graphics. The target event includes at least one of the following: the second electronic device is in contact with the display area of the first electronic device; within a target time period, the variation in the number of graphics captured by the second electronic device is within a threshold range. In this way, a conventional display device can be given a touch-like function without being modified (i.e., without adding hardware), while erroneous touch operations are avoided.
Based on the foregoing embodiments, an embodiment of the present application further provides an information processing method applied to a second electronic device. Fig. 2B is a third schematic flow chart of an implementation of an information processing method according to an embodiment of the present application. As shown in Fig. 2B, the method includes:
Step S211: if a target event occurs, capture an image of the display area of the first electronic device; the target event includes at least one of the following: the second electronic device is in contact with the display area of the first electronic device; within a target time period, the variation in the number of graphics captured by the second electronic device is within a threshold range.
Here, when the second electronic device touches the display area of the first electronic device, i.e., the pen tip is in contact with the screen, the front end of the pen is considered to have touched the screen.
For example, if within a certain period of time the number of graphics seen by the pen's camera stays unchanged, or fluctuates only slightly, with the fluctuation within 10%, the front end of the pen is considered to have touched the screen. This avoids erroneous operations on a high-definition screen caused by a pen that has not actually touched the screen and is not writing.
Step S212: determine the position of the target image in the display area according to the attributes of the graphics in the captured target image and the second association relationship, where the second association relationship represents the association between the attributes of the graphics at different positions of the display area and the positions of those graphics.
Here, a camera, a processor and other components may be provided in the second electronic device, so that the second electronic device has not only an image-capture function but also an information processing function. It can therefore determine the position of the target image in the display area of the first electronic device directly from the attributes of the graphics in the captured target image and the second association relationship, and then send that position to the first electronic device so that the first electronic device performs information processing based on the position of the target image in the display area.
Step S213: send the position of the target image in the display area to the first electronic device, so that the first electronic device performs information processing based on the position of the target image in the display area.
In the embodiments of the present application, if a target event occurs, an image of the display area of the first electronic device is captured; the position of the target image in the display area is determined according to the attributes of the graphics in the captured target image and the second association relationship, where the second association relationship represents the association between the attributes of the graphics at different positions of the display area and the positions of those graphics; and the position of the target image in the display area is sent to the first electronic device, so that the first electronic device performs information processing based on the position of the target image in the display area. The target event includes at least one of the following: the second electronic device is in contact with the display area of the first electronic device; within a target time period, the variation in the number of graphics captured by the second electronic device is within a threshold range. In this way, a conventional display device can be given a touch-like function without being modified (i.e., without adding hardware), while erroneous touch operations are avoided.
Based on the foregoing embodiments, an embodiment of the present application provides an information processing apparatus. The apparatus includes the units described below and the modules included in those units, and may be implemented by a processor in an electronic device; of course, it may also be implemented by specific logic circuits. In implementation, the processor may be a CPU (Central Processing Unit), an MPU (Microprocessor Unit), a DSP (Digital Signal Processor), an FPGA (Field Programmable Gate Array), or the like.
Fig. 3 is a first schematic structural diagram of an information processing apparatus according to an embodiment of the present application. As shown in Fig. 3, the apparatus 300 includes:
an obtaining unit 301, configured to obtain a target image captured by a second electronic device if a target event occurs, where the target image is an image of a display area of the first electronic device;
an attribute determining unit 302, configured to determine the attributes of the graphics in the target image;
a position determining unit 303, configured to determine the position of the target image in the display area according to the attributes of the graphics in the target image and a second association relationship, where the second association relationship represents the association between the attributes of the graphics at different positions of the display area and the positions of those graphics; and
a processing unit 304, configured to perform information processing based on the position of the target image in the display area;
where the target event includes at least one of the following:
the second electronic device is in contact with the display area of the first electronic device;
and, within a target time period, the variation in the number of graphics captured by the second electronic device is within a threshold range.
In some embodiments, the apparatus further comprises:
an encoding unit, configured to encode the attributes of the graphics at different positions of the display area and the positions of those graphics according to an encoding rule, to obtain the second association relationship.
In some embodiments, the apparatus further comprises:
a first association relationship determining unit, configured to determine a first association relationship according to the encoding rule corresponding to the graphic template, where the first association relationship represents the association between the attributes of the graphics at different positions of the graphic template and the positions of those graphics; and
a second association relationship determining unit, configured to determine the second association relationship according to the size of the graphic template, the size of the display area and the first association relationship.
In some embodiments, the apparatus further comprises:
a first loading unit, configured to load the graphic template into the display area, where the loaded graphic template has the same size as the display area;
correspondingly, the processing unit 304 includes:
a first processing module, configured to process the display content corresponding to the position of the target image in the display area.
In some embodiments, the apparatus further comprises:
a second loading unit, configured to load the graphic template into a sub-area of the display area, where the size of the sub-area is smaller than the size of the display area;
correspondingly, the position determining unit 303 includes:
a sub-area position determining module, configured to determine the position of the target image in the sub-area according to the attributes of the graphics in the target image and the second association relationship;
correspondingly, the processing unit 304 includes:
a second processing module, configured to process the display content corresponding to the position of the target image in the sub-area.
In some embodiments, the processing unit 304 includes:
the first display module is used for displaying track information based on the position of the target image in the display area;
and the second display module is used for erasing track information based on the position of the target image in the display area.
In some embodiments, if the second electronic device performs continuous image acquisition on the display area, the apparatus comprises:
a continuous acquisition unit, configured to acquire a plurality of target images captured by the second electronic device;
a continuous determining unit, configured to determine the acquisition time of each of the plurality of target images and the position of each target image in the display area;
and a continuous processing unit, configured to perform information processing based on the acquisition time and the position of each target image in the display area, so as to display or erase continuous trajectory information.
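The continuous-acquisition units can be illustrated with the sketch below, which orders timestamped positions by acquisition time and applies a drawing or erasing action to each consecutive segment; the sample data and the action callable are assumptions for illustration only.

```python
def process_continuous(samples, action):
    """samples: iterable of (acquisition_time, (x, y)); action: applied to each consecutive segment."""
    ordered = sorted(samples, key=lambda s: s[0])             # order by acquisition time
    for (_, p0), (_, p1) in zip(ordered, ordered[1:]):
        action(p0, p1)                                        # e.g. draw or erase the segment p0 -> p1

process_continuous(
    [(0.02, (12, 40)), (0.00, (10, 40)), (0.04, (15, 41))],
    action=lambda a, b: print(f"segment {a} -> {b}"),
)
```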
Based on the foregoing embodiments, an embodiment of the present application further provides an information processing apparatus. The units included in the apparatus may be implemented by a processor in an electronic device, or by specific logic circuits; in implementation, the processor may be a CPU, an MPU, a DSP, an FPGA, or the like.
Fig. 4 is a schematic diagram of a second configuration of an information processing apparatus according to an embodiment of the present application. As shown in Fig. 4, the apparatus 400 includes:
an acquisition unit 401, configured to perform image acquisition on a display area of the first electronic device if a target event occurs;
a sending unit 402, configured to send the captured target image to the first electronic device, so that the first electronic device determines, according to an attribute of a graphic in the target image and a second association relationship, a position of the target image in the display area, and performs information processing based on the position of the target image in the display area; the second association relationship represents the association between the attribute of a graphic at different positions of the display area and the position of that graphic;
wherein the target event comprises at least one of:
the second electronic device is in contact with a display area of the first electronic device;
and, within a target time period, the variation in the number of graphics captured by the second electronic device is within a threshold range.
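A sketch of the second-device side shown in Fig. 4: on a target event the device captures the display area and transmits the raw image, leaving position decoding to the first electronic device. The camera and link objects are stand-ins for illustration, not real device APIs.

```python
class DummyCamera:
    """Stand-in for the image sensor that views the display area under the device tip."""
    def capture(self):
        return b"raw-image-bytes"

class DummyLink:
    """Stand-in for the channel to the first electronic device."""
    def send(self, payload):
        print(f"sent {len(payload)} bytes to the first electronic device")

def on_target_event(camera, link):
    link.send(camera.capture())                               # ship the raw image; decoding happens remotely

on_target_event(DummyCamera(), DummyLink())
```

The design point this illustrates is that, in the Fig. 4 variant, the heavier decoding work stays on the first electronic device.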
Based on the foregoing embodiments, an embodiment of the present application further provides an information processing apparatus. The units included in the apparatus may be implemented by a processor in an electronic device, or by specific logic circuits; in implementation, the processor may be a CPU, an MPU, a DSP, an FPGA, or the like.
Fig. 5 is a schematic diagram of a third configuration of an information processing apparatus according to an embodiment of the present application. As shown in Fig. 5, the apparatus 500 includes:
an acquisition unit 501, configured to capture an image of a display area of the first electronic device if a target event occurs;
a determining unit 502, configured to determine, according to an attribute of a graphic in the captured target image and the second association relationship, a position of the target image in the display area; the second association relationship represents the association between the attribute of a graphic at different positions of the display area and the position of that graphic;
a sending unit 503, configured to send the position of the target image in the display area to the first electronic device, so that the first electronic device performs information processing based on the position of the target image in the display area;
wherein the target event comprises at least one of:
the second electronic device is in contact with a display area of the first electronic device;
and, within a target time period, the variation in the number of graphics captured by the second electronic device is within a threshold range.
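By contrast with Fig. 4, the variant in Fig. 5 decodes the position on the second device and sends only coordinates, trading a local copy of the second association relationship for far less data on the link. The sketch below assumes the attribute has already been extracted from the captured image; all names are illustrative assumptions.

```python
class DummyLink:
    """Stand-in for the channel to the first electronic device."""
    def send(self, payload):
        print(f"sent position {payload}")

def on_target_event(image_attribute, second_association, link):
    position = second_association.get(image_attribute)        # decode locally on the second device
    if position is not None:
        link.send(position)                                    # only (x, y) crosses the link, not the image

on_target_event(image_attribute=1, second_association={1: (192, 0)}, link=DummyLink())
```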
The above description of the apparatus embodiments is similar to that of the method embodiments and has similar beneficial effects. For technical details not disclosed in the apparatus embodiments of the present application, reference is made to the description of the method embodiments of the present application.
In the embodiments of the present application, if the information processing method is implemented in the form of a software functional module and sold or used as a standalone product, it may also be stored in a computer-readable storage medium. Based on this understanding, the technical solutions of the embodiments of the present application, in essence or the portions contributing to the prior art, may be embodied in the form of a software product stored in a storage medium, including several instructions for causing an electronic device (which may be a personal computer, a server, or the like) to execute all or part of the methods described in the embodiments of the present application. The aforementioned storage medium includes various media capable of storing program code, such as a USB flash drive, a removable hard disk, a ROM (Read Only Memory), a magnetic disk, or an optical disk. Thus, the embodiments of the present application are not limited to any specific combination of hardware and software.
Correspondingly, an embodiment of the present application provides an electronic device, which includes a memory and a processor, where the memory stores a computer program that can be run on the processor, and the processor executes the computer program to implement the steps in the information processing method provided in the foregoing embodiment.
Correspondingly, an embodiment of the present application provides a readable storage medium on which a computer program is stored, the computer program, when executed by a processor, implementing the steps of the information processing method described above.
Here, it should be noted that the above description of the storage medium and device embodiments is similar to that of the method embodiments and has similar beneficial effects. For technical details not disclosed in the storage medium and device embodiments of the present application, reference is made to the description of the method embodiments of the present application.
It should be noted that Fig. 6 is a schematic diagram of a hardware entity of an electronic device according to an embodiment of the present application. As shown in Fig. 6, the hardware entity of the electronic device 600 includes: a processor 601, a communication interface 602, and a memory 603, wherein:
The processor 601 generally controls the overall operation of the electronic device 600.
The communication interface 602 may enable the electronic device to communicate with other terminals or servers via a network.
The memory 603 is configured to store instructions and applications executable by the processor 601, and may also buffer data (e.g., image data, audio data, voice communication data, and video communication data) to be processed or already processed by the processor 601 and the modules in the electronic device 600; it may be implemented by a flash memory or a RAM (Random Access Memory).
It should be appreciated that reference throughout this specification to "one embodiment" or "an embodiment" means that a particular feature, structure or characteristic described in connection with the embodiment is included in at least one embodiment of the present application. Thus, the appearances of the phrases "in one embodiment" or "in an embodiment" in various places throughout this specification are not necessarily all referring to the same embodiment. Furthermore, the particular features, structures, or characteristics may be combined in any suitable manner in one or more embodiments. It should be understood that, in the various embodiments of the present application, the sequence numbers of the above-mentioned processes do not mean the execution sequence, and the execution sequence of each process should be determined by its function and inherent logic, and should not constitute any limitation to the implementation process of the embodiments of the present application. The above-mentioned serial numbers of the embodiments of the present application are merely for description and do not represent the merits of the embodiments.
It should be noted that, in this document, the terms "comprises," "comprising," or any other variation thereof are intended to cover a non-exclusive inclusion, such that a process, method, article, or apparatus that comprises a list of elements does not include only those elements but may include other elements not expressly listed or inherent to such process, method, article, or apparatus. Without further limitation, an element defined by the phrase "comprising a(n) ..." does not exclude the presence of other like elements in the process, method, article, or apparatus that comprises the element.
In the several embodiments provided in the present application, it should be understood that the disclosed apparatus and method may be implemented in other ways. The above-described device embodiments are merely illustrative; for example, the division of the units is only a logical functional division, and there may be other divisions in actual implementation, such as: multiple units or components may be combined or integrated into another system, or some features may be omitted or not implemented. In addition, the coupling, direct coupling, or communication connection between the components shown or discussed may be through some interfaces, and the indirect coupling or communication connection between the devices or units may be in electrical, mechanical, or other forms.
The units described as separate parts may or may not be physically separate, and parts displayed as units may or may not be physical units; they may be located in one place or distributed over a plurality of network units; some or all of the units may be selected according to actual needs to achieve the purpose of the solution of the embodiment.
In addition, all functional units in the embodiments of the present application may be integrated into one processing unit, or each unit may serve as a separate unit, or two or more units may be integrated into one unit; the integrated unit may be implemented in the form of hardware, or in the form of hardware plus a software functional unit.
Those of ordinary skill in the art will understand that all or part of the steps of the method embodiments may be implemented by hardware related to program instructions; the program may be stored in a computer-readable storage medium and, when executed, performs the steps of the method embodiments; and the aforementioned storage medium includes various media that can store program code, such as a removable memory device, a ROM (Read Only Memory), a magnetic disk, or an optical disk.
Alternatively, if the above integrated units of the present application are implemented in the form of software functional modules and sold or used as independent products, they may also be stored in a computer-readable storage medium. Based on this understanding, the technical solutions of the embodiments of the present application, in essence or the portions contributing to the prior art, may be embodied in the form of a software product stored in a storage medium, including several instructions for causing an electronic device (which may be a personal computer, a server, or the like) to execute all or part of the methods described in the embodiments of the present application. The aforementioned storage medium includes a removable storage device, a ROM, a magnetic or optical disk, or other various media that can store program code.
The methods disclosed in the several method embodiments provided in the present application may be combined arbitrarily without conflict to obtain new method embodiments.
Features disclosed in several of the product embodiments provided in the present application may be combined in any combination to yield new product embodiments without conflict.
The features disclosed in the several method or apparatus embodiments provided in the present application may be combined arbitrarily, without conflict, to arrive at new method embodiments or apparatus embodiments.
The above description covers only the embodiments of the present application, but the scope of the present application is not limited thereto; any changes or substitutions that a person skilled in the art can easily conceive of within the technical scope disclosed in the present application shall be covered by the scope of the present application. Therefore, the protection scope of the present application shall be subject to the protection scope of the claims.

Claims (10)

1. An information processing method, applied to a first electronic device, the method comprising:
if a target event occurs, obtaining a target image captured by a second electronic device, wherein the target image is an image of a display area of the first electronic device;
determining an attribute of a graphic in the target image;
determining a position of the target image in the display area according to the attribute of the graphic in the target image and a second association relationship; wherein the second association relationship represents the association between the attribute of a graphic at different positions of the display area and the position of that graphic;
performing information processing based on the position of the target image in the display area;
wherein the target event comprises at least one of:
the second electronic device is in contact with a display area of the first electronic device;
and, within a target time period, the variation in the number of graphics captured by the second electronic device is within a threshold range.
2. The method of claim 1, prior to said obtaining a target image captured by a second electronic device, further comprising:
encoding, according to an encoding rule, the attributes of graphics at different positions of the display area together with the positions of those graphics, to obtain the second association relationship.
3. The method of claim 1, prior to said obtaining a target image captured by a second electronic device, further comprising:
determining a first association relationship according to an encoding rule corresponding to a graphic template; wherein the first association relationship indicates the association between the attribute of a graphic at different positions of the graphic template and the position of that graphic;
and determining the second association relationship according to the size of the graphic template, the size of the display area, and the first association relationship.
4. The method of claim 3, further comprising: loading the graphic template into the display area; wherein the size of the loaded graphic template is the same as that of the display area;
correspondingly, the performing information processing based on the position of the target image in the display area comprises: processing the display content corresponding to the position of the target image in the display area.
5. The method of claim 3, further comprising: loading the graphic template into a sub-region of the display area; wherein the size of the sub-region is smaller than the size of the display area;
correspondingly, the determining the position of the target image in the display area according to the attribute of the graphic in the target image and the second association relationship comprises: determining the position of the target image in the sub-region according to the attribute of the graphic in the target image and the second association relationship;
correspondingly, the performing information processing based on the position of the target image in the display area comprises: processing the display content corresponding to the position of the target image in the sub-region.
6. The method of claim 1, the performing information processing based on the position of the target image in the display area comprising: displaying trajectory information based on the position of the target image in the display area;
or erasing trajectory information based on the position of the target image in the display area;
or, based on the position of the target image in the display area, executing an operation corresponding to a function key at that position.
7. The method of claim 6, if the second electronic device performs continuous image acquisition of the display area, the method comprising:
acquiring a plurality of target images captured by the second electronic device;
determining the acquisition time of each target image in the plurality of target images and the position of each target image in the display area;
and performing information processing based on the acquisition time and the position of each target image in the display area, so as to display or erase continuous trajectory information.
8. An information processing method applied to a second electronic device, the method comprising:
if a target event occurs, capturing an image of a display area of a first electronic device;
sending the captured target image to the first electronic device, so that the first electronic device determines a position of the target image in the display area according to an attribute of a graphic in the target image and a second association relationship, and performs information processing based on the position of the target image in the display area; wherein the second association relationship represents the association between the attribute of a graphic at different positions of the display area and the position of that graphic;
wherein the target event comprises at least one of:
the second electronic device is in contact with a display area of the first electronic device;
and, within a target time period, the variation in the number of graphics captured by the second electronic device is within a threshold range.
9. An information processing method applied to a second electronic device, the method comprising:
if a target event occurs, capturing an image of a display area of a first electronic device;
determining a position of the target image in the display area according to an attribute of a graphic in the captured target image and a second association relationship; wherein the second association relationship represents the association between the attribute of a graphic at different positions of the display area and the position of that graphic;
sending the position of the target image in the display area to the first electronic device so that the first electronic device performs information processing based on the position of the target image in the display area;
wherein the target event comprises at least one of:
the second electronic device is in contact with a display area of the first electronic device;
and, within a target time period, the variation in the number of graphics captured by the second electronic device is within a threshold range.
10. An electronic device comprising a memory and a processor, the memory storing a computer program operable on the processor, the processor implementing the steps in the information processing method of any one of claims 1 to 9 when executing the program.
CN201910939393.5A 2019-09-30 2019-09-30 Information processing method and electronic equipment Active CN110750202B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201910939393.5A CN110750202B (en) 2019-09-30 2019-09-30 Information processing method and electronic equipment

Publications (2)

Publication Number Publication Date
CN110750202A (en) 2020-02-04
CN110750202B CN110750202B (en) 2021-12-24

Family

ID=69277505

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201910939393.5A Active CN110750202B (en) 2019-09-30 2019-09-30 Information processing method and electronic equipment

Country Status (1)

Country Link
CN (1) CN110750202B (en)

Citations (10)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2003069547A1 (en) * 2002-02-12 2003-08-21 Anoto Ab Electronic pen, and control device and method thereof
CN2814506Y (en) * 2005-07-29 2006-09-06 姚华 Contact pen capable of being used to ordinary computer display screen
CN202472604U (en) * 2012-03-15 2012-10-03 郝宏贤 Computer screen touch control identification system
CN102736748A (en) * 2011-03-30 2012-10-17 三星电子株式会社 Electronic pen, input method using electronic pen, and display device for electronic pen input
CN102880319A (en) * 2012-09-03 2013-01-16 创维光电科技(深圳)有限公司 Optical image technology-based touch device
CN103049109A (en) * 2012-12-20 2013-04-17 广州视睿电子科技有限公司 Stylus and touch point identification method
CN103076969A (en) * 2012-12-27 2013-05-01 天津三星通信技术研究有限公司 Input system for mobile terminal display screen and control method thereof
CN206649483U (en) * 2017-04-01 2017-11-17 深圳市汇顶科技股份有限公司 Stylus and touch control device
CN109343816A (en) * 2018-09-30 2019-02-15 联想(北京)有限公司 A kind of display methods, display device and electronic equipment
CN110045844A (en) * 2019-04-15 2019-07-23 南京孜博汇信息科技有限公司 Position encoded form data processing system

Also Published As

Publication number Publication date
CN110750202B (en) 2021-12-24

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant