CN111078102B - Method for determining point reading area through projection and terminal equipment - Google Patents


Info

Publication number
CN111078102B
CN111078102B (application CN201910500041.XA)
Authority
CN
China
Prior art keywords
user
area
finger
projection
page
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN201910500041.XA
Other languages
Chinese (zh)
Other versions
CN111078102A (en)
Inventor
韦肖莹
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Guangdong Genius Technology Co Ltd
Original Assignee
Guangdong Genius Technology Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Guangdong Genius Technology Co Ltd filed Critical Guangdong Genius Technology Co Ltd
Priority to CN201910500041.XA
Publication of CN111078102A
Application granted
Publication of CN111078102B

Classifications

    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06F: ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00: Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01: Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048: Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0487: Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
    • G06F3/0488: Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser, using a touch-screen or digitiser, e.g. input of commands through traced gestures
    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06F: ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00: Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01: Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048: Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0481: Interaction techniques based on graphical user interfaces [GUI] based on specific properties of the displayed interaction object or a metaphor-based environment, e.g. interaction with desktop elements like windows or icons, or assisted by a cursor's changing behaviour or appearance
    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06F: ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00: Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01: Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048: Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0484: Interaction techniques based on graphical user interfaces [GUI] for the control of specific functions or operations, e.g. selecting or manipulating an object, an image or a displayed text element, setting a parameter value or selecting a range
    • G06F3/04847: Interaction techniques to control parameter settings, e.g. interaction with sliders or dials
    • G: PHYSICS
    • G09: EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
    • G09B: EDUCATIONAL OR DEMONSTRATION APPLIANCES; APPLIANCES FOR TEACHING, OR COMMUNICATING WITH, THE BLIND, DEAF OR MUTE; MODELS; PLANETARIA; GLOBES; MAPS; DIAGRAMS
    • G09B5/00: Electrically-operated educational appliances
    • G09B5/06: Electrically-operated educational appliances with both visual and audible presentation of the material to be studied

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Human Computer Interaction (AREA)
  • Business, Economics & Management (AREA)
  • Educational Administration (AREA)
  • Educational Technology (AREA)
  • User Interface Of Digital Computer (AREA)

Abstract

Embodiments of the invention disclose a method for determining a click-to-read area through projection, and a terminal device. Applied in the technical field of terminal devices, they can solve the problem that the click-to-read area determined by a family education device in the prior art is inaccurate. The method comprises the following steps: in a click-to-read scenario, if a delineation action of the user's finger on a learning page is detected, acquiring a finger projection, where the finger projection is the projection of the user's finger on the learning page when the finger is illuminated by a light; and if the finger projection is in a first delineation area and a touch input by the user on the first delineation area is detected, taking the first delineation area as the user's click-to-read area. The method is applied in click-to-read scenarios.

Description

Method for determining point reading area through projection and terminal equipment
Technical Field
Embodiments of the invention relate to the technical field of terminal devices, and in particular to a method for determining a click-to-read area through projection, and a terminal device.
Background
At present, most family education devices on the market have a click-to-read function. When a student encounters an unfamiliar word or text while studying, the student can use a family education device with the click-to-read function to read it aloud. Typically, the family education device detects the user's click action on the unfamiliar word, recognizes the word, and reads it aloud. In actual use of such devices, it has been found that because the viewing angle of the image captured by the device differs from the user's viewing angle, the click-to-read area recognized by the device after the user performs a click-to-read operation with a finger or pen tip may not match the area the user actually intends to read, so the click-to-read area determined by the device is inaccurate.
Disclosure of Invention
Embodiments of the invention provide a method for determining a click-to-read area through projection, and a terminal device, to solve the problem that the click-to-read area determined by a family education device in the prior art is inaccurate. To solve the above technical problem, the embodiments of the present invention are implemented as follows:
In a first aspect, a method for determining a click-to-read area through projection is provided. The method includes:
in a click-to-read scenario, if a delineation action of the user's finger on a learning page is detected, acquiring a finger projection, where the finger projection is the projection of the user's finger on the learning page when the finger is illuminated by a light;
and if the finger projection is in a first delineation area and a touch input by the user on the first delineation area is detected, taking the first delineation area as the user's click-to-read area.
As an optional implementation, in the first aspect of the embodiments of the present invention, after taking the first delineation area as the user's click-to-read area, the method further includes:
if a finger-hook gesture by the user is detected, ceasing to use the first delineation area as the user's click-to-read area.
As an optional implementation, in the first aspect of the embodiments of the present invention, after acquiring the finger projection, the method further includes:
if a gesture of the user waving a finger in a first direction is detected, reacquiring the finger projection;
and if the reacquired finger projection is in a second delineation area and a touch input by the user on the second delineation area is detected, taking the second delineation area as the user's click-to-read area.
As an optional implementation, in the first aspect of the embodiments of the present invention, after taking the first delineation area as the user's click-to-read area, the method further includes:
identifying first content in the user's click-to-read area;
and displaying the first content and reading the first content aloud.
As an optional implementation, in the first aspect of the embodiments of the present invention, the learning page is a paper page, and displaying the first content includes:
searching for the electronic page corresponding to the learning page;
determining the first content in the electronic page according to the user's click-to-read area;
and displaying the electronic page, highlighting the first content in the electronic page.
In a second aspect, a terminal device is provided, which includes:
an acquisition module, configured to acquire a finger projection if a delineation action of the user's finger on a learning page is detected in a click-to-read scenario, where the finger projection is the projection of the user's finger on the learning page when the finger is illuminated by a light;
and a determining module, configured to take the first delineation area as the user's click-to-read area if the finger projection is in the first delineation area and a touch input by the user on the first delineation area is detected.
As an optional implementation, in the second aspect of the embodiments of the present invention, the terminal device further includes:
a cancelling module, configured to cease using the first delineation area as the user's click-to-read area if a finger-hook gesture by the user is detected after the determining module takes the first delineation area as the user's click-to-read area.
As an optional implementation, in the second aspect of the embodiments of the present invention,
the acquisition module is further configured to reacquire the finger projection after acquiring the finger projection if a gesture of the user waving a finger in a first direction is detected;
the determining module is further configured to take the second delineation area as the user's click-to-read area if the reacquired finger projection is in the second delineation area and a touch input by the user on the second delineation area is detected.
As an optional implementation, in the second aspect of the embodiments of the present invention, the terminal device further includes:
an identification module, configured to identify first content in the user's click-to-read area after the determining module takes the first delineation area as the user's click-to-read area;
a display module, configured to display the first content;
and a reading module, configured to read the first content aloud.
As an optional implementation, in the second aspect of the embodiments of the present invention,
the learning page is a paper page, and the display module includes: a searching submodule, configured to search for the electronic page corresponding to the learning page;
a determining submodule, configured to determine the first content in the electronic page according to the user's click-to-read area;
and a display submodule, configured to display the electronic page and highlight the first content in the electronic page.
In a third aspect, a terminal device is provided, including:
a memory storing executable program code;
a processor coupled with the memory;
the processor calls the executable program code stored in the memory to execute the method for determining a click-to-read area through projection in the first aspect of the embodiments of the present invention.
In a fourth aspect, a computer-readable storage medium is provided, which stores a computer program, where the computer program causes a computer to execute the method for determining a click-to-read area through projection in the first aspect of the embodiments of the present invention. The computer-readable storage medium includes a ROM/RAM, a magnetic disk, an optical disc, or the like.
In a fifth aspect, there is provided a computer program product for causing a computer to perform some or all of the steps of any one of the methods of the first aspect when the computer program product is run on the computer.
A sixth aspect provides an application publishing platform for publishing a computer program product, wherein the computer program product, when run on a computer, causes the computer to perform some or all of the steps of any one of the methods of the first aspect.
Compared with the prior art, the embodiments of the invention have the following beneficial effects:
In the embodiments of the invention, the terminal device can acquire the finger projection when a delineation action of the user's finger on the learning page is detected, determine the first delineation area where the finger projection is located, and take the first delineation area as the user's click-to-read area after detecting the user's touch input on that area. The user can thus select a delineation area by performing a delineation action on the learning page, which positions the finger projection, and confirm that area as the click-to-read area through a touch input on it. Because the click-to-read area is determined from both the user's delineation action and a confirming touch input, the accuracy of the determined click-to-read area is improved.
Drawings
In order to more clearly illustrate the technical solutions in the embodiments of the present invention, the drawings needed to be used in the embodiments will be briefly described below, and it is obvious that the drawings in the following description are only some embodiments of the present invention, and it is obvious for those skilled in the art that other drawings can be obtained according to these drawings without creative efforts.
Fig. 1 is a first schematic flowchart of a method for determining a click-to-read region through projection according to an embodiment of the present invention;
fig. 2 is a schematic flowchart of a method for determining a click-to-read region through projection according to an embodiment of the present invention;
fig. 3 is a schematic flowchart illustrating a third method for determining a click-to-read region through projection according to an embodiment of the present invention;
fig. 4 is a schematic flowchart of a fourth method for determining a click-to-read region through projection according to an embodiment of the present invention;
fig. 5 is a first schematic structural diagram of a terminal device according to an embodiment of the present invention;
fig. 6 is a schematic structural diagram of a terminal device according to an embodiment of the present invention;
fig. 7 is a schematic structural diagram three of a terminal device according to an embodiment of the present invention;
fig. 8 is a schematic structural diagram of a terminal device according to an embodiment of the present invention.
Detailed Description
The technical solutions in the embodiments of the present invention will be clearly and completely described below with reference to the drawings in the embodiments of the present invention, and it is obvious that the described embodiments are some, not all, embodiments of the present invention. All other embodiments, which can be derived by a person skilled in the art from the embodiments given herein without making any creative effort, shall fall within the protection scope of the present invention.
The terms "first" and "second," and the like, in the description and in the claims of the present invention are used for distinguishing between different objects and not for describing a particular order of the objects. For example, the first and second delineating regions, etc. are used to distinguish between different delineating regions, rather than to describe a particular order of delineating regions.
The terms "comprises," "comprising," and any other variation thereof, are intended to cover a non-exclusive inclusion, such that a process, method, system, article, or apparatus that comprises a list of steps or elements is not necessarily limited to those steps or elements expressly listed, but may include other steps or elements not expressly listed or inherent to such process, method, article, or apparatus.
It should be noted that, in the embodiments of the present invention, words such as "exemplary" or "for example" are used to indicate examples, illustrations or explanations. Any embodiment or design described as "exemplary" or "for example" in the embodiments of the present invention should not be construed as preferred or advantageous over other embodiments or designs. Rather, use of the words "exemplary" and "for example" is intended to present related concepts in a concrete fashion.
Embodiments of the invention provide a method for determining a click-to-read area through projection, and a terminal device, which can improve the accuracy with which the click-to-read area is located.
The terminal device in the embodiments of the present invention may be an electronic device with a click-to-read function, such as a mobile phone, a click-to-read device, a teaching machine, a tablet computer, a notebook computer, a palmtop computer, a vehicle-mounted terminal device, a wearable device, an ultra-mobile personal computer (UMPC), a netbook, or a personal digital assistant (PDA).
The execution subject of the method for determining a click-to-read area through projection provided in the embodiments of the present invention may be the terminal device described above, or a functional module and/or functional entity in the terminal device capable of implementing the method; this may be determined according to actual use requirements and is not limited by the embodiments of the present invention. The following takes a terminal device as an example to describe the method for determining a click-to-read area through projection provided by the embodiments of the present invention.
Example one
As shown in fig. 1, an embodiment of the present invention provides a method for determining a click-to-read area through projection, where the method may include the following steps:
101. In a click-to-read scenario, if a delineation action of the user's finger on the learning page is detected, acquire the finger projection.
The finger projection is the projection of the user's finger on the learning page when the finger is illuminated by a light.
In the embodiments of the present invention, the click-to-read mode of the terminal device may be divided into two types: one is click-to-read of an electronic page displayed on the display screen of the terminal device, and the other is click-to-read of a paper page.
In the embodiments of the present invention, when the learning page is a paper page, as an optional implementation, the method for detecting the delineation action of the user's finger on the learning page may be: the terminal device captures an image including the learning page and obtains the delineation action of the user's finger from the image through image analysis.
In the embodiments of the present invention, when the learning page is an electronic page, as an optional implementation, the method for detecting the delineation action of the user's finger on the learning page may be: the terminal device detects the touched area of the display screen through a sensor to obtain the delineation action of the user's finger on the learning page.
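Whichever capture path is used, the tracked fingertip trajectory still has to be classified as a delineation (circling) action. A minimal geometric heuristic for that decision, assuming the trajectory arrives as a list of (x, y) points, might look like the sketch below; the thresholds are illustrative and not values fixed by the method.

```python
from math import dist

def is_delineation(points, close_ratio=0.2, min_points=8):
    """Heuristically decide whether a fingertip trajectory is a circling
    (delineation) action: the path must have enough samples, and its two
    endpoints must nearly meet relative to the total path length."""
    if len(points) < min_points:
        return False
    path_len = sum(dist(a, b) for a, b in zip(points, points[1:]))
    if path_len == 0:
        return False
    # A closed-ish loop ends near where it started.
    return dist(points[0], points[-1]) / path_len <= close_ratio
```

A roughly square loop that ends near its start is accepted, while a straight stroke is rejected, which matches the intuition that delineation encircles a region rather than underlining it.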
As an optional implementation, before performing step 101, the terminal device may also detect its current battery level and output first prompt information when the current level is below a preset level, where the first prompt information prompts the user to connect a power supply.
With this optional implementation, the user is prompted to charge when the terminal device's battery level is below the preset level, which prevents use of the device from being interrupted by insufficient power and improves the user experience.
As an optional implementation, before performing step 101, the terminal device obtains the vertical distance between the user's eyes and the learning page; if the vertical distance is smaller than a distance threshold, the terminal device outputs second prompt information. The second prompt information prompts the user to adjust the distance between the eyes and the learning page.
Optionally, the terminal device may obtain the vertical distance between the user's eyes and the learning page from the first image, or may capture a second image including the learning page and the user's eyes and obtain the vertical distance from the second image.
Optionally, the distance threshold may be determined according to actual conditions and is not limited by the embodiments of the present invention.
With this optional implementation, the terminal device can prompt the user to adjust the distance between the eyes and the learning page when the vertical distance between them is smaller than the distance threshold, thereby protecting the user's eyesight.
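The eyesight check reduces to a single threshold comparison once the vertical distance has been measured. A sketch, assuming the distance arrives in centimetres and using 30 cm as an illustrative threshold (the patent leaves the value open):

```python
def check_viewing_distance(vertical_distance_cm, threshold_cm=30.0):
    """Return the second prompt information when the user's eyes are
    closer to the learning page than the threshold, else None."""
    if vertical_distance_cm < threshold_cm:
        return "Please adjust the distance between your eyes and the page."
    return None
```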
102. If the finger projection is in the first delineation area and the terminal device detects a touch input by the user on the first delineation area, the first delineation area is taken as the user's click-to-read area.
Optionally, the touch input may be a click input, a long-press input, or a hard-press input.
The click input may be a single-click or double-click input; the long-press input may be an input whose duration is longer than a preset duration; the hard-press input may be an input whose force is greater than a preset force.
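The three touch-input types can be separated by two thresholds, one on duration and one on force. A sketch of that classification, with illustrative threshold values (the patent only says "preset"):

```python
def classify_touch(duration_s, force, taps=1,
                   long_press_s=0.5, hard_force=0.7):
    """Classify a touch input into the categories described above.
    `force` is a normalized pressure reading in [0, 1]."""
    if force > hard_force:
        return "hard-press"       # force greater than the preset force
    if duration_s > long_press_s:
        return "long-press"       # held longer than the preset duration
    return "double-click" if taps >= 2 else "click"
```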
In the embodiments of the invention, a page, whether paper or electronic, may include one or more delineation areas.
The first delineation area may be any delineation area in the learning page.
According to the method for determining a click-to-read area through projection provided by the embodiments of the invention, the terminal device can acquire the finger projection when a delineation action of the user's finger on the learning page is detected, determine the first delineation area where the finger projection is located, and take the first delineation area as the user's click-to-read area after detecting the user's touch input on that area. The user can thus select a delineation area by performing a delineation action on the learning page, which positions the finger projection, and confirm that area as the click-to-read area through a touch input on it. Because the click-to-read area is determined from both the user's delineation action and a confirming touch input, the accuracy of the determined click-to-read area is improved.
Example two
As shown in fig. 2, an embodiment of the present invention provides a method for determining a click-to-read area through projection, where the method includes the following steps:
201. In a click-to-read scenario, if a delineation action of the user's finger on the learning page is detected, acquire the finger projection.
202. If the finger projection is in the first delineation area and the terminal device detects a touch input by the user on the first delineation area, the first delineation area is taken as the user's click-to-read area.
For the above description of 201 and 202, reference may be specifically made to the description of 101 and 102 in the first embodiment, and details are not repeated here.
203. If the terminal device detects a finger-hook gesture by the user, it ceases to use the first delineation area as the user's click-to-read area.
After the first delineation area is taken as the user's click-to-read area, the terminal device can capture an image, obtain the user's gesture from the image, and cease using the first delineation area as the user's click-to-read area when the gesture in the image is detected to be a finger-hook gesture.
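The confirm-then-cancel behaviour of steps 202 and 203 amounts to a small piece of selection state. A sketch, where the gesture labels are hypothetical names for the recognizer's output rather than anything the patent specifies:

```python
class ClickToReadSelection:
    """Holds the currently confirmed click-to-read area; a finger-hook
    gesture cancels it, and any other gesture leaves it unchanged."""

    def __init__(self):
        self.read_area = None

    def confirm(self, area):
        # Step 202: the delineation area becomes the click-to-read area.
        self.read_area = area

    def on_gesture(self, gesture):
        # Step 203: a finger-hook gesture cancels the current selection.
        if gesture == "finger_hook":
            self.read_area = None
```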
With the method for determining a click-to-read area through projection provided by this embodiment, after the user's click-to-read area is determined, the user can trigger the terminal device to cancel the first delineation area as the click-to-read area through a finger-hook gesture, making the method more flexible.
Example three
As shown in fig. 3, an embodiment of the present invention provides a method for determining a click-to-read area through projection, where the method includes the following steps:
301. In a click-to-read scenario, if a delineation action of the user's finger on the learning page is detected, acquire the finger projection.
For the description of 301, reference may be specifically made to the description of 101 in the first embodiment, and details are not repeated here.
302. If the terminal device detects a gesture of the user waving a finger in a first direction, the finger projection is reacquired.
After acquiring the finger projection, the terminal device may capture an image, obtain the user's gesture from the image, and reacquire the finger projection when it detects in the image a gesture of the user waving a finger in the first direction.
In the embodiments of the present invention, the first direction may be any direction.
303. If the reacquired finger projection is in a second delineation area and a touch input by the user on the second delineation area is detected, the terminal device takes the second delineation area as the user's click-to-read area.
With the method for determining a click-to-read area through projection provided by this embodiment, after the terminal device acquires the finger projection and before the delineation area where the projection is located is confirmed as the click-to-read area, the user can trigger the terminal device to reacquire the finger projection by waving a finger in a certain direction, and the area where the reacquired projection is located becomes the click-to-read area once it receives the user's touch input. The user can therefore select the click-to-read area flexibly.
Example four
As shown in fig. 4, an embodiment of the present invention provides a method for determining a click-to-read area through projection, where the method includes the following steps:
401. In a click-to-read scenario, if a delineation action of the user's finger on the learning page is detected, acquire the finger projection.
402. If the finger projection is in the first delineation area and the terminal device detects a touch input by the user on the first delineation area, the first delineation area is taken as the user's click-to-read area.
For the above description of 401 and 402, reference may be specifically made to the description of 101 and 102 in the first embodiment, and details are not repeated here.
403. The terminal device identifies the first content in the user's click-to-read area.
404. The terminal device displays the first content.
405. The terminal device reads the first content aloud.
In this embodiment, the terminal device can identify the first content in the actual click-to-read area, display it, and read it aloud, so that the user can not only hear the first content but also see it on the terminal device, improving the user experience of click-to-read.
Optionally, step 404 in fig. 4 may specifically be implemented by the following steps:
404a. The terminal device searches for the electronic page corresponding to the learning page.
404b. The terminal device determines the first content in the electronic page according to the user's click-to-read area.
404c. The terminal device displays the electronic page and highlights the first content in the electronic page.
With this optional implementation, the terminal device can find the electronic page corresponding to the learning page, display that electronic page, and highlight the first content in it, so that while the first content is read aloud the user can see it clearly and intuitively, together with the electronic page it belongs to, improving the user experience.
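Steps 404a-404c can be sketched as two table lookups followed by a render step. In the sketch below, `page_index` (paper page to electronic-page text) and `area_spans` (delineation area to character span) are hypothetical stand-ins for the device's internal data, and brackets stand in for visual highlighting:

```python
def display_first_content(page_id, read_area, page_index, area_spans):
    """Sketch of steps 404a-404c: look up the electronic page, locate the
    first content from the click-to-read area, and render it highlighted."""
    text = page_index[page_id]            # 404a: find the electronic page
    start, end = area_spans[read_area]    # 404b: locate the first content
    first_content = text[start:end]
    # 404c: display the page with the first content marked.
    rendered = text[:start] + "[" + first_content + "]" + text[end:]
    return first_content, rendered
```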
As an optional implementation manner, after the step 404a, the terminal device may further receive a click-to-read input of the second content on the electronic page by the user, and read the second content in response to the click-to-read input. The second content may be content on the electronic page other than the first content.
Through the optional implementation mode, after the user reads on the paper page once, the terminal device can display the corresponding electronic page, and the user can continue to read on the electronic page when reading next time, so that switching between the paper page reading and the electronic page reading is realized, and the reading mode is more flexible.
As an optional implementation manner, in 404c above, in addition to displaying the electronic page corresponding to the learning page, the terminal device may also display other pages associated with the electronic page in a form of thumbnails (for example, a previous page of the electronic page and a subsequent page of the electronic page), so that when the user needs to view the content of other pages (denoted as target electronic pages) associated with the electronic page, the terminal device may be triggered to display the target electronic page (i.e., switch the currently displayed electronic page to the target electronic page) by clicking the thumbnail of the target electronic page.
With this optional implementation, while displaying the electronic page corresponding to the learning page, the terminal device also displays other pages associated with it, making it convenient for the user to view the associated pages and improving the human-computer interaction experience of the terminal device.
As an optional implementation, the reading of the first content by the terminal device in step 405 above may include the following steps:
405a, the terminal device detects the ambient noise level of the area where it is located.
The terminal device may capture an audio signal through a built-in microphone and detect the ambient noise level of its surroundings from that signal.
405b, when the ambient noise level is greater than a preset noise level, the terminal device outputs prompt information instructing the user to wear earphones.
405c, when the terminal device detects that it is successfully connected to the earphones, it performs the reading operation on the first content at a preset volume.
With this optional implementation, the terminal device detects the noise level of its environment and prompts the user to wear earphones when it is noisy, thereby ensuring the reading-aloud effect.
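The decision logic of steps 405a–405c can be sketched as a small function. The numeric noise levels and the returned action strings are illustrative assumptions; the patent does not specify concrete values.

```python
def reading_action(ambient_noise, preset_noise, headphones_connected):
    """Decide how the terminal device reads the first content (steps 405a-405c)."""
    if ambient_noise <= preset_noise:
        return "read_aloud"                  # quiet enough: read directly
    if not headphones_connected:
        return "prompt_wear_headphones"      # 405b: prompt the user to wear earphones
    return "read_at_preset_volume"           # 405c: earphones connected, use preset volume

print(reading_action(70, 50, False))  # prompt_wear_headphones
print(reading_action(70, 50, True))   # read_at_preset_volume
```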
As an optional implementation, the manner in which the terminal device detects the ambient noise level of the area where it is located may be as follows:
the terminal device detects whether it is currently connected to a wireless access point. If so, it checks whether the identification information of the currently connected access point matches the identification information, recorded in advance by the terminal device, of an access point on a school bus.
If they match, the terminal device may determine that it is currently on the school bus. Accordingly, the terminal device obtains the identity information of the service device deployed on the school bus and, using that identity information, sends a request message to the service device through the wireless access point. The request message carries the identity information of the terminal device and a request field, the request field requesting the service device of the school bus to detect the ambient noise level inside the bus.
The terminal device then obtains the in-bus ambient noise level sent by the service device of the school bus in response to the request message and uses it as the ambient noise level of the area where the terminal device is located.
This implementation avoids the power consumption incurred when the terminal device activates its own sensor to detect the ambient noise level of its area, and reduces the heat generated by the terminal device as a result of increased power consumption.
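The access-point matching and delegation flow above can be sketched as follows. The message fields, the identifier formats, and the `send_request` transport callable are assumptions for the example, not details given in the patent.

```python
def noise_from_school_bus(current_ap, known_bus_aps, terminal_id, send_request):
    """Return the in-bus ambient noise level, or None if not on a known school bus.

    known_bus_aps: mapping of bus access-point ids -> service-device identity.
    send_request:  callable (service_id, message) -> noise level reply.
    """
    if current_ap not in known_bus_aps:
        return None                                # AP does not match a recorded bus
    service_id = known_bus_aps[current_ap]         # identity of the bus service device
    request = {"terminal_id": terminal_id, "request": "ambient_noise"}
    return send_request(service_id, request)       # reply is used as the local noise level

# Fake transport: the bus service device reports a level of 62.
reply = noise_from_school_bus("ap-bus-7", {"ap-bus-7": "svc-7"}, "t-1",
                              lambda sid, msg: 62)
print(reply)  # 62
```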
Furthermore, after receiving the request message sent by the terminal device to the service device of the school bus via the wireless access point, the service device of the school bus may further perform the following operations:
the service device of the school bus may identify the user attributes of the terminal device according to the identity information of the terminal device carried in the request message. The user attributes may include the user name of the user (e.g., a student) to whom the terminal device belongs and the curriculum schedule corresponding to the user's grade, and the schedule may include the class time (date and time) and class location of each subject;
the service device of the school bus determines from the schedule the class location of a target subject, the target subject being the one whose class time is closest to the current system time of the service device;
and when detecting that the school bus has reached the drop-off station corresponding to the class location of the target subject, the service device of the school bus sends an arrival notification message to the terminal device through the wireless access point, the notification message including the class time and class location of the target subject.
Implementing this embodiment prevents the user from missing the class location while riding the school bus and listening to read-aloud content.
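The schedule lookup performed by the service device can be sketched briefly. Times are plain numbers (e.g., minutes since midnight) to keep the illustration self-contained; the schedule entry format is an assumption.

```python
def target_subject(schedule, now):
    """Return (location, time) of the class whose start time is closest to `now`."""
    target = min(schedule, key=lambda entry: abs(entry["time"] - now))
    return target["location"], target["time"]

schedule = [
    {"subject": "math",    "time": 540, "location": "Room 101"},  # 09:00
    {"subject": "english", "time": 660, "location": "Room 203"},  # 11:00
]
print(target_subject(schedule, 555))  # ('Room 101', 540)
```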
As shown in fig. 5, an embodiment of the present invention provides a terminal device, where the terminal device includes:
an obtaining module 501, configured to obtain a finger projection in a click-to-read scene if a pointing action of a user's finger on a learning page is detected, where the finger projection is the projection of the user's finger on the learning page when the finger is illuminated by light;
a determining module 502, configured to take a first delineation area as a user click-to-read area if the finger projection is in the first delineation area and a touch input of the user on the first delineation area is detected.
In the embodiment of the invention, the terminal device can obtain the finger projection when a pointing action of the user's finger on the learning page is detected, determine the first delineation area where the finger projection lies, and take that area as the user click-to-read area after detecting a touch input of the user on it. The user can therefore select a delineation area by positioning the finger projection over it and confirm it as the click-to-read area through a touch input, so that the click-to-read area is determined from both the user's delineation action and touch input, improving the accuracy of the determined area.
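The core determination can be sketched as a point-in-rectangle test followed by touch confirmation. The rectangular bounds, area ids, and the touched-id set are illustrative assumptions; a real device would derive regions from the camera image.

```python
def determine_click_to_read_area(projection, delineation_areas, touched_ids):
    """Return the id of the confirmed user click-to-read area, or None."""
    px, py = projection
    for area in delineation_areas:
        x0, y0, x1, y1 = area["bounds"]
        if x0 <= px <= x1 and y0 <= py <= y1:   # finger projection lies in this area
            if area["id"] in touched_ids:       # touch input confirms the area
                return area["id"]
    return None                                  # no area selected and confirmed

areas = [{"id": "A", "bounds": (0, 0, 50, 20)},
         {"id": "B", "bounds": (0, 30, 50, 50)}]
print(determine_click_to_read_area((10, 35), areas, {"B"}))  # B
print(determine_click_to_read_area((10, 35), areas, set()))  # None
```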
As an optional implementation, the terminal device may further include the following modules, not shown in the drawings:
a power detection module, configured to detect the current battery level of the terminal device before the obtaining module 501 obtains the finger projection;
and an information output module, configured to output first prompt information when the current battery level is detected to be lower than a preset level, the first prompt information being used to prompt the user to connect a power supply.
With this optional implementation, the user is prompted to charge when the current battery level of the terminal device is lower than the preset level, preventing use of the terminal device from being interrupted by insufficient power during use and improving the user experience.
As an optional implementation, the terminal device may further include the following modules, not shown in the drawings:
a distance obtaining module, configured to obtain the vertical distance between the user's eyes and the learning page.
The information output module is further configured to output second prompt information if the vertical distance is smaller than a distance threshold, the second prompt information being used to prompt the user to adjust the distance between the eyes and the learning page.
With this optional implementation, the terminal device prompts the user to adjust the distance between the eyes and the learning page when the vertical distance between them is smaller than the distance threshold, which helps protect the user's eyesight.
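The eye-distance check reduces to a simple threshold test. The 30 cm default is an assumed example value, not a figure taken from the patent.

```python
def distance_prompt(eye_to_page_cm, threshold_cm=30.0):
    """Return the second prompt when the user's eyes are too close to the page."""
    if eye_to_page_cm < threshold_cm:
        return "Please adjust the distance between your eyes and the page."
    return None  # distance acceptable, no prompt

print(distance_prompt(22))  # prompt string
print(distance_prompt(40))  # None
```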
Optionally, with reference to fig. 5, as shown in fig. 6, the terminal device further includes:
a canceling module 503, configured to, after the determining module takes the first delineation area as the user click-to-read area, cancel use of the first delineation area as the user click-to-read area if a finger-hooking gesture of the user is detected.
In the terminal device shown in fig. 6, after the user click-to-read area is determined, the user can trigger the terminal device to cancel taking the first delineation area as the click-to-read area through a finger-hooking gesture, making the method for determining a click-to-read area through projection provided by the embodiment of the present invention more flexible.
Optionally, the obtaining module 501 is further configured to, after obtaining the finger projection, obtain the finger projection again if a gesture of the user waving the finger in a first direction is detected.
The determining module 502 is further configured to take a second delineation area as the user click-to-read area if the reacquired finger projection is in the second delineation area and a touch input of the user on the second delineation area is detected.
After the terminal device obtains the finger projection, and before the delineation area where the projection lies is confirmed as the user click-to-read area, the user can wave a finger in a certain direction to trigger the terminal device to obtain the finger projection again; the newly indicated delineation area is taken as the click-to-read area once a touch input of the user on it is detected. The user can thus flexibly select the click-to-read area.
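The interplay of the wave, touch, and hook gestures can be sketched as a small state holder: a wave drops the current candidate area so the projection is reacquired, a touch confirms the candidate, and a hook cancels a confirmed area. The gesture names and class shape are assumptions for illustration.

```python
class AreaSelector:
    def __init__(self):
        self.candidate = None   # area the finger projection currently falls in
        self.selected = None    # confirmed user click-to-read area

    def on_projection(self, area_id):
        self.candidate = area_id                # projection acquired over an area

    def on_touch(self):
        self.selected = self.candidate          # touch input confirms the candidate

    def on_gesture(self, gesture):
        if gesture == "wave":                   # reacquire the finger projection
            self.candidate = None
        elif gesture == "hook":                 # cancel the confirmed area
            self.selected = None

s = AreaSelector()
s.on_projection("first"); s.on_gesture("wave")   # wave: candidate dropped
s.on_projection("second"); s.on_touch()          # second area confirmed
print(s.selected)  # second
s.on_gesture("hook")
print(s.selected)  # None
```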
Optionally, with reference to fig. 6, as shown in fig. 7, the terminal device further includes:
an identifying module 504, configured to identify first content in the user click-to-read area after the determining module takes the first delineation area as the user click-to-read area;
a display module 505, configured to display the first content;
and a reading module 506, configured to read the first content aloud.
In the embodiment shown in fig. 7, the terminal device can identify the first content in the user click-to-read area, display it, and read it aloud, so that the user can not only hear the first content but also see it on the terminal device, improving the user's click-to-read experience.
Optionally, the learning page is a paper page, and the display module 505 in fig. 7 includes:
the searching submodule 5051 is used for searching an electronic page corresponding to the learning page;
a determination sub-module 5052, configured to determine, according to the user click-to-read region, first content in the electronic page;
the display sub-module 5053 is configured to display the electronic page and highlight the first content in the electronic page.
As an alternative implementation, the terminal device may further include the following modules not shown in the existing drawings:
a receiving module, configured to receive a click-to-read input of the user on second content of the electronic page after the display sub-module 5053 displays the electronic page and highlights the first content in it.
The reading module 506 is further configured to read the second content aloud in response to the click-to-read input.
With this optional implementation, after the user performs one click-to-read operation on the paper page, the terminal device displays the corresponding electronic page, and the next click-to-read operation can be performed directly on the electronic page. This enables switching between paper-page and electronic-page reading and makes the reading mode more flexible.
As an optional implementation, the display sub-module 5053 is further configured to display, in addition to the electronic page corresponding to the learning page, other pages associated with the electronic page as thumbnails (for example, the previous page and the next page of the electronic page). When the user needs to view the content of another associated page (denoted the target electronic page), the user can click the thumbnail of the target electronic page to trigger the terminal device to display it (i.e., switch the currently displayed electronic page to the target electronic page).
With this optional implementation, while displaying the electronic page corresponding to the learning page, the terminal device also displays other pages associated with it, making it convenient for the user to view the associated pages and improving the human-computer interaction experience of the terminal device.
As an optional implementation, the reading module 506 may include the following sub-modules, not shown in the drawings:
a detection sub-module, configured to detect the ambient noise level of the area where the terminal device is located;
an output sub-module, configured to output prompt information when the ambient noise level is greater than a preset noise level, the prompt information being used to prompt the user to wear earphones;
and a reading sub-module, configured to perform the reading operation on the first content at a preset volume when it is detected that the terminal device is successfully connected to the earphones.
With this optional implementation, the noise level of the environment where the terminal device is located is detected and the user is prompted to wear earphones when it is noisy, thereby ensuring the reading-aloud effect.
As an optional implementation, the detection sub-module may further include the following units, not shown in the drawings:
a wireless detection unit, configured to detect whether the terminal device is currently connected to a wireless access point;
an identification unit, configured to, if the terminal device is currently connected to a wireless access point, identify whether the identification information of that access point matches the identification information, recorded in advance by the terminal device, of an access point on a school bus; if they match, the terminal device may determine that it is currently on the school bus;
an identity obtaining unit, configured to obtain the identity information of the service device deployed on the school bus;
a message sending unit, configured to send a request message to the service device of the school bus through the wireless access point according to the identity information of the service device, where the request message carries the identity information of the terminal device and a request field, the request field requesting the service device to detect the ambient noise level inside the school bus;
and a message receiving unit, configured to obtain the in-bus ambient noise level sent by the service device of the school bus in response to the request message and use it as the ambient noise level of the area where the terminal device is located.
This optional implementation avoids the power consumption incurred when the terminal device activates its own sensor to detect the ambient noise level of its area, and reduces the heat generated by the terminal device as a result of increased power consumption.
Furthermore, after receiving the request message sent by the terminal device to the service device of the school bus via the wireless access point, the service device of the school bus may further perform the following operations:
the service device of the school bus may identify the user attributes of the terminal device according to the identity information of the terminal device carried in the request message. The user attributes may include the user name of the user (e.g., a student) to whom the terminal device belongs and the curriculum schedule corresponding to the user's grade, and the schedule may include the class time (date and time) and class location of each subject;
the service device of the school bus determines from the schedule the class location of a target subject, the target subject being the one whose class time is closest to the current system time of the service device;
and when detecting that the school bus has reached the drop-off station corresponding to the class location of the target subject, the service device of the school bus sends an arrival notification message to the terminal device through the wireless access point, the notification message including the class time and class location of the target subject.
Implementing this embodiment prevents the user from missing the class location while riding the school bus and listening to read-aloud content.
As shown in fig. 8, an embodiment of the present invention further provides a terminal device, where the terminal device may include:
a memory 601 in which executable program code is stored;
a processor 602 coupled to a memory 601.
The processor 602 calls the executable program code stored in the memory 601 to execute the method for determining a click-to-read area through projection performed by the terminal device in the above method embodiments.
It should be noted that the terminal device shown in fig. 8 may further include components not shown, such as a battery, input keys, a speaker, a microphone, a screen, an RF circuit, a Wi-Fi module, a Bluetooth module, and sensors, which are not described in detail in this embodiment.
Embodiments of the present invention provide a computer-readable storage medium storing a computer program, wherein the computer program causes a computer to execute some or all of the steps of the method as in the above method embodiments.
Embodiments of the present invention also provide a computer program product, wherein the computer program product, when run on a computer, causes the computer to perform some or all of the steps of the method as in the above method embodiments.
Embodiments of the present invention further provide an application publishing platform, where the application publishing platform is configured to publish a computer program product, where the computer program product, when running on a computer, causes the computer to perform some or all of the steps of the method in the above method embodiments.
It should be appreciated that reference throughout this specification to "one embodiment" or "an embodiment" means that a particular feature, structure or characteristic described in connection with the embodiment is included in at least one embodiment of the present invention. Thus, the appearances of the phrases "in one embodiment" or "in an embodiment" in various places throughout this specification are not necessarily all referring to the same embodiment. Furthermore, the particular features, structures, or characteristics may be combined in any suitable manner in one or more embodiments. Those skilled in the art should also appreciate that the embodiments described in this specification are exemplary and alternative embodiments, and that the acts and modules illustrated are not required in order to practice the invention.
The terminal device provided by the embodiment of the present invention can implement each process shown in the above method embodiments, and is not described herein again to avoid repetition.
In various embodiments of the present invention, it should be understood that the sequence numbers of the above-mentioned processes do not imply an inevitable order of execution, and the execution order of the processes should be determined by their functions and inherent logic, and should not constitute any limitation on the implementation process of the embodiments of the present invention.
The units described as separate parts may or may not be physically separate, and parts displayed as units may or may not be physical units, may be located in one place, or may be distributed on a plurality of network units. Some or all of the units can be selected according to actual needs to achieve the purpose of the solution of the embodiment.
In addition, functional units in the embodiments of the present invention may be integrated into one processing unit, or each unit may exist alone physically, or two or more units are integrated into one unit. The integrated unit can be realized in a form of hardware, and can also be realized in a form of a software functional unit.
The integrated units, if implemented as software functional units and sold or used as stand-alone products, may be stored in a computer-accessible memory. Based on this understanding, the technical solution of the present invention, in essence the part that contributes over the prior art, or all or part of the technical solution, may be embodied in the form of a software product stored in a memory, including several instructions for causing a computer device (which may be a personal computer, a server, or a network device, and specifically a processor in the computer device) to execute all or part of the steps of the methods of the embodiments of the present invention.
It will be understood by those skilled in the art that all or part of the steps in the methods of the above embodiments may be implemented by a program instructing related hardware, and the program may be stored in a computer-readable storage medium. The storage medium includes Read-Only Memory (ROM), Random Access Memory (RAM), Programmable Read-Only Memory (PROM), Erasable Programmable Read-Only Memory (EPROM), One-Time Programmable Read-Only Memory (OTPROM), Electrically Erasable Programmable Read-Only Memory (EEPROM), Compact Disc Read-Only Memory (CD-ROM) or other optical disc storage, magnetic disk storage, tape storage, or any other computer-readable medium that can be used to carry or store data.

Claims (8)

1. A method for determining a click-to-read area by projection, the method comprising:
in a click-to-read scene, if a pointing action of a user's finger on a learning page is detected, acquiring a finger projection, wherein the finger projection is the projection of the user's finger on the learning page when the finger is illuminated by light;
if the finger projection is in a first delineation area and a touch input of the user on the first delineation area is detected, taking the first delineation area as a user click-to-read area;
wherein after taking the first delineation area as the user click-to-read area, the method further comprises:
if a finger-hooking gesture of the user is detected, canceling use of the first delineation area as the user click-to-read area.
2. The method of claim 1, wherein after the acquiring the finger projection, the method further comprises:
if a gesture of the user waving the finger in a first direction is detected, acquiring the finger projection again;
and if the reacquired finger projection is in a second delineation area and a touch input of the user on the second delineation area is detected, taking the second delineation area as the user click-to-read area.
3. The method of claim 1, wherein after the first delineation area is taken as the user click-to-read area, the method further comprises:
identifying first content in the user click-to-read area;
and displaying the first content and reading the first content aloud.
4. The method of claim 3, wherein the learning page is a paper page, and wherein displaying the first content comprises:
searching an electronic page corresponding to the learning page;
determining the first content in the electronic page according to the user click-to-read area;
and displaying the electronic page, and highlighting the first content in the electronic page.
5. A terminal device, comprising:
an acquisition module, configured to acquire a finger projection in a click-to-read scene if a pointing action of a user's finger on a learning page is detected, wherein the finger projection is the projection of the user's finger on the learning page when the finger is illuminated by light;
a determining module, configured to take a first delineation area as a user click-to-read area if the finger projection is in the first delineation area and a touch input of the user on the first delineation area is detected;
wherein the terminal device further comprises:
a canceling module, configured to, after the determining module takes the first delineation area as the user click-to-read area, cancel use of the first delineation area as the user click-to-read area if a finger-hooking gesture of the user is detected.
6. The terminal device of claim 5,
the acquisition module is further configured to acquire the finger projection again after acquiring the finger projection if a gesture of the user waving the finger in a first direction is detected;
and the determining module is further configured to take a second delineation area as the user click-to-read area if the reacquired finger projection is in the second delineation area and a touch input of the user on the second delineation area is detected.
7. The terminal device according to claim 5, wherein the terminal device further comprises:
the identification module is used for identifying first content in the user click-to-read area after the determination module takes the first delineation area as the user click-to-read area;
a display module for displaying the first content;
and the reading module is used for reading the first content.
8. The terminal device of claim 7, wherein the learning page is a paper page, and the display module comprises:
a searching sub-module, configured to search for the electronic page corresponding to the learning page;
the determining submodule is used for determining the first content in the electronic page according to the user click-to-read area;
and the display sub-module is used for displaying the electronic page and highlighting the first content in the electronic page.
CN201910500041.XA 2019-06-09 2019-06-09 Method for determining point reading area through projection and terminal equipment Active CN111078102B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201910500041.XA CN111078102B (en) 2019-06-09 2019-06-09 Method for determining point reading area through projection and terminal equipment

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201910500041.XA CN111078102B (en) 2019-06-09 2019-06-09 Method for determining point reading area through projection and terminal equipment

Publications (2)

Publication Number Publication Date
CN111078102A CN111078102A (en) 2020-04-28
CN111078102B true CN111078102B (en) 2021-07-23

Family

ID=70310340

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201910500041.XA Active CN111078102B (en) 2019-06-09 2019-06-09 Method for determining point reading area through projection and terminal equipment

Country Status (1)

Country Link
CN (1) CN111078102B (en)

Families Citing this family (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN112699870A (en) * 2020-12-22 2021-04-23 深圳Tcl数字技术有限公司 Point reading identification method, terminal, system and computer readable storage medium
CN112785884A (en) * 2021-01-27 2021-05-11 吕瑞 Intelligent auxiliary learning system and method and learning table
CN112785883A (en) * 2021-01-27 2021-05-11 吕瑞 Point reading position marking method and system based on image recognition and intelligent learning table

Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN103324271A (en) * 2012-03-19 2013-09-25 联想(北京)有限公司 Input method and electronic equipment based on gesture
CN106933350A (en) * 2017-02-09 2017-07-07 深圳市创想天空科技股份有限公司 AR exchange methods and device
CN109032360A (en) * 2018-08-30 2018-12-18 广东小天才科技有限公司 A kind of method for controlling projection and intelligent desk lamp of intelligent desk lamp
CN109656465A (en) * 2019-02-26 2019-04-19 广东小天才科技有限公司 A kind of content acquisition method and private tutor's equipment applied to private tutor's equipment


Also Published As

Publication number Publication date
CN111078102A (en) 2020-04-28

Similar Documents

Publication Publication Date Title
CN111078102B (en) Method for determining point reading area through projection and terminal equipment
JP6051338B2 (en) Page rollback control method, page rollback control device, terminal, program, and recording medium
CN107256509B (en) Price comparison method and device, terminal, server and storage medium
US10303327B2 (en) Information display method and device
US20140317541A1 (en) Electronic device having touch screen and method for zooming in
CN108932102B (en) Data processing method and device and mobile terminal
CN107766548B (en) Information display method and device, mobile terminal and readable storage medium
CN110597450A (en) False touch prevention identification method and device, touch reading equipment and touch reading identification method thereof
KR20170023746A (en) Method and apparatus of displaying ticket information
CN111078829B (en) Click-to-read control method and system
CN111077996B (en) Information recommendation method and learning device based on click-to-read
CN108881979B (en) Information processing method and device, mobile terminal and storage medium
CN109656444B (en) List positioning method, device, equipment and storage medium
CN108958634A (en) Express delivery information acquisition method, device, mobile terminal and storage medium
CN109359582B (en) Information searching method, information searching device and mobile terminal
US11073905B2 (en) Work assistance system, work assistance method, and computer-readable recording medium
CN103970841A (en) Label management method and device
CN111984180B (en) Terminal screen reading method, device, equipment and computer readable storage medium
CN113407828A (en) Searching method, searching device and searching device
CN108664205A (en) Method for information display, device, mobile terminal and storage medium
CN108803972B (en) Information display method, device, mobile terminal and storage medium
KR102192157B1 (en) Apparatas and method for offering a information about search location in an electronic device
CN106095128B (en) Character input method of mobile terminal and mobile terminal
US20190026380A1 (en) Method and apparatus for processing bookmark and terminal device
CN107203325B (en) Searching method and device, computer device and readable storage medium

Legal Events

Date Code Title Description
PB01 Publication
PB01 Publication
SE01 Entry into force of request for substantive examination
SE01 Entry into force of request for substantive examination
GR01 Patent grant
GR01 Patent grant