CN115700847A - Drawing book reading method and related equipment - Google Patents

Drawing book reading method and related equipment

Info

Publication number: CN115700847A
Application number: CN202110860614.7A
Authority: CN (China)
Prior art keywords: user, electronic device, page, time, concentration
Legal status: Pending
Other languages: Chinese (zh)
Inventors: 朱维峰, 曾俊飞, 查永东
Current Assignee: Huawei Technologies Co Ltd
Original Assignee: Huawei Technologies Co Ltd
Application filed by Huawei Technologies Co Ltd
Priority to CN202110860614.7A
Publication of CN115700847A

Classifications

  • Management, Administration, Business Operations System, And Electronic Commerce (AREA)

Abstract

The application discloses a picture book reading method and related equipment. During reading, the electronic device can record abnormal behaviors of the user (for example, skipping pages or not turning a page for a long time), divide the total reading time into time periods, output a concentration rating for the user in each time period based on the number of abnormal behaviors that occurred in it, and then calculate an overall concentration value and concentration rating for this reading session. In this way, the user's concentration during reading can be output from the page-turning records alone, without additional hardware or algorithm cost, which improves the user experience. In addition, if the user shows pronounced abnormal behavior within a short time during reading (for example, skipping pages twice within 1 minute with 3 pages skipped in total), the electronic device can remind the user to concentrate or ask whether to end the reading, which improves the interactive experience.

Description

Drawing book reading method and related equipment
Technical Field
The application relates to the technical field of terminals, in particular to a picture book reading method and related equipment.
Background
In the early childhood education phase, picture books are an important way for children to learn about the world. However, traditional paper picture books often run into difficulties in use; for example, many parents do not have enough time to accompany their children while they read picture books. Electronic devices such as point-reading pens and point-reading machines have therefore appeared on the market and can accompany a child reading a picture book. However, these electronic devices cannot calculate how concentrated the child is during reading, so when the parents cannot accompany the child, that is, when the child reads the picture book independently, the parents cannot learn how concentrated the child was during that reading session, and the user experience is poor.
Disclosure of Invention
The embodiment of the application provides a picture book reading method and related equipment, which can calculate the concentration degree of a user in the reading process.
In a first aspect, an embodiment of the present application provides a picture book reading method, which is applied to an electronic device including a camera, and the method includes: the electronic device acquires an image of the picture book by using the camera and identifies the picture book page based on the image; when the electronic device detects that the page number of the picture book page has changed, it generates a page-turning record; the electronic device determines, based on the page-turning records, whether abnormal behavior of the user has occurred and, if so, records the abnormal behavior; and the electronic device determines a concentration rating of the user based on the abnormal behavior.
By implementing the method provided by the first aspect, the concentration degree of the user in the reading process can be output according to the page turning record of the user without additionally increasing hardware cost and algorithm cost, and the user experience is improved.
In one possible implementation, the determining, by the electronic device, of the concentration rating of the user based on the abnormal behavior specifically includes: the electronic device records the total duration of the user's whole picture book reading session and divides the total duration into M time periods, where M is a positive integer; the electronic device records the number of abnormal behaviors occurring in each of the M time periods, and determines the user's concentration rating for each of the M time periods based on those counts, where more occurrences of abnormal behavior give a lower concentration rating and fewer occurrences give a higher concentration rating. In this way, the user's concentration rating in each time period of the reading process can be determined.
In one possible implementation, the method further includes: if the electronic device determines that the user's concentration rating for the m-th of the M time periods is not the highest level, and determines, based on the number of abnormal behaviors in the (m+1)-th time period, that the user's concentration rating for the (m+1)-th time period is rating A, where rating A is neither the highest nor the lowest level, the electronic device adjusts the user's concentration rating for the (m+1)-th time period from rating A to rating B, where rating B is one level lower than rating A, and m is a positive integer smaller than M. In this way, it can be determined that the user may have been inattentive over a long stretch of time.
In one possible implementation, after the electronic device determines the concentration rating of the user based on the abnormal behavior, the method further includes: the electronic device determines the user's concentration rating for the whole picture book reading session based on the durations of the M time periods and the user's concentration ratings for the M time periods; the higher the user's concentration ratings for the M time periods, and/or the longer the durations of the periods with higher concentration ratings, the higher the user's concentration rating for the whole reading session. In this way, the user's concentration rating for the whole reading process can be determined.
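The rating logic sketched in the three implementations above can be illustrated in code. The following Python sketch is an illustration only, not the claimed implementation: the rating levels, the thresholds mapping abnormal-behavior counts to ratings, and the duration-weighted scoring scheme are all assumptions chosen for the example; only the overall structure (per-period rating from counts, the one-level demotion rule, and the duration-weighted aggregation) follows the text above.

```python
from dataclasses import dataclass
from typing import List, Tuple

# Hypothetical rating levels, ordered from highest to lowest (an assumption, not from the application).
LEVELS = ["A+", "A", "B", "C"]

@dataclass
class Period:
    duration_s: float    # duration of this time period, in seconds
    abnormal_count: int  # number of abnormal behaviors observed in this period

def rate_period(abnormal_count: int) -> str:
    """More abnormal behaviors -> lower rating (the thresholds are assumptions)."""
    if abnormal_count == 0:
        return "A+"
    if abnormal_count == 1:
        return "A"
    if abnormal_count <= 3:
        return "B"
    return "C"

def rate_session(periods: List[Period]) -> Tuple[List[str], float, str]:
    """Return per-period ratings, a duration-weighted concentration value, and an overall rating."""
    ratings = [rate_period(p.abnormal_count) for p in periods]

    # Demotion rule described above: if the previous period was not rated highest and the
    # current period's rating is neither the highest nor the lowest, lower it by one level.
    for i in range(1, len(ratings)):
        if ratings[i - 1] != LEVELS[0] and ratings[i] not in (LEVELS[0], LEVELS[-1]):
            ratings[i] = LEVELS[LEVELS.index(ratings[i]) + 1]

    # Duration-weighted concentration value: higher ratings and longer highly rated
    # periods raise the overall value (the scoring scheme is an assumption).
    score_of = dict(zip(LEVELS, (100, 80, 60, 40)))
    total = sum(p.duration_s for p in periods) or 1.0
    value = sum(score_of[r] * p.duration_s for r, p in zip(ratings, periods)) / total

    # Map the weighted value back to an overall rating (cut-offs are assumptions).
    if value >= 90:
        overall = "A+"
    elif value >= 75:
        overall = "A"
    elif value >= 55:
        overall = "B"
    else:
        overall = "C"
    return ratings, value, overall
```

For instance, under these assumed thresholds, two 5-minute periods rated A+ and one 5-minute period rated B would give a weighted value of about 87 and an overall rating of A.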
In one possible implementation, the abnormal behavior includes one or more of the following: the "page jump" behavior, the "long time without page turning" behavior, the "page turning without reading" behavior, and the "question unresponsive" behavior. In this way, it can be determined which behaviors the abnormal behavior specifically includes.
In one possible implementation, the page turning record includes one or more of the following: page number, start time, playing-content end time, page read-completion ratio, question time, whether the question was responded to, and end time. In this way, it can be determined which content the page turning record specifically includes.
In a possible implementation manner, the electronic device determines whether an abnormal behavior occurs to a user based on the page turning record, and specifically includes: if the electronic equipment detects that the page number before the user turns the page is not continuous with the page number turned by the user in the page turning record, the electronic equipment determines that the user has a page skipping behavior. In this way, whether the user has abnormal behavior can be determined based on the page number turned by the user in the page turning record.
In a possible implementation manner, the electronic device determines whether an abnormal behavior occurs to a user based on the page turning record, and specifically includes: if the electronic device detects that the end time in the page turning record is later than the playing-content end time by more than a first time threshold, the electronic device determines that the user exhibited the "long time without page turning" behavior. In this way, whether the user has abnormal behavior can be determined based on the end time and the playing-content end time in the page turning record.
In a possible implementation manner, the electronic device determines whether an abnormal behavior occurs to a user based on the page turning record, and specifically includes: if the electronic equipment detects that the page reading completion ratio in the page turning record is smaller than 1, the electronic equipment determines that the user has a behavior of turning pages without reading. In this way, whether the user has abnormal behavior can be determined based on the page reading completion ratio in the page turning record.
In a possible implementation manner, the electronic device determines whether an abnormal behavior occurs to a user based on the page turning record, and specifically includes: the electronic equipment initiates a question to the user, records the time of the question, and determines that the user has a question unresponsive behavior if the electronic equipment does not receive the response of the user within a second time threshold after the electronic equipment initiates the question. In this way, whether the user has abnormal behavior can be determined based on the question time in the page turning record.
In one possible implementation, after the electronic device determines that the user has abnormal behavior, the method further includes: and the electronic equipment outputs a prompt corresponding to the abnormal behavior through the output device. Therefore, the user can be reminded to concentrate on or finish reading when abnormal behaviors occur.
In a second aspect, embodiments of the present application provide an electronic device, which includes one or more processors and one or more memories; wherein the one or more memories are coupled to the one or more processors, the one or more memories being configured to store computer program code comprising computer instructions that, when executed by the one or more processors, cause the electronic device to perform the method of any of the possible implementations of the first aspect.
In a third aspect, an embodiment of the present application provides a computer storage medium, where a computer program is stored, where the computer program includes program instructions, and when the program instructions are executed on an electronic device, the electronic device is caused to execute the method in any one of the possible implementation manners of the first aspect.
In a fourth aspect, the present application provides a computer program product, which when executed on a computer, causes the computer to execute the method in any one of the possible implementation manners of the first aspect.
Drawings
Fig. 1 is a schematic flowchart of a picture recognition method according to an embodiment of the present application;
fig. 2 is a schematic view of a scenario of the picture book recognition provided in the embodiment of the present application;
FIG. 3 is a schematic view illustrating a reading phase of a book reading method according to an embodiment of the present application;
FIG. 4 is a schematic flow chart of a plot reading method in a statistical phase according to an embodiment of the present application;
FIG. 5 is a schematic diagram illustrating a trend of a change in concentration of a user during a reading process according to an embodiment of the present application;
fig. 6 is a schematic structural diagram of an electronic device according to an embodiment of the present application;
fig. 7 is a schematic structural diagram of another electronic device provided in an embodiment of the present application.
Detailed Description
The technical solutions in the embodiments of the present application will be clearly and completely described below with reference to the drawings in the embodiments of the present application. In the description of the embodiments herein, "/" means "or" unless otherwise specified; for example, A/B may mean A or B. "And/or" herein merely describes an association relationship between associated objects and means that three relationships may exist; for example, A and/or B may mean: A alone, both A and B, or B alone. In addition, "a plurality" means two or more in the description of the embodiments of the present application.
It should be understood that the terms "first," "second," and the like in the description and claims of this application and in the drawings are used for distinguishing between different objects and not for describing a particular order. Furthermore, the terms "include" and "have," as well as any variations thereof, are intended to cover non-exclusive inclusions. For example, a process, method, system, article, or apparatus that comprises a list of steps or elements is not limited to only those steps or elements listed, but may alternatively include other steps or elements not listed, or inherent to such process, method, article, or apparatus.
Reference in the specification to "an embodiment" means that a particular feature, structure, or characteristic described in connection with the embodiment can be included in at least one embodiment of the specification. The appearances of the phrase in various places in the specification are not necessarily all referring to the same embodiment, nor are separate or alternative embodiments mutually exclusive of other embodiments. It is explicitly and implicitly understood by one skilled in the art that the embodiments described herein can be combined with other embodiments.
In the early childhood education phase, picture books are an important way for children to learn about the world. However, traditional paper picture books often run into difficulties in use; for example, many parents do not have enough time to accompany their children while they read picture books. Electronic devices such as point-reading pens and point-reading machines have therefore appeared on the market and can accompany a child reading a picture book. However, these electronic devices cannot calculate how concentrated the child is during reading, so when the parents cannot accompany the child, that is, when the child reads the picture book independently, the parents cannot learn how concentrated the child was during that reading session, and the user experience is poor.
At present, more and more electronic devices with picture book reading capability (also called intelligent picture book reading terminal devices) are appearing on the market.
For example, some electronic devices can read picture book content aloud: when a user places a picture book within a certain range in front of the electronic device, the device can recognize the picture book through its camera, and when the user turns to a certain page, the device can recognize the content of the current page and read out its text (i.e., play the audio corresponding to the text on that page). However, these electronic devices still cannot evaluate whether the user is concentrating during the reading process.
For another example, some electronic devices generate and push corresponding content to the user according to how many times the user has browsed trial chapters and the full book. However, the behavior characteristics these devices collect during the user's reading are few and simple, and they cannot evaluate whether the user is concentrating during the reading process.
For another example, some electronic devices observe the user's behavior through a camera during reading, determine the user's reading state (whether the user is focused on reading), and, when they find that the user wants to interact or that the reading behavior is abnormal, provide companion interaction, thereby completing the interaction with the user during reading. However, these electronic devices need not only a camera for recognizing the picture book but also an additional camera for recognizing the user's face, and they rely on vision-related algorithms to recognize the user's state; the cost and technical threshold are high, and user privacy issues are easily involved.
The embodiment of the application provides a picture book reading method. During the user's reading, the electronic device can record abnormal behaviors of the user (for example, skipping pages or not turning a page for a long time), divide the total reading time into time periods, output the user's concentration rating for each time period based on the number of abnormal behaviors that occurred in it, and then calculate an overall concentration value and concentration rating for this reading session. In this way, the user's concentration during reading can be output from the page-turning records alone, without additional hardware or algorithm cost, which improves the user experience. In addition, if the user shows pronounced abnormal behavior within a short time during reading (for example, skipping pages twice within 1 minute with 3 pages skipped in total), the electronic device can remind the user to concentrate or ask whether to end the reading, which improves the interactive experience.
First, a method flow of picture book identification provided by the embodiment of the present application is introduced.
Fig. 1 schematically illustrates the method flow of picture book recognition provided in an embodiment of the present application.
As shown in fig. 1, the method can be applied to an electronic device 100 with a picture book reading capability. The specific steps of the method are described in detail as follows:
s101, the electronic device 100 collects images.
Specifically, the electronic device 100 may acquire an image in real time in the image acquisition area by using its own camera, and the image acquired in real time may be used for picture book recognition in the subsequent steps. The image acquisition area may be the area that the camera can capture.
As shown in fig. 2, when reading the picture book, the user may first place the picture book within a certain range in front of the electronic device 100 (i.e., in the image acquisition area of the camera of the electronic device 100), so that the electronic device 100 can completely or nearly completely capture the picture book with the camera. The electronic device 100 may then acquire images including the picture book in real time by using the camera.
Optionally, the electronic device 100 may start to perform step S101 after receiving a first input of the user, where the first input may include, but is not limited to, a voice instruction input of the user, a click operation, and the like.
S102, the electronic device 100 executes an image recognition algorithm to perform the picture book recognition.
In the embodiment of the present application, the electronic device 100 may execute an image recognition algorithm in real time to recognize the image acquired in real time.
Specifically, the electronic device 100 may input the image acquired in real time into a picture book recognition model trained in advance to perform picture book recognition.
A template library may pre-store a large number of picture book images, which may be images of picture book cover pages or of picture book content pages. A picture book image may have an association with the picture book it belongs to, and this association may also be stored in the template library. For example, establishing the association may include associating a picture book image with the name of the corresponding picture book, where the name indicates which picture book it is.
The electronic device 100 may use an image recognition technology based on visual AI capability to perform picture book recognition; the embodiment of the present application does not limit the image recognition algorithm used in the picture book recognition process.
The template library and the picture book recognition model may be stored locally or in the cloud; the embodiment of the present application does not limit this.
S103, the electronic device 100 judges whether the picture book can be recognized.
Specifically, if the electronic device 100 determines that the image acquired in real time is successfully matched with one of the picture books pre-stored in the template library, the electronic device 100 determines that the picture book can be recognized.
The picture book images in the template library are trained picture book images, that is, images that have gone through feature extraction, model training, and similar processing. When the image acquired by the camera in real time includes a trained picture book, the electronic device 100 can recognize the picture book.
S104, the electronic device 100 determines whether the current page of the picture book is a cover page of the picture book.
Specifically, when performing picture book recognition, the electronic device 100 may preferentially match the image acquired by the camera in real time against the images of all pre-stored picture book cover pages one by one. If the matching succeeds (i.e., the image acquired in real time matches the image of one of the pre-stored picture book cover pages), the electronic device 100 may determine that the image acquired by the camera in real time includes the image of a picture book cover page, i.e., the electronic device 100 recognizes the cover page of the picture book, and may further determine that the user has placed the cover page of the picture book in the camera's image acquisition area.
Further, after determining that the current page of the picture book is the cover page of the picture book, the electronic device 100 may perform step S105.
S105, the electronic device 100 plays an audio prompt indicating that the picture book has been switched, and at the same time switches the candidate content index pages.
Specifically, if the electronic device 100 recognizes the cover page of a picture book, it may be considered that the user has switched picture books. The electronic device 100 may then play an audio prompt indicating that the picture book has been switched. Optionally, the audio content may also include the name of the picture book the user switched to; for example, the audio may be "You have switched the picture book; the name of the picture book is XXX". At the same time, the electronic device 100 may switch the candidate content index pages, that is, the electronic device 100 may use the content pages of the recognized picture book as the current candidate content index pages.
S106, the electronic device 100 judges whether the current page of the picture book is matched with the alternative content index page.
Specifically, after the electronic device 100 determines that the current page of the picture book is not a cover page, the electronic device 100 may match the image of the current page acquired by the camera in real time against the images of the current candidate content index pages one by one. If the matching succeeds (i.e., the image of the current page acquired in real time matches one of the images of the current candidate content index pages), the electronic device 100 may determine that the current page of the picture book matches a candidate content index page, i.e., the electronic device 100 recognizes the current page of the picture book. Otherwise, the recognition fails.
Further, after the electronic device 100 recognizes the current page of the picture book, the electronic device 100 may perform step S107.
The current candidate content index pages are the content pages of the picture book recognized by the electronic device 100 in step S105.
In the embodiment of the present application, the feature matching method used in the picture book recognition process may refer to the feature matching method disclosed in the patent with publication number CN111695419A, the whole content of which is incorporated herein. The method is not limited to this; other feature matching methods in the prior art may also be used in the picture book recognition process, which is not limited in the embodiment of the present application.
It is easy to understand that when the user switches picture books, the cover page of the new picture book needs to be placed within a certain range in front of the electronic device 100 first, so that the electronic device 100 can recognize the cover page and switch the candidate content index pages, allowing the electronic device 100 to perform the subsequent steps based on those candidate content index pages.
In the embodiment of the present application, the electronic device 100 may perform picture book recognition periodically; for example, the electronic device 100 may obtain the latest recognition result output by the picture book recognition model every 200 ms, so that changes of the picture book page throughout the user's reading process can be recognized.
S107, the electronic device 100 judges whether the user turns to a new page.
Specifically, after the electronic device 100 recognizes the current page of the picture book, because the electronic device 100 can periodically perform picture book recognition, the electronic device 100 can determine whether the current page of the picture book recognized this time and the current page of the picture book recognized last time are the same page, and if so, it indicates that the page of the picture book has not changed, that is, the current page of the picture book recognized this time is the current page of the picture book recognized last time, and then the electronic device 100 can determine that the user has not turned to a new page; if not, it indicates that the picture book page has changed, that is, the current picture book page recognized this time is not the last picture book page recognized last time, then the electronic device 100 may determine that the user has turned to a new page.
The implementation manner in which the electronic device 100 determines whether the currently recognized picture book page is the same as the previously recognized picture book page may include, but is not limited to, the following two:
Implementation mode 1: determine whether the content of the currently recognized picture book page is the same as the content of the previously recognized picture book page; if so, the currently recognized page is the same as the previously recognized page.
Implementation mode 2: determine whether the page number of the currently recognized picture book page is the same as the page number of the previously recognized picture book page; if so, the currently recognized page is the same as the previously recognized page.
S108, the electronic device 100 starts to record the new page reading information and starts to play the audio corresponding to the new page content.
Specifically, after determining that the user turns to the new page, the electronic device 100 may start recording the new page reading information, and at the same time, the electronic device 100 may switch the played audio and start playing the audio corresponding to the content of the new page.
The new page reading information may include information such as a page turning time point of a user, a current page reading ratio before page turning, whether a question is responded, and the like.
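For orientation, the S101–S108 flow can be summarized as a single periodic loop. The sketch below is illustrative only: the `camera`, `recognizer`, `player`, and `recorder` objects and their methods are hypothetical placeholders for the template-matching model and audio pipeline described above; the 200 ms recognition period is the example value from the text.

```python
import time

RECOGNITION_PERIOD_S = 0.2  # the text mentions obtaining a new recognition result every 200 ms

def reading_loop(camera, recognizer, player, recorder):
    """Illustrative main loop for S101-S108 (placeholder interfaces, not a real API)."""
    candidate_index_pages = None   # content pages of the currently recognized picture book
    current_page = None

    while True:
        frame = camera.capture()                          # S101: acquire an image in real time
        result = recognizer.recognize(frame)              # S102: run the recognition model
        if result is None:                                # S103: picture book not recognized
            time.sleep(RECOGNITION_PERIOD_S)
            continue

        if result.is_cover_page:                          # S104/S105: a cover page was shown,
            player.play_switch_audio(result.book_name)    # so the user switched picture books
            candidate_index_pages = result.content_pages  # switch the candidate content index pages
            current_page = None
        elif candidate_index_pages and result.page in candidate_index_pages:  # S106
            if result.page != current_page:               # S107: the user turned to a new page
                recorder.start_new_page_record(result.page)  # S108: record new page reading info
                player.play_page_audio(result.page)          # S108: play the new page's audio
                current_page = result.page

        time.sleep(RECOGNITION_PERIOD_S)
```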
The following describes a specific implementation procedure of the picture book reading method provided in the embodiment of the present application.
The method can be applied to the electronic device 100 with picture book reading capability. The specific execution process of the method is divided into two stages: a reading phase and a statistics phase. The following describes them in detail:
1. reading stage
The electronic device 100 may enter a reading phase after the picture book recognition is successful, because the electronic device 100 may periodically perform the picture book recognition by using an image recognition algorithm (for example, recognize every 200 ms), and recognize the content on the picture book page, in the reading phase, when the user turns a page (i.e., the user turns to a new page from the current page), the electronic device 100 may recognize the new page turned by the user, and may record information such as a time point of the user turning the page this time, the number of pages turned, and whether the current page is completely read before turning the page. Optionally, if the electronic device 100 performs question-like interaction with the user during the reading process of the user, the electronic device 100 may also record information about whether the user responds to the question. Further, the electronic device 100 may extract information such as an abnormal behavior (e.g., an abnormal behavior such as page skipping or long-time non-page turning) of the user and a start-stop time corresponding to the abnormal behavior from the information recorded in the reading process. Further, when the electronic apparatus 100 recognizes that the user has occurred an abnormal behavior a plurality of times in a short time, the electronic apparatus 100 may confirm with the user whether to continue reading.
The following describes in detail the specific process of the picture book reading method in the reading phase.
Fig. 3 exemplarily shows the specific flow of the reading phase of the picture book reading method provided in the embodiment of the present application. The specific steps of the method in the reading phase are described in detail as follows:
s301, the electronic device 100 records information such as a page turning time point of a user, a reading completion ratio of a current page before page turning, whether a question is responded or not, and the like.
Referring to table 1 below, table 1 exemplarily shows information of a user page turning time point, a current page reading completion ratio before page turning, whether a question is responded, and the like, recorded by the electronic device 100 in one complete reading process of the user.
The time points in Table 1 denote the minutes and seconds elapsed since the start of the reading session. Each row of Table 1 represents one page-turning record, and the "serial number" column gives the serial number of each record. A page-turning record may include, but is not limited to, the following information: "page number", "start time", "playing-content end time", "read-completion ratio" (i.e., the proportion of the current page that was read before the page was turned), "question time", "whether the question was responded to", and "end time".
When the user turns a page (i.e., turns from the current page to a new page), the electronic device 100 may detect the user's page-turning behavior (for example, the electronic device 100 may recognize that the picture book page has changed by using the image recognition algorithm of the picture book recognition process). At this time, the page number of the new page (i.e., the "page number" in Table 1, or the "page number the user turned to") may be recorded, and the page-turning time point (i.e., the "end time" of the current page and the "start time" of the new page in Table 1) may also be recorded.
When the user turns to a new page, the electronic device 100 may start playing the audio corresponding to the content of that page. When the electronic device 100 stops playing the audio corresponding to the page content, the time point at which playback ended (i.e., the "playing-content end time" in Table 1) may be recorded, and the proportion of the page the user finished reading (i.e., the "read-completion ratio" in Table 1) may also be recorded.
The ratio of the user reading the page may be determined based on the duration of the audio corresponding to the content of the page that has been played and the total duration of the audio corresponding to the content of the page.
For example, when the electronic device 100 finishes playing the audio corresponding to the content of the page, if the audio corresponding to the content of the page has been completely played (that is, before the electronic device 100 finishes playing the audio corresponding to the content of the page, it does not detect that the user has a page turning behavior), then the electronic device may record that the page reading completion ratio is 1.
For another example, when the electronic device 100 finishes playing the audio corresponding to the page content, if the audio corresponding to the page content is not completely played (that is, the electronic device 100 detects that the page turning behavior occurs to the user when only playing a part of the audio corresponding to the page content), then the electronic device 100 may calculate a ratio of a duration of the audio corresponding to the played page content to a total duration of the audio corresponding to the page content, and record the ratio as a ratio of the user reading the page.
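In other words, the read-completion ratio is simply the played duration of the page audio divided by its total duration, capped at 1. A minimal sketch (the function and parameter names are illustrative, not from the application):

```python
def read_completion_ratio(played_s: float, total_s: float) -> float:
    """Proportion of the page's audio that had been played when the user turned the page."""
    if total_s <= 0:
        return 0.0
    return min(played_s / total_s, 1.0)

# e.g. 24 s played out of a 30 s page audio gives 0.8
# (these durations are illustrative and are not taken from Table 1)
```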
When the user turns to a new page and the content of the new page includes a question-type interaction, the electronic device 100 may record the time point of the question (i.e., the "question time" in Table 1), and after initiating the question it may also record whether a voice response to the question was received from the user (i.e., "whether the question was responded to" in Table 1).
Note that Table 1 records the question result on the assumption that at most one question is asked per page. In some embodiments, when multiple questions are initiated on the same page, the electronic device 100 may adjust the recording manner; for example, it may record each question result in its own row and, in later processing, merge the multiple question results of the same page by checking whether the page numbers of adjacent rows are the same.
Table 1 (page-turning records for one complete reading session; reproduced as an image in the original publication)
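For reference in the sketches that follow, a page-turning record with the fields listed above for Table 1 could be represented as follows. This is a sketch only: the field names, types, and units are assumptions, not the application's data format.

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class PageTurnRecord:
    serial_number: int                 # index of the page-turning record
    page_number: int                   # page the user turned to
    start_time_s: float                # seconds since the reading session started
    play_end_time_s: Optional[float]   # when the page's audio finished playing (None if unknown)
    read_completion_ratio: float       # proportion of the page read before turning (at most 1.0)
    question_time_s: Optional[float]   # when a question was asked on this page, if any
    question_answered: Optional[bool]  # whether the user responded to the question
    end_time_s: Optional[float]        # when the user turned away from this page
```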
S302, the electronic device 100 extracts information such as the start and end times corresponding to events in which the user skipped pages, did not turn a page for a long time, turned a page without finishing reading it, or did not respond to a question.
Referring to table 2 below, table 2 illustrates abnormal behaviors that may occur in one complete reading process of the user and the manner in which the abnormal behaviors are recognized by the electronic device 100.
The abnormal behaviors that may occur in one complete reading process of the user may include, but are not limited to, the following 4 types: page skipping, no page turning for a long time, page turning without reading, and no response to question.
Abnormal behavior 1, page jump:
"page jump" may mean that the user turns multiple pages at a time during reading, skips some pages unread, and does not turn back in a short time.
Possible implementations by which the electronic device 100 recognizes the "page jump" behavior:
(1) The electronic device 100 may compare the page numbers corresponding to the two adjacent page turning records, and determine that the page number corresponding to the second page turning record (i.e., the next page turning record) is greater than the page number corresponding to the first page turning record (i.e., the previous page turning record) by more than 1 page, that is, the difference between the page number corresponding to the second page turning record and the page number corresponding to the first page turning record is greater than 1, that is, the electronic device 100 detects that the page number before the user turns the page is discontinuous with the page number turned by the user.
For example, as shown in Table 1, the page-turning record corresponding to serial number 3 may be the second page-turning record and the page-turning record corresponding to serial number 2 may be the first page-turning record. It is easy to see that the page number corresponding to the second record is page 4 and the page number corresponding to the first record is page 2; the difference between page 4 and page 2 is 2, which is greater than 1. Therefore the electronic device 100 may determine that the user exhibited the "page jump" behavior at time point 1: from page 2 to page 4, the user turned 2 pages at once, skipping 1 page.
Similarly, as can be easily seen from table 1, the user has a "page jump" behavior again at time point 5, that is, from page 6 to page 8, the user turns 2 pages at a time, and the number of page jumps is 1 page.
(2) The electronic device 100 may determine that the page number corresponding to the unread page is included between the page numbers corresponding to the two page turning records, and the electronic device 100 determines that the user does not turn back to the next page of the page number corresponding to the first page turning record within a short time (e.g., within 10 seconds).
For example, assuming that the page number corresponding to the first page turning record is 2, and the page number corresponding to the second page turning record is 4, if the electronic device 100 detects that there is no page turning record containing the page number 3 between the two page turning records, the electronic device 100 may determine that the page number corresponding to the page that is not read by the user is 3, so as to determine that the page number corresponding to the page that is not read is contained between the page numbers corresponding to the two page turning records. Further, if the electronic device 100 further detects that the user does not turn back to the page corresponding to the page number 3 in a short time (for example, within 10 seconds), the electronic device 100 may determine that the user does not turn back to the next page corresponding to the page number corresponding to the first page turning record in a short time. Thus, the electronic device 100 may determine that the user has a "page jump" behavior, i.e., a jump from page 2 to page 4.
In some embodiments, the electronic device 100 may also combine the above two possible implementations (1) and (2) to comprehensively determine whether the user has a "page jump" behavior. For example, the electronic device 100 may preliminarily determine whether the user has a "page jump" behavior by using the method (1) above, if so, the electronic device 100 may further determine whether the user has a "page jump" behavior by using the method (2) above, and if so, the electronic device 100 may comprehensively determine that the user has a "page jump" behavior. Thus, the electronic device 100 can be prevented from generating a false recognition phenomenon.
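Combining checks (1) and (2) above, the "page jump" behavior could be detected roughly as follows over a list of `PageTurnRecord` objects as sketched after Table 1. The 10-second turn-back window is the example value from the text; everything else is an assumption, not the claimed implementation.

```python
from typing import List, Tuple

def detect_page_jumps(records: List[PageTurnRecord],
                      turn_back_window_s: float = 10.0) -> List[Tuple[int, int]]:
    """Return (serial_number, skipped_pages) pairs where a 'page jump' behavior occurred."""
    jumps = []
    for i in range(1, len(records)):
        prev, curr = records[i - 1], records[i]
        gap = curr.page_number - prev.page_number
        if gap <= 1:
            continue  # consecutive page numbers: no skip (check 1 fails)
        # Check (2): the user did not turn back to the first skipped page shortly afterwards.
        skipped_page = prev.page_number + 1
        turned_back = any(
            later.page_number == skipped_page
            and later.start_time_s - curr.start_time_s <= turn_back_window_s
            for later in records[i + 1:]
        )
        if not turned_back:
            jumps.append((curr.serial_number, gap - 1))  # gap - 1 pages were skipped
    return jumps
```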
Abnormal behavior 2, no page turning for a long time:
the "long time without page turning" may mean that the user does not have a page turning behavior for a period of time (e.g., within 1 minute) after the electronic device 100 completely plays the audio corresponding to the content of the page.
Possible implementation by which the electronic device 100 recognizes the "long time without page turning" behavior:
the electronic device 100 may find two adjacent page turning records, and determine that a start time (i.e., "end time" in the first page turning record) corresponding to the second page turning record (i.e., the next page turning record) is delayed by more than a preset time threshold (e.g., 1 minute) from an end time of the play content corresponding to the first page turning record (i.e., the previous page turning record). It is easy to understand that the electronic device 100 may also compare the "end time" and the "end time of the playing content" in the same page turning record, and determine that the "end time" is delayed from the "end time of the playing content" by more than a preset time threshold (e.g. 1 minute).
The preset time threshold may also be referred to as a first time threshold.
For example, as shown in Table 1, the page-turning record corresponding to serial number 5 may be the second page-turning record and the page-turning record corresponding to serial number 4 may be the first page-turning record. Assuming the preset time threshold is 1 minute, it is easy to see that the start time corresponding to the second record is 1 minute and 30 seconds later than the playing-content end time corresponding to the first record, which exceeds the preset time threshold of 1 minute. Therefore the electronic device 100 may determine that the user exhibited the "long time without page turning" behavior.
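The "long time without page turning" check reduces to comparing a record's end time with its playing-content end time against the first time threshold (1 minute in the example). A sketch building on the record structure sketched after Table 1 (the function name and default threshold are assumptions):

```python
from typing import List

def detect_long_idle(records: List[PageTurnRecord],
                     first_threshold_s: float = 60.0) -> List[int]:
    """Serial numbers of records where the page stayed open too long after its audio ended."""
    idle = []
    for rec in records:
        if rec.play_end_time_s is None or rec.end_time_s is None:
            continue  # cannot evaluate this record without both timestamps
        if rec.end_time_s - rec.play_end_time_s > first_threshold_s:
            idle.append(rec.serial_number)
    return idle
```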
Abnormal behavior 3, turning pages without reading:
the "turning pages without reading" may refer to that the electronic device 100 has a page turning behavior when only the audio corresponding to a part of the content of the page is played.
Possible implementation by which the electronic device 100 recognizes the "page turning without reading" behavior:
when the electronic device 100 detects a page turning behavior of the user, it may start to generate a new page turning record (i.e., a new page turning record), and at the same time, the electronic device 100 may find a previous page turning record of the new page turning record, and determine that a page reading completion ratio corresponding to the page turning record is smaller than 1, that is, when the new page turning record occurs, the previous page content is not completely read yet.
For example, as shown in Table 1, the electronic device 100 may detect a page-turning behavior of the user at time point 5:40 and start generating the page-turning record corresponding to serial number 6. At the same time, the electronic device 100 may look up the page-turning record corresponding to serial number 5; since the read-completion ratio corresponding to that record is 0.8, which is smaller than 1, the electronic device 100 may determine that the user exhibited the "page turning without reading" behavior at time point 5:40.
It is easy to understand that if the electronic device 100 detects that the read-completion ratio in a certain page-turning record is smaller than 1, the electronic device 100 may determine that the user exhibited the "page turning without reading" behavior.
Abnormal behavior 4, question not responding:
the "question is not responded to" may mean that the user does not have a voice response to the present question after the electronic device 100 initiates the question.
Possible implementation by which the electronic device 100 recognizes the "question unresponsive" behavior:
when the picture book content includes a question interaction, that is, for a page number with the interaction, the electronic device 100 may initiate a question to the user, and record a question time, and within a preset time threshold (for example, within 10 seconds) after the electronic device 100 initiates the question, the electronic device 100 does not receive a voice response of the user to the question.
The preset time threshold may also be referred to as a second time threshold.
For example, as shown in Table 1, in the page-turning record corresponding to serial number 3 the electronic device 100 initiated a question and recorded the question time; assuming the preset time threshold is 10 seconds, if the electronic device 100 receives no voice response from the user within 10 seconds of initiating the question, it determines that the user exhibited the "question unresponsive" behavior.
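The remaining two checks follow directly from the record fields: a read-completion ratio smaller than 1 indicates "page turning without reading", and a question that was asked but recorded as unanswered (i.e., no voice response within the second time threshold) indicates "question unresponsive". A sketch, again over the record structure sketched after Table 1 (function names are assumptions):

```python
from typing import List

def detect_unfinished_pages(records: List[PageTurnRecord]) -> List[int]:
    """Serial numbers of records whose page was not read to completion before turning."""
    return [r.serial_number for r in records if r.read_completion_ratio < 1.0]

def detect_unanswered_questions(records: List[PageTurnRecord]) -> List[int]:
    """Serial numbers of records where a question was asked but not answered.

    question_answered is assumed to already encode whether a voice response arrived
    within the second time threshold after the question was initiated."""
    return [
        r.serial_number
        for r in records
        if r.question_time_s is not None and r.question_answered is False
    ]
```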
Table 2 (abnormal behaviors and the manner in which the electronic device 100 recognizes them; reproduced as an image in the original publication)
Referring to Table 3 below, Table 3 exemplarily shows the information extracted by the electronic device 100, based on the information recorded in Table 1, for the events in which the user skipped pages, did not turn a page for a long time, turned a page without finishing reading it, or did not respond to a question during one complete reading session, together with the corresponding start and end times.
Table 3 includes the abnormal behaviors that may occur in one complete reading session of the user and the information corresponding to each abnormal behavior, i.e., start time, end time, number of skipped pages, read-completion ratio, and time without page turning.
Table 3 (extracted abnormal-behavior events with start and end times; reproduced as an image in the original publication)
It is easy to understand that the above-mentioned process of extracting the abnormal behavior of the user in step S302 may be performed after one page turning record is generated in step S301, rather than after all page turning records in one complete reading process are generated.
S303, the electronic device 100 determines whether the user needs to be reminded.
Specifically, during the reading process of the user, the electronic device 100 may first determine whether the user has an abnormal behavior in a short time, and further determine whether the user needs to be reminded.
If the electronic device 100 determines that the user has an abnormal behavior in a short time, the electronic device 100 may determine that the user needs to be reminded, and further, the electronic device 100 may execute step S304 to remind the user.
If the electronic device 100 determines that the user does not have the abnormal behavior in a short time, the electronic device 100 may determine that the user does not need to be reminded, and further, the electronic device 100 may not make a reminder.
The electronic device 100 may determine whether the user has exhibited a more abnormal behavior in a short time based on the information corresponding to the abnormal behaviors extracted in step S302 (e.g., the information in Table 3).
Referring to table 4 below, table 4 exemplarily shows more abnormal behaviors that may occur in a short time during reading of a user, a corresponding prompting manner, and a feedback processing manner.
Table 4 (more abnormal behaviors that may occur in a short time, the corresponding prompting manners, and the feedback processing manners; reproduced as an image in the original publication)
The more abnormal behaviors that may occur in the reading process of the user may include, but are not limited to, the following 4 types:
more abnormal behavior 1, continuous page skipping within a certain period of time (e.g., within 1 minute) reaches a certain preset threshold number of times (e.g., 2 times), and the number of pages skipped reaches a certain preset threshold number of pages (e.g., 3 pages).
Taking "page skipping continues for 2 times in 1 minute and the number of skipped pages reaches 3 pages" in table 4 as an example, the electronic device 100 may determine whether the user skips pages for 2 times in 1 minute and the number of skipped pages reaches 3 pages based on the information corresponding to the "page skipping" behavior extracted in step S302.
The electronic device 100 may determine whether the difference between the start times (or end times) of two adjacent "page jump" behaviors is no more than 1 minute; if so, the electronic device 100 determines that the user has skipped pages 2 times within 1 minute. Further, the electronic device 100 may determine whether the total number of pages skipped in those two adjacent "page jump" behaviors is greater than or equal to 3; if so, the electronic device 100 determines that the number of skipped pages has reached 3 pages.
More abnormal behavior 2: the time without page turning is too long, i.e., a single no-page-turning interval reaches a preset time threshold (e.g., 2 minutes).
Taking "the time without page turning is too long, and the single no-page-turning interval reaches 2 minutes" in Table 4 as an example, the electronic device 100 may determine, based on the no-page-turning time corresponding to the "long time without page turning" behavior extracted in step S302, whether the user's single no-page-turning interval is greater than or equal to 2 minutes; if so, the electronic device 100 determines that the user's single no-page-turning interval has reached 2 minutes.
More abnormal behavior 3: the number of consecutive pages turned without being read to completion reaches a preset page threshold (e.g., 3 pages), and the read-completion ratio of each is smaller than a preset ratio threshold (e.g., 80%).
Taking "3 consecutive pages not read to completion, with read-completion ratios below 80%" in Table 4 as an example, the electronic device 100 may determine, based on the read-completion ratios corresponding to the "page turning without reading" behaviors extracted in step S302, whether 3 consecutive pages were turned without being read to completion and whether their read-completion ratios are all below 80%.
The electronic device 100 may determine whether the user has exhibited the "page turning without reading" behavior 3 or more times in succession; if so, the electronic device 100 may determine that 3 consecutive pages were not read to completion. Further, the electronic device 100 may determine whether the read-completion ratios corresponding to those 3 consecutive "page turning without reading" behaviors are all below 80%; if so, the electronic device 100 determines that the user has exhibited the more abnormal behavior 3.
More abnormal behavior 4: the picture book is not placed within the recognizable range for a period of time (e.g., 1 minute).
Taking "do not put the picture book into the recognizable range for 1 minute continuously" in table 4 as an example, the electronic device 100 may perform picture book recognition by using the image recognition algorithm in the picture book recognition process, and if the electronic device 100 cannot recognize the picture book and the time length for which the picture book cannot be recognized is greater than or equal to 1 minute, the electronic device 100 may determine that the user does not put the picture book into the recognizable range for 1 minute continuously.
S304, the electronic device 100 reminds the user.
Specifically, when the electronic device 100 determines that the user needs to be reminded, the electronic device 100 may remind the user to concentrate on the attention or remind the user to end the reading in a corresponding reminding manner.
The corresponding prompting method used for prompting the user may be the one exemplarily shown in table 4:
for the above more abnormal behavior 1, the electronic device 100 may output a prompt "do i see you turning pages all the time, do not like the content? ".
Further, the electronic device 100 may receive a voice response made by the user for the prompt and make a corresponding feedback process for the voice response. For example, if the electronic device 100 detects that the voice response of the user represents an intention of "dislike," the electronic device 100 may prompt the user to change the book. For another example, if the electronic device 100 detects that the voice response of the user represents a "favorite" intention, the electronic device 100 may continue to perform the reading task (e.g., play an audio corresponding to the content of the page), and ignore the more abnormal behavior 1 that may occur subsequently in the reading process, that is, if the electronic device 100 detects that the abnormal behavior 1 occurs again, the electronic device 100 may not prompt.
For the above more abnormal behavior 2, the electronic device 100 may output the prompt exemplarily shown in Table 4: "Do you want to continue reading this picture book, or take a rest?"
Further, the electronic device 100 may receive a voice response made by the user for the prompt and make a corresponding feedback process for the voice response. For example, if the electronic device 100 detects that the user's voice response represents an intention of "do not continue", the electronic device 100 may stop performing the sketch reading task (i.e., exit the sketch). For another example, if the electronic device 100 detects that the voice response of the user represents an intention of "continue", the electronic device 100 may continue to perform the reading task (e.g., play an audio corresponding to the content of the page), and ignore the more abnormal behavior 2 that may occur subsequently in the reading process, that is, if the electronic device 100 detects that the abnormal behavior 2 occurs again, the electronic device 100 may not prompt.
For the above more abnormal behavior 3, the electronic device 100 may output the prompt exemplarily shown in Table 4: "You haven't finished reading these pages; please read them patiently."
Further, the electronic device 100 may continue to perform the reading task (e.g., play an audio corresponding to the content of the page), and ignore the more abnormal behavior 3 that may occur subsequently in the reading process, that is, if the electronic device 100 detects that the abnormal behavior 3 occurs again, the electronic device 100 may not prompt.
For the above more abnormal behavior 4, the electronic device 100 may output the prompt exemplarily shown in Table 4: "I cannot see the picture book now; please put the picture book in front of me."
Further, the electronic device 100 may continue to perform picture book recognition by using the image recognition algorithm of the picture book recognition process. If the electronic device 100 still does not recognize the picture book within a certain time (e.g., 1 minute) after the prompt, the electronic device 100 may stop performing the picture book reading task (i.e., exit the picture book). If the electronic device 100 recognizes the picture book after the prompt, the electronic device 100 continues to perform the picture book reading task (e.g., plays the audio corresponding to the page content).
Performing the feedback processing by the electronic apparatus 100 requires the electronic apparatus 100 to recognize the voice of the user and understand the intention of the user. The electronic device 100 may implement Speech Recognition and intention understanding by using an Automatic Speech Recognition (ASR) capability and a Natural Language Processing (NLP) capability of the cloud.
The prompts exemplarily shown in table 4 may be output by the electronic apparatus 100 through an output device (e.g., a speaker, a display screen, etc.).
It should be noted that the more abnormal behaviors, the corresponding prompting manners, and the feedback processing manners that may occur in a short time during the reading process of the user shown in table 4 are merely exemplary and should not be construed as a limitation of the present application.
By executing the method shown in fig. 3, in the reading stage, the electronic device 100 may find whether the user has an abnormal behavior in real time, and if so, the electronic device 100 may remind the user to concentrate attention during the reading process, or remind the user to end the reading.
In some embodiments, the electronic device 100 does not need to wait until it determines that the user has exhibited one of the above more abnormal behaviors before issuing a reminder; instead, it may promptly remind the user to concentrate or to finish the reading once it determines that the user has exhibited an abnormal behavior.
2. Statistical phase
After the reading phase is completed (i.e., after the current reading is finished), the electronic device 100 may enter the statistical phase. In the statistical phase, the electronic device 100 may record the total duration of the user's whole picture book reading process this time, segment the total duration (for example, divide it into M time periods, where M is a positive integer), record the number of times the user exhibits abnormal behavior in each time period, and then determine and output the concentration rating of the user in the corresponding time period based on that number. Further, the electronic device 100 may calculate the value of the user's overall concentration for the whole reading process based on the duration of each time period and the concentration rating of the user in the corresponding time period, and determine the user's concentration rating for the whole reading process.
In an embodiment of the present application, the entering of the statistical phase by the electronic device 100 may be triggered by an instruction (e.g., a voice instruction) issued by a user. For example, the electronic device 100 may enter the statistics phase after receiving a voice command that the user utters "i have completed this reading".
In some embodiments, the electronic device 100 may also automatically enter the statistics phase after detecting that the content playing of the last page of the sketch is finished.
The specific process of the picture book reading method in the statistical phase is described in detail below.
Fig. 4 exemplarily shows a specific flow of the picture book reading method provided in the embodiment of the present application in the statistical phase. The specific steps of the method in the statistical phase are described in detail below:
S401, the electronic device 100 segments the total duration of the user's current reading process.
Specifically, after the reading phase is finished, the electronic device 100 may segment the total duration of the user's current reading process according to a certain segmentation rule. For example, the electronic device 100 may divide the total duration into a plurality of time periods; if a segmentation time point falls within a time span in which an abnormal behavior occurs, the corresponding time period may be extended so that it fully contains that abnormal behavior, and the subsequent time periods are postponed in sequence accordingly.
For example, the electronic device 100 may segment the total duration of the user's current reading process in units of 1 minute. Taking table 1 as an example, the total duration of the user's current reading process is 10 minutes; if segmentation is performed in units of 1 minute, the duration can be divided into the following 10 time periods: 0:00-1:00, 1:00-2:00, 2:00-3:00, 3:00-4:00, 4:00-5:00, 5:00-6:00, 6:00-7:00, 7:00-8:00, 8:00-9:00, 9:00-10:00. However, the segmentation time point 4:00 falls within a time span in which an abnormal behavior occurs, so the time period 3:00-4:00 is extended to 3:00-5:00. It is easy to understand that the electronic device 100 may eventually divide the total duration of the user's current reading process in table 1 into the following 9 time periods: 0:00-1:00, 1:00-2:00, 2:00-3:00, 3:00-5:00, 5:00-6:00, 6:00-7:00, 7:00-8:00, 8:00-9:00, 9:00-10:00.
It should be noted that, the electronic device 100 may also use other segmentation rules for segmenting the total duration of one-time reading process of the user, which is not limited in this embodiment of the present application.
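As a rough illustration of the segmentation rule in S401 (1-minute slices, with a slice boundary that falls inside an abnormal-behavior span causing the slice to be extended), the following Python sketch is one possible reading; the record format of the abnormal-behavior spans, and the example span from 3:30 to 4:30, are assumptions made only for this example.

# Illustrative sketch of the S401 segmentation rule: cut the total duration
# into 1-minute periods, but if a cut point falls strictly inside the time
# span of an abnormal behavior, extend the current period past that span.
def segment(total_seconds, abnormal_spans, unit=60):
    """abnormal_spans: list of (start, end) tuples in seconds (assumed format)."""
    periods = []
    start = 0
    while start < total_seconds:
        end = min(start + unit, total_seconds)
        while True:
            new_end = end
            for s, e in abnormal_spans:
                if s < new_end < e:
                    new_end = -(-e // unit) * unit  # round span end up to a unit boundary
            new_end = min(new_end, total_seconds)
            if new_end == end:
                break
            end = new_end
        periods.append((start, end))
        start = end
    return periods

# Example matching the 10-minute reading above: an assumed "no page turning"
# span from 3:30 to 4:30 makes the 3:00-4:00 period extend to 3:00-5:00,
# yielding 9 periods in total.
print(segment(600, [(210, 270)]))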
S402, the electronic device 100 records the number of times the user exhibits abnormal behavior in each time period, and rates the user's concentration in each time period based on the recorded number of times of abnormal behavior in each time period.
Specifically, after the electronic device 100 segments the total duration of one reading process of the user, the electronic device 100 may record the number of times of the abnormal behavior of the user in each time period. The number of times that the user has abnormal behavior in a single time period and the concentration rating have a certain mapping relationship, the mapping relationship may be preset, and after the number of times that the user has abnormal behavior in a single time period is recorded, the electronic device 100 may obtain the corresponding concentration rating based on the mapping relationship.
Referring to table 5 below, table 5 exemplarily shows a mapping relationship between the number of times the user exhibits abnormal behavior in a single time period and the concentration rating. As can be seen from table 5, the electronic device 100 classifies the number of abnormal behaviors of the user in a single time period into 4 cases: 0 times, 1 time, 2 times and 3 times. The electronic device 100 divides the concentration into 4 levels, which are, from high to low: concentrated, relatively concentrated, not very concentrated, and not concentrated. When the number of abnormal behaviors of the user in a single time period is 0, the concentration rating is concentrated; when it is 1, the concentration rating is relatively concentrated; when it is 2, the concentration rating is not very concentrated; when it is 3, the concentration rating is not concentrated. It is easy to see that the more abnormal behaviors occur, the lower the concentration rating, and the fewer abnormal behaviors occur, the higher the concentration rating.
It should be noted that the mapping relationship between the number of times of the user having abnormal behavior in a single time period and the concentration rating shown in table 5 is only exemplary and should not be construed as a limitation of the present application.
In some embodiments, the mapping between the number of times a user has abnormal behavior in a single time period and the concentration rating may be other. The concentration degree can also be divided into more levels or less levels, and the frequency of abnormal behaviors of the user in a single time period corresponding to each concentration degree rating can also be other single numerical values or intervals.
In other embodiments, in determining the user's concentration rating for each time period, the concentration rating may also be more finely differentiated in combination with the degree of abnormality to which the user has exhibited abnormal behavior. The abnormal degree of the abnormal behavior of the user can be determined based on the time length of the user who has not turned pages for a long time, the number of pages skipped, and the like.
Number of times of abnormal behavior of the user in a single time period    Concentration rating
0    Concentrated
1    Relatively concentrated
2    Not very concentrated
3    Not concentrated
TABLE 5
The electronic device 100 may obtain a concentration rating of the user in each time period based on the recorded number of times the user has abnormal behavior in each time period.
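The mapping in table 5 can be represented directly as a lookup; the short Python sketch below is one minimal reading of it (the English level names follow the translation used above, and the clamping of counts above 3 to the lowest level is an assumption, since table 5 only lists 0 to 3 times).

# Minimal sketch of the table 5 mapping from the number of abnormal
# behaviors in a single time period to a concentration rating.
RATING_BY_COUNT = {
    0: "concentrated",
    1: "relatively concentrated",
    2: "not very concentrated",
    3: "not concentrated",
}

def rate_period(abnormal_count):
    # Counts above 3 fall through to the lowest rating (assumed behavior).
    return RATING_BY_COUNT[min(abnormal_count, 3)]

print(rate_period(0))  # concentrated
print(rate_period(2))  # not very concentrated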
Further, the electronic device 100 may also compare the concentration ratings of the user in two adjacent time periods (i.e., the current time period and the time period immediately preceding it). If the concentration ratings of the user in both of these time periods are not "concentrated" (i.e., each is relatively concentrated, not very concentrated, or not concentrated), the concentration rating of the current time period may be adjusted, for example lowered by one level relative to the rating it would otherwise have, to obtain the adjusted concentration rating of the current time period. In this way, when it is determined that the user may have been inattentive for a longer stretch of time, the concentration rating of that time period is adjusted downward.
Referring to table 6 below, table 6 exemplarily shows the concentration rating of the time period preceding the current time period, the concentration rating the current time period would otherwise have, and the adjusted concentration rating of the current time period.
As can be seen from table 6, when the concentration rating of time period 1 is relatively concentrated or not very concentrated, and the rating time period 2 would otherwise have is not very concentrated, the concentration ratings of the user in both time periods are not "concentrated", so the rating of time period 2 needs to be adjusted to obtain the adjusted concentration rating of time period 2. The adjusted rating of time period 2 is obtained by lowering the rating it would otherwise have by one level, i.e., from not very concentrated to not concentrated; the adjusted concentration rating of time period 2 is therefore not concentrated.
In some embodiments, when the concentration rating of time period 1 is relatively concentrated or not very concentrated and the rating time period 2 would otherwise have is not concentrated, that rating is already the lowest level of table 5, so it does not need to be adjusted; that is, the rating time period 2 would otherwise have is the same as the adjusted rating of time period 2.
In the other 3 cases in table 6, the concentration ratings of the user in the two time periods are not both non-concentrated, so the rating time period 2 would otherwise have does not need to be adjusted and is the same as the adjusted rating of time period 2.
(Table 6 is reproduced as an image in the original publication; it lists, for each case, the concentration rating of time period 1, the concentration rating time period 2 would otherwise have, and the adjusted concentration rating of time period 2.)
TABLE 6
Based on the information in tables 5 and 6, it is easy to understand the following: suppose the electronic device 100 divides the total duration of the user's current reading process into M time periods, where M is a positive integer. If the concentration rating of the user in the m-th time period is not the highest level (e.g., it is relatively concentrated or not very concentrated), and the electronic device 100 determines, based on the number of abnormal behaviors in the (m+1)-th time period, that the concentration rating the user would otherwise have in the (m+1)-th time period is rating A, where rating A is neither the highest nor the lowest level (e.g., relatively concentrated or not very concentrated), then the electronic device 100 may further adjust the concentration rating of the user in the (m+1)-th time period from rating A to rating B, where rating B is one level lower than rating A (e.g., if rating A is relatively concentrated, rating B is not very concentrated), and m is a positive integer less than M.
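A possible sketch of the downward adjustment described above (lower the current period's rating by one level when two adjacent periods are both rated below "concentrated", never going below the lowest level), using the level names of table 5:

# Sketch of the adjacent-period adjustment rule.
LEVELS = ["concentrated", "relatively concentrated",
          "not very concentrated", "not concentrated"]

def adjust(previous_rating, current_rating):
    if previous_rating != "concentrated" and current_rating != "concentrated":
        idx = LEVELS.index(current_rating)
        return LEVELS[min(idx + 1, len(LEVELS) - 1)]  # lower by one level, clamped
    return current_rating

print(adjust("relatively concentrated", "relatively concentrated"))
# -> not very concentrated (both periods below "concentrated")
print(adjust("concentrated", "relatively concentrated"))
# -> relatively concentrated (no adjustment needed)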
Based on the information in table 1 and table 3, the electronic device 100 may obtain the analysis result exemplarily shown in table 7 below by performing step S401 and step S402.
Referring to table 7, it can be seen from table 7 that in time periods 0:00-1:00, 6:00-7:00, 7:00-8:00, 8:00-9:00 and 9:00-10:00, the user exhibited no abnormal behavior, so the concentration rating of the user in each of these time periods is "concentrated".
In time period 1:00-2:00, the user exhibited the "page jump" behavior, and the concentration rating of the user in this time period is "relatively concentrated".
In time period 2:00-3:00, the user exhibited the "no response to a question" behavior, and the concentration rating of the user in this time period is "not very concentrated".
In time period 3:00-5:00, the user exhibited the "no page turning for a long time" behavior, and the concentration rating of the user in this time period is "not very concentrated".
In time period 5:00-6:00, the user exhibited the "page jump" behavior, and the concentration rating of the user in this time period is "not concentrated".
(Table 7 is reproduced as an image in the original publication; it lists, for each of the 9 time periods, the number of abnormal behaviors of the user and the corresponding concentration rating.)
TABLE 7
In some embodiments, the concentration rating of each time period in table 7 may also be the rating each time period would otherwise have; that is, the electronic device 100 may not adjust the ratings at all and may directly use the rating each time period would otherwise have as the final concentration rating of that time period.
Further, based on the information in table 7, a schematic diagram of the trend of the user's concentration in one reading process, which is exemplarily shown in fig. 5, can be obtained.
As can be seen from fig. 5, the user's concentration during the current reading first decreases and then increases: in the 0-6 minute range, the concentration rating of the user is successively "concentrated" (0-1 minute), "relatively concentrated" (1-2 minutes), "not very concentrated" (2-5 minutes) and "not concentrated" (5-6 minutes), i.e., the concentration shows a decreasing trend; in the 6-10 minute range, the concentration rating of the user is "concentrated" throughout (6-10 minutes), i.e., compared with before the 6th minute, the concentration shows an increasing trend.
Since fig. 5 is labeled with the abnormal behaviors of the user during reading, it can also be seen from fig. 5 in which time periods the abnormal behaviors occurred. As shown in fig. 5, the user exhibited the "page jump" behavior in time periods 1:00-2:00 and 5:00-6:00, the "no response to a question" behavior in time period 2:00-3:00, and the "no page turning for a long time" behavior in time period 3:00-5:00.
S403, the electronic device 100 calculates the value of the concentration degree of the user reading this time based on the time length and the concentration degree rating of each time period.
Specifically, the electronic device 100 may calculate the value of the concentration of the user reading this time based on the duration of each time period and the concentration rating in various ways. A specific calculation process of calculating the value of the concentration degree of the user reading this time based on the time length of each time period and the concentration degree rating by the electronic device 100 is described below by taking a linear weighting manner as an example:
in one possible implementation, the value V of the concentration of the user's current reading may be calculated by using the following formula 1 (the formula is reproduced as an image in the original publication; it linearly combines the weighted terms described below):
Wherein T1, T2, T3 and T4 are, in order, the durations for which the user's concentration is rated concentrated, relatively concentrated, not very concentrated and not concentrated; C3 and C4 are, in order, the numbers of not-very-concentrated and not-concentrated time periods; T is the total duration of the current reading (i.e., the time interval between the start time and the end time of the reading process); Tp is the total duration of pauses during the reading process (i.e., the sum of the time intervals between the user pausing the picture book and the user returning to continue reading it); Cp is the number of pauses during the reading process; P is the total number of pages turned (skipped pages are not counted); Pf is the total number of pages actually read by the user (i.e., the total number of pages turned minus the number of pages turned without being read out).
In the embodiment of the present application, the electronic device 100 may calculate the duration of each pause of the user during one reading process as follows: when the electronic device 100 receives an instruction (e.g., a voice instruction) from the user to pause reading, it records the time t1; when it subsequently receives an instruction (e.g., a voice instruction) from the user to resume reading, it records the time t2 and calculates the difference between t2 and t1, which is the duration of that pause.
The values of the parameters a1, a2, a3, a4, ap, af and thrp in the above formula 1 may be preset, or may be set autonomously by the user according to the user's own criteria. For example, the concentrated duration ratio T1/T and the relatively concentrated duration ratio T2/T may be given forward weights a1 = 1 and a2 = 0, respectively; the not-very-concentrated duration ratio T3/T, the not-concentrated duration ratio T4/T and the pause duration ratio Tp/T may be given negative weights a3 = -2, a4 = -5 and ap = -1, respectively; the ratio of pages read by the user Pf/P may be given a positive weight af = 2, and thrp may be set to 0.9, so that if Pf/P exceeds 0.9, the corresponding page-reading term in formula 1 takes a positive value.
The Penalty value is determined based on the numbers of not-very-concentrated and not-concentrated time periods, and may be calculated, for example, by the following formula 2 (also reproduced as an image in the original publication). As can be seen from formula 2, the larger the number of not-very-concentrated time periods C3 and the number of not-concentrated time periods C4, the closer the Penalty value is to 0. In formula 1, the Penalty value may be multiplied with one of the weighted terms (the term concerned is shown only in the image of formula 1).
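Formulas 1 and 2 are reproduced only as images in the original publication, so their exact expressions are not recoverable here. Purely as an illustration of the linear-weighting idea described above, the following Python sketch combines the named terms with the example weights; the placement of the Penalty factor, the form of the Penalty function and the page-reading term are assumptions, and because of those assumptions the sketch does not necessarily reproduce the value of about -0.7 computed later in the text.

# Illustrative reconstruction of the linear weighting idea behind formula 1.
# ASSUMPTIONS (the original formulas are published only as images):
#   - Penalty multiplies the concentrated-duration term;
#   - Penalty = 1 / (1 + C3 + C4), which approaches 0 as C3 and C4 grow;
#   - the page-reading term is af * (Pf/P - thr_p), positive when Pf/P > thr_p;
#   - Cp (number of pauses) appears in formula 1, but how it enters is not
#     recoverable from the text, so it is accepted but unused here.
def concentration_value(T1, T2, T3, T4, T, Tp, Cp, C3, C4, P, Pf,
                        a1=1, a2=0, a3=-2, a4=-5, ap=-1, af=2, thr_p=0.9):
    penalty = 1.0 / (1 + C3 + C4)          # assumed penalty form
    return (penalty * a1 * (T1 / T)        # concentrated duration, penalized
            + a2 * (T2 / T)                # relatively concentrated duration
            + a3 * (T3 / T)                # not very concentrated duration
            + a4 * (T4 / T)                # not concentrated duration
            + ap * (Tp / T)                # pause duration
            + af * (Pf / P - thr_p))       # pages actually read, vs threshold

# With T1 = T and everything else ideal, this sketch gives the stated
# maximum of about 1.2 (1 * 1 + 2 * (1 - 0.9)).
print(round(concentration_value(T1=10, T2=0, T3=0, T4=0, T=10,
                                Tp=0, Cp=0, C3=0, C4=0, P=10, Pf=10), 2))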
Referring to table 8, table 8 illustrates a mapping relationship between the value of concentration V and the concentration rating.
As can be seen from table 8, if the value V of the concentration is greater than 0.5, it indicates that the user was concentrated in the current reading; if V is between -0.1 and 0.5, the user was relatively concentrated; if V is between -3 and -0.1, the user was not very concentrated; if V is less than -3, the user was not concentrated in the current reading.
Value of concentration V    Concentration rating
V > 0.5    Concentrated
-0.1 ≤ V ≤ 0.5    Relatively concentrated
-3 ≤ V < -0.1    Not very concentrated
V < -3    Not concentrated
TABLE 8
It should be noted that the mapping relationship between the concentration value V and the concentration rating shown in table 8 is merely exemplary, and should not be construed as limiting the present application. In some embodiments, the three thresholds 0.5, -0.1, -3 of the value V of concentration in table 8 may also be set to other values for distinguishing different concentration ratings.
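The threshold mapping in table 8 can likewise be written as a simple comparison chain; the minimal sketch below uses the example thresholds 0.5, -0.1 and -3 from table 8.

# Sketch of the table 8 mapping from the concentration value V to a rating.
def rate_reading(v):
    if v > 0.5:
        return "concentrated"
    if v >= -0.1:              # -0.1 <= V <= 0.5
        return "relatively concentrated"
    if v >= -3:                # -3 <= V < -0.1
        return "not very concentrated"
    return "not concentrated"  # V < -3

# The worked example below computes a value of about -0.7, which falls in
# the -3 <= V < -0.1 band, i.e. "not very concentrated".
print(rate_reading(-0.7))  # not very concentrated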
Based on the information in tables 1 to 8, the electronic device 100 may calculate the value of the concentration degree of the user reading this time by using the above formula 1 and formula 2.
Specifically, it can be found from tables 1, 3 and 7 that: T1 is 5 minutes (i.e., the sum of the durations of the time periods whose concentration rating in table 7 is "concentrated"); T2 is 1 minute (i.e., the sum of the durations of the time periods whose concentration rating in table 7 is "relatively concentrated"); T3 is 3 minutes (i.e., the sum of the durations of the time periods whose concentration rating in table 7 is "not very concentrated"); T4 is 1 minute (i.e., the sum of the durations of the time periods whose concentration rating in table 7 is "not concentrated"); T is 10 minutes (i.e., the time interval from the start time corresponding to the first page-turning record to the end time corresponding to the last page-turning record in table 2); P is 10 pages (i.e., the total number of rows in the "page number" column of table 2); Pf is 9 pages (i.e., the total number of rows in the "page number" column minus the number of rows whose page read-out proportion is less than 1); C3 is 2 (i.e., the number of time periods whose concentration rating in table 7 is "not very concentrated"); C4 is 1 (i.e., the number of time periods whose concentration rating in table 7 is "not concentrated"). In this calculation, it is assumed that Tp is 0 minutes and Cp is 0.
Further, based on the determined specific values of the parameters in the above equation 1 and the above equation 2, the electronic device 100 may calculate the value of the concentration degree of the user reading this time to be about-0.7 by using the above equation 1 and the above equation 2.
Further, based on the mapping relationship between the value of concentration V and the concentration rating in table 8, the electronic device 100 may determine that the user's concentration rating for the current reading is "not very concentrated".
When a1 = 1, a2 = 0, a3 = -2, a4 = -5, ap = -1, af = 2 and thrp = 0.9, the maximum value of the concentration of the user in one reading calculated by the electronic device 100 using the above formula 1 and formula 2 is about 1.2, and the minimum value is about -5.
The maximum value of the concentration is obtained when the user exhibits no abnormal behavior during the whole reading process, i.e., the user's concentration is rated "concentrated" throughout the reading (T1 = T, T2 = T3 = T4 = 0), the user never pauses (Tp = 0, Cp = 0), and there is no page that is not read out (Pf/P = 1).
The minimum value of the concentration is obtained under the following conditions: the user's concentration is rated "not concentrated" throughout the reading process (i.e., T4 = T, T1 = T2 = T3 = 0), the ratio of the total pause duration to the total reading duration Tp/T and the proportion of pages read by the user Pf/P take their limiting values, and the number of pauses Cp is greater than or equal to 3 (the specific boundary values of Tp/T and Pf/P appear only in the images of the original publication).
It is easy to see that the higher the user's concentration rating for each time period, and/or the longer the duration corresponding to the time period with the higher concentration rating, the higher the user's concentration rating throughout the reading of the book.
It should be noted that the electronic device 100 may also use other weighting methods to calculate the value of the concentration degree of the user in the current reading, which is not limited in this embodiment of the application.
By performing the method shown in fig. 4, the electronic device 100 may output the time-interval result (e.g., the analysis result shown in table 7) of the statistical phase and the final calculation result (i.e., the value of the concentration degree and the concentration degree rating corresponding to the value of the concentration degree). The time-sharing result and the final calculation result can be used for evaluating the reading condition of the child user in the reading process for the reference of parents. In addition, the attentiveness trend graph labeled with the abnormal behavior (e.g., the attention trend graph of the user in one reading process shown in fig. 5) output by the electronic device 100 may be used as the detail of the reading condition of the child user for the parent to view.
In some embodiments, the electronic device 100 may determine whether the user is interested in the type of sketch or content based on the calculated value of the concentration and the concentration rating corresponding to the value of the concentration, e.g., a higher concentration of the user in reading may indicate a higher interest of the user in the type of sketch or content. If the electronic device 100 determines that the user is interested in this type of script or content, the electronic device 100 may output some scripts of the same type for recommendation to the user.
In other embodiments, the parameters in equation 1 and equation 2 may be set autonomously. In this way, parents of child users can be given certain control. For example, the specific meaning and importance degree of each parameter in the above formula 1 and the above formula 2 may be presented to the parents to facilitate the parents to understand, and if the parents consider that a certain parameter in the above formula 1 and/or formula 2 is not important, the parents may delete the parameter or autonomously adjust the value of the parameter.
The electronic device 100 in the embodiment of the application is an electronic device with a picture book reading capability, which may also be referred to as a picture book reading intelligent terminal device. The electronic device 100 may be a picture book reading robot, a mobile phone with a picture book reading function, a tablet with a picture book reading function, or the like. The embodiment of the present application does not limit the type, physical form, or size of the electronic device 100.
In some embodiments, in the case that the electronic device 100 is a book-drawing reading robot, the electronic device 100 may establish a communication connection (e.g., a bluetooth communication connection, etc.) with an electronic device 200 (e.g., a mobile phone) with a display screen in advance, the electronic device 100 may send an analysis result (e.g., the aforementioned time-sharing result, a final calculation result, a concentration degree change trend diagram labeled with abnormal behavior, etc.) output by the aforementioned statistical stage to the electronic device 200, and the electronic device 200 may store the result, and the result may be displayed on the display screen of the electronic device 200 for a parent of a child user to view, so as to know a reading condition of the child user during reading. Alternatively, the parameters in the above equations 1 and 2 may be set autonomously by the parents on the electronic device 200.
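As a sketch only, the kind of result payload that could be sent from the electronic device 100 to the electronic device 200 might look like the following; the field names and the JSON serialization are assumptions made for this example, since the embodiment only requires that the statistical results be transferred over the established connection (e.g., Bluetooth) and displayed on the electronic device 200.

# Illustrative sketch of packaging the statistical results for transfer to a
# companion device with a display (field names and JSON format are assumptions).
import json

def build_report(period_ratings, concentration_value, overall_rating, abnormal_events):
    return json.dumps({
        "period_ratings": period_ratings,            # per-time-period ratings (table 7 style)
        "concentration_value": concentration_value,  # value V from formula 1
        "overall_rating": overall_rating,            # rating from table 8
        "abnormal_events": abnormal_events,          # for the labeled trend chart (fig. 5)
    }, ensure_ascii=False)

report = build_report(
    period_ratings=[("0:00-1:00", "concentrated"), ("1:00-2:00", "relatively concentrated")],
    concentration_value=-0.7,
    overall_rating="not very concentrated",
    abnormal_events=[("1:00-2:00", "page jump")],
)
print(report)  # this payload would then be sent over the established connection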
The following describes a structure of an electronic device 100 according to an embodiment of the present application.
Fig. 6 schematically illustrates a structure of an electronic device 100 provided in an embodiment of the present application.
As shown in fig. 6, the electronic device 100 may include: processor 601, memory 602, speaker 603, microphone 604, bus 605, camera 606, power supply 607. These components may be connected by a bus 605. Wherein:
the processor 601 may be used to read and execute computer readable instructions and may include one or more processing cores; the processor 601 executes software programs and modules so as to perform various functional applications and information processing. In a specific implementation, the processor 601 may mainly include a controller, an arithmetic unit and registers. The controller is mainly responsible for instruction decoding and for sending out control signals for the operations corresponding to the instructions. The arithmetic unit is mainly responsible for executing fixed-point or floating-point arithmetic operations, shift operations, logic operations and the like, and may also execute address operations and conversions. The registers are mainly responsible for temporarily storing register operands, intermediate operation results and the like during instruction execution. In a specific implementation, the hardware architecture of the processor 601 may be an Application Specific Integrated Circuit (ASIC) architecture, an MIPS architecture, an ARM architecture, or an NP architecture.
The memory 602 is coupled to the processor 601 by a bus 605. The memory 602 may be used to store various software programs and/or sets of program instructions. The processor 601 is configured to execute at least one program instruction to implement the technical solutions of the above embodiments. The implementation principle and technical effect are similar to those of the embodiments related to the method, and are not described herein again.
A speaker 603 (which may also be referred to as an audio playing device) is coupled to the processor 601 via the bus 605 and may be used to convert audio electrical signals into sound signals. In this embodiment, the electronic device 100 may play the audio corresponding to the text content of the picture book through the speaker 603.
A microphone 604 is coupled to the processor 601 via the bus 605 and is operable to convert acoustic signals into electrical signals. In the embodiment of the present application, the electronic device 100 may receive a sound signal emitted by a user through the microphone 604, so as to perform voice interaction with the user. Optionally, the microphone 604 may not be included in the structure of the electronic device 100 when the electronic device 100 does not need to implement the function of voice interaction with the user.
A camera 606 is coupled to the processor 601 via the bus 605 and may be used to capture still images or video. In the embodiment of the present application, the camera 606 may be used to capture an image of the cover page of the picture book and/or an image of a content page of the picture book, and further, the electronic device 100 may perform picture book recognition by using an image captured by the camera 606 (for example, a camera with a resolution of 3456 × 3456).
The power supply 607 may be used to power internal components of the processor 601, memory 602, speaker 603, microphone 604, camera 606, etc.
Optionally, the structure of the electronic device 100 may further include a steering engine, and the steering engine may be used to adjust the posture of the electronic device 100 so that the camera 606 of the electronic device 100 can capture the picture book. For example, in the case that the electronic device 100 is a picture book reading robot, when the robot executes a picture book reading task, the steering engine may control the robot to lower its head (the camera may be mounted on the head, for example, on the forehead), so that the robot can photograph the picture book placed in front of its body at a certain distance (for example, about 30 centimeters).
The electronic device 100 may also include a display (not shown) within its structure that may be used to output graphical image information. In this embodiment of the application, the display screen may display the concentration analysis result output by the electronic device 100 during one reading process of the user, for example, the analysis result shown in table 7, the calculated value of the concentration and the concentration rating corresponding to the value of the concentration, the concentration change trend diagram labeled with the abnormal behavior, and the like.
The electronic device 100 may further include a communication module (not shown in the figure), which may be used for the electronic device 100 to establish a communication connection with other electronic devices and communicate with other electronic devices through the communication connection.
The electronic device 100 may further include a timer (not shown in the figure) configured to record the page turning time, the question asking time, the end time of the playing content, and the like of the user by the electronic device 100.
In the embodiments of the present application, the processor may be a general-purpose processor, a digital signal processor, an application specific integrated circuit, a field programmable gate array or other programmable logic device, a discrete gate or transistor logic device, or a discrete hardware component, and may implement or execute the methods, steps, and logic blocks disclosed in the embodiments of the present application. A general purpose processor may be a microprocessor or any conventional processor or the like. The steps of a method disclosed in connection with the embodiments of the present application may be directly implemented by a hardware processor, or may be implemented by a combination of hardware and software modules in a processor.
In the embodiment of the present application, the memory may be a nonvolatile memory, such as a hard disk drive (HDD) or a solid-state drive (SSD), or may be a volatile memory, such as a random-access memory (RAM). The memory may be, without limitation, any other medium that can be used to carry or store desired program code in the form of instructions or data structures and that can be accessed by a computer.
The memory in the embodiments of the present application may also be circuitry or any other device capable of performing a storage function for storing program instructions and/or data.
It should be understood that the electronic device 100 shown in fig. 6 is merely an example, and that the electronic device 100 may have more or fewer components than shown in fig. 6, may combine two or more components, or may have a different configuration of components. The various components shown in fig. 6 may be implemented in hardware, software, or a combination of hardware and software, including one or more signal processing and/or application specific integrated circuits.
The following describes another structure of the electronic device 100 according to the embodiment of the present application.
Fig. 7 exemplarily shows a structure of another electronic device 100 provided in the embodiment of the present application.
As shown in fig. 7, the electronic device 100 may include: the mobile terminal includes a processor 110, an external memory interface 120, an internal memory 121, a Universal Serial Bus (USB) interface 130, a charging management module 140, a power management module 141, a battery 142, an antenna 1, an antenna 2, a mobile communication module 150, a wireless communication module 160, an audio module 170, a speaker 170A, a receiver 170B, a microphone 170C, an earphone interface 170D, a sensor module 180, a button 190, a motor 191, an indicator 192, a camera 193, a display screen 194, a Subscriber Identity Module (SIM) card interface 195, and the like. The sensor module 180 may include a pressure sensor 180A, a gyroscope sensor 180B, an air pressure sensor 180C, a magnetic sensor 180D, an acceleration sensor 180E, a distance sensor 180F, a proximity light sensor 180G, a fingerprint sensor 180H, a temperature sensor 180J, a touch sensor 180K, an ambient light sensor 180L, a bone conduction sensor 180M, and the like.
It is to be understood that the illustrated structure of the embodiment of the present application does not specifically limit the electronic device 100. In other embodiments of the present application, electronic device 100 may include more or fewer components than shown, or some components may be combined, some components may be split, or a different arrangement of components. The illustrated components may be implemented in hardware, software, or a combination of software and hardware.
Processor 110 may include one or more processing units, such as: the processor 110 may include an Application Processor (AP), a modem processor, a Graphics Processor (GPU), an Image Signal Processor (ISP), a controller, a memory, a video codec, a Digital Signal Processor (DSP), a baseband processor, and/or a neural-Network Processing Unit (NPU), among others. Wherein, the different processing units may be independent devices or may be integrated in one or more processors.
The controller may be, among other things, a neural center and a command center of the electronic device 100. The controller can generate an operation control signal according to the instruction operation code and the timing signal to complete the control of instruction fetching and instruction execution.
A memory may also be provided in processor 110 for storing instructions and data. In some embodiments, the memory in the processor 110 is a cache memory. The memory may hold instructions or data that have just been used or recycled by the processor 110. If the processor 110 needs to reuse the instruction or data, it can be called directly from the memory. Avoiding repeated accesses reduces the latency of the processor 110, thereby increasing the efficiency of the system.
In some embodiments, processor 110 may include one or more interfaces. The interface may include an integrated circuit (I2C) interface, an integrated circuit built-in audio (I2S) interface, a Pulse Code Modulation (PCM) interface, a universal asynchronous receiver/transmitter (UART) interface, a Mobile Industry Processor Interface (MIPI), a general-purpose input/output (GPIO) interface, a Subscriber Identity Module (SIM) interface, and/or a Universal Serial Bus (USB) interface, etc.
The I2C interface is a bidirectional synchronous serial bus including a serial data line (SDA) and a Serial Clock Line (SCL). In some embodiments, processor 110 may include multiple sets of I2C buses. The processor 110 may be coupled to the touch sensor 180K, the charger, the flash, the camera 193, etc. through different I2C bus interfaces, respectively. For example: the processor 110 may be coupled to the touch sensor 180K through an I2C interface, so that the processor 110 and the touch sensor 180K communicate through an I2C bus interface to implement a touch function of the electronic device 100.
The I2S interface may be used for audio communication. In some embodiments, processor 110 may include multiple sets of I2S buses. The processor 110 may be coupled to the audio module 170 through an I2S bus to enable communication between the processor 110 and the audio module 170. In some embodiments, the audio module 170 may transmit an audio signal to the wireless communication module 160 through the I2S interface, so as to implement a function of answering a call through a bluetooth headset.
The PCM interface may also be used for audio communication, sampling, quantizing and encoding analog signals. In some embodiments, audio module 170 and wireless communication module 160 may be coupled by a PCM bus interface. In some embodiments, the audio module 170 may also transmit audio signals to the wireless communication module 160 through the PCM interface, so as to implement a function of answering a call through a bluetooth headset. Both the I2S interface and the PCM interface may be used for audio communication.
The UART interface is a universal serial data bus used for asynchronous communications. The bus may be a bidirectional communication bus. It converts the data to be transmitted between serial communication and parallel communication. In some embodiments, a UART interface is generally used to connect the processor 110 with the wireless communication module 160. For example: the processor 110 communicates with a bluetooth module in the wireless communication module 160 through a UART interface to implement a bluetooth function. In some embodiments, the audio module 170 may transmit the audio signal to the wireless communication module 160 through a UART interface, so as to implement the function of playing music through a bluetooth headset.
MIPI interfaces may be used to connect processor 110 with peripheral devices such as display screen 194, camera 193, and the like. The MIPI interface includes a Camera Serial Interface (CSI), a Display Serial Interface (DSI), and the like. In some embodiments, processor 110 and camera 193 communicate through a CSI interface to implement the capture functionality of electronic device 100. The processor 110 and the display screen 194 communicate through the DSI interface to implement the display function of the electronic device 100.
The GPIO interface may be configured by software. The GPIO interface may be configured as a control signal and may also be configured as a data signal. In some embodiments, a GPIO interface may be used to connect the processor 110 with the camera 193, the display 194, the wireless communication module 160, the audio module 170, the sensor module 180, and the like. The GPIO interface may also be configured as an I2C interface, I2S interface, UART interface, MIPI interface, and the like.
The USB interface 130 is an interface conforming to the USB standard specification, and may specifically be a Mini USB interface, a Micro USB interface, a USB Type-C interface, or the like. The USB interface 130 may be used to connect a charger to charge the electronic device 100, and may also be used to transmit data between the electronic device 100 and a peripheral device. It can also be used to connect an earphone and play audio through the earphone. The interface may also be used to connect other terminal devices, such as AR devices.
It should be understood that the connection relationship between the modules illustrated in the embodiment of the present application is only an exemplary illustration, and does not limit the structure of the electronic device 100. In other embodiments of the present application, the electronic device 100 may also adopt different interface connection manners or a combination of multiple interface connection manners in the above embodiments.
The charging management module 140 is configured to receive charging input from a charger. The charger may be a wireless charger or a wired charger. In some wired charging embodiments, the charging management module 140 may receive charging input from a wired charger via the USB interface 130. In some wireless charging embodiments, the charging management module 140 may receive a wireless charging input through a wireless charging coil of the electronic device 100. The charging management module 140 may also supply power to the electronic device 100 through the power management module 141 while charging the battery 142.
The power management module 141 is used to connect the battery 142, the charging management module 140 and the processor 110. The power management module 141 receives input from the battery 142 and/or the charge management module 140 and provides power to the processor 110, the internal memory 121, the external memory, the display 194, the camera 193, the wireless communication module 160, and the like. The power management module 141 may also be used to monitor parameters such as battery capacity, battery cycle count, battery state of health (leakage, impedance), etc. In some other embodiments, the power management module 141 may also be disposed in the processor 110. In other embodiments, the power management module 141 and the charging management module 140 may also be disposed in the same device.
The wireless communication function of the electronic device 100 may be implemented by the antenna 1, the antenna 2, the mobile communication module 150, the wireless communication module 160, a modem processor, a baseband processor, and the like.
The antennas 1 and 2 are used for transmitting and receiving electromagnetic wave signals. Each antenna in the electronic device 100 may be used to cover a single or multiple communication bands. Different antennas can also be multiplexed to improve the utilization of the antennas. For example: the antenna 1 may be multiplexed as a diversity antenna of a wireless local area network. In other embodiments, the antenna may be used in conjunction with a tuning switch.
The mobile communication module 150 may provide a solution including 2G/3G/4G/5G wireless communication applied to the electronic device 100. The mobile communication module 150 may include at least one filter, a switch, a power amplifier, a Low Noise Amplifier (LNA), and the like. The mobile communication module 150 may receive the electromagnetic wave from the antenna 1, filter, amplify, etc. the received electromagnetic wave, and transmit the electromagnetic wave to the modem processor for demodulation. The mobile communication module 150 may also amplify the signal modulated by the modem processor, and convert the signal into electromagnetic wave through the antenna 1 to radiate the electromagnetic wave. In some embodiments, at least some of the functional modules of the mobile communication module 150 may be disposed in the processor 110. In some embodiments, at least some of the functional modules of the mobile communication module 150 may be disposed in the same device as at least some of the modules of the processor 110.
The modem processor may include a modulator and a demodulator. The modulator is used for modulating a low-frequency baseband signal to be transmitted into a medium-high frequency signal. The demodulator is used for demodulating the received electromagnetic wave signal into a low-frequency baseband signal. The demodulator then passes the demodulated low frequency baseband signal to a baseband processor for processing. The low frequency baseband signal is processed by the baseband processor and then transferred to the application processor. The application processor outputs a sound signal through an audio device (not limited to the speaker 170A, the receiver 170B, etc.) or displays an image or video through the display screen 194. In some embodiments, the modem processor may be a stand-alone device. In other embodiments, the modem processor may be provided in the same device as the mobile communication module 150 or other functional modules, independent of the processor 110.
The wireless communication module 160 may provide solutions for wireless communication applied to the electronic device 100, including Wireless Local Area Networks (WLANs) (e.g., wireless fidelity (Wi-Fi) networks), bluetooth (bluetooth, BT), global Navigation Satellite System (GNSS), frequency Modulation (FM), near Field Communication (NFC), infrared (IR), and the like. The wireless communication module 160 may be one or more devices integrating at least one communication processing module. The wireless communication module 160 receives electromagnetic waves via the antenna 2, performs frequency modulation and filtering processing on electromagnetic wave signals, and transmits the processed signals to the processor 110. The wireless communication module 160 may also receive a signal to be transmitted from the processor 110, perform frequency modulation and amplification on the signal, and convert the signal into electromagnetic waves through the antenna 2 to radiate the electromagnetic waves.
In some embodiments, antenna 1 of electronic device 100 is coupled to mobile communication module 150 and antenna 2 is coupled to wireless communication module 160 so that electronic device 100 can communicate with networks and other devices through wireless communication techniques. The wireless communication technology may include global system for mobile communications (GSM), general Packet Radio Service (GPRS), code division multiple access (code division multiple access, CDMA), wideband Code Division Multiple Access (WCDMA), time-division code division multiple access (time-division code division multiple access, TD-SCDMA), long Term Evolution (LTE), BT, GNSS, WLAN, NFC, FM, and/or IR technologies, etc. The GNSS may include a Global Positioning System (GPS), a global navigation satellite system (GLONASS), a beidou navigation satellite system (BDS), a quasi-zenith satellite system (QZSS), and/or a Satellite Based Augmentation System (SBAS).
The electronic device 100 implements display functions via the GPU, the display screen 194, and the application processor. The GPU is a microprocessor for image processing, and is connected to the display screen 194 and an application processor. The GPU is used to perform mathematical and geometric calculations for graphics rendering. The processor 110 may include one or more GPUs that execute program instructions to generate or alter display information.
The display screen 194 is used to display images, video, and the like. The display screen 194 includes a display panel. The display panel may adopt a liquid crystal display (LCD), an organic light-emitting diode (OLED), an active-matrix organic light-emitting diode (AMOLED), a flexible light-emitting diode (FLED), a Mini-LED, a Micro-LED, a Micro-OLED, a quantum dot light-emitting diode (QLED), and the like. In some embodiments, the electronic device 100 may include 1 or N display screens 194, N being a positive integer greater than 1.
The electronic device 100 may implement a shooting function through the ISP, the camera 193, the video codec, the GPU, the display 194, the application processor, and the like.
The ISP is used to process the data fed back by the camera 193. For example, when a photo is taken, the shutter is opened, light is transmitted to the camera photosensitive element through the lens, the optical signal is converted into an electrical signal, and the camera photosensitive element transmits the electrical signal to the ISP for processing and converting into an image visible to naked eyes. The ISP can also carry out algorithm optimization on the noise, brightness and skin color of the image. The ISP can also optimize parameters such as exposure, color temperature and the like of a shooting scene. In some embodiments, the ISP may be provided in camera 193.
The camera 193 is used to capture still images or video. The object generates an optical image through the lens and projects the optical image to the photosensitive element. The photosensitive element may be a Charge Coupled Device (CCD) or a complementary metal-oxide-semiconductor (CMOS) phototransistor. The light sensing element converts the optical signal into an electrical signal, which is then passed to the ISP where it is converted into a digital image signal. And the ISP outputs the digital image signal to the DSP for processing. The DSP converts the digital image signal into image signal in standard RGB, YUV and other formats. In some embodiments, electronic device 100 may include 1 or N cameras 193, N being a positive integer greater than 1.
The digital signal processor is used for processing digital signals, and can process digital image signals and other digital signals. For example, when the electronic device 100 selects a frequency bin, the digital signal processor is used to perform fourier transform or the like on the frequency bin energy.
Video codecs are used to compress or decompress digital video. The electronic device 100 may support one or more video codecs. In this way, the electronic device 100 may play or record video in a variety of encoding formats, such as: moving Picture Experts Group (MPEG) 1, MPEG2, MPEG3, MPEG4, and the like.
The NPU is a neural-network (NN) computing processor that processes input information quickly by using a biological neural network structure, for example, by using a transfer mode between neurons of a human brain, and can also learn by itself continuously. Applications such as intelligent recognition of the electronic device 100 can be realized through the NPU, for example: image recognition, face recognition, speech recognition, text understanding, and the like.
The external memory interface 120 may be used to connect an external memory card, such as a Micro SD card, to extend the memory capability of the electronic device 100. The external memory card communicates with the processor 110 through the external memory interface 120 to implement a data storage function. For example, files such as music, video, etc. are saved in the external memory card.
The internal memory 121 may be used to store computer-executable program code, which includes instructions. The processor 110 executes various functional applications of the electronic device 100 and data processing by executing instructions stored in the internal memory 121. The internal memory 121 may include a program storage area and a data storage area. The storage program area may store an operating system, an application program (such as a sound playing function, an image playing function, etc.) required by at least one function, and the like. The storage data area may store data (such as audio data, phone book, etc.) created during use of the electronic device 100, and the like. In addition, the internal memory 121 may include a high-speed random access memory, and may further include a nonvolatile memory, such as at least one magnetic disk storage device, a flash memory device, a universal flash memory (UFS), and the like.
The electronic device 100 may implement audio functions via the audio module 170, the speaker 170A, the receiver 170B, the microphone 170C, the headphone interface 170D, and the application processor. Such as music playing, recording, etc.
The audio module 170 is used to convert digital audio information into an analog audio signal output and also to convert an analog audio input into a digital audio signal. The audio module 170 may also be used to encode and decode audio signals. In some embodiments, the audio module 170 may be disposed in the processor 110, or some functional modules of the audio module 170 may be disposed in the processor 110.
The speaker 170A, also called a "horn", is used to convert the audio electrical signal into an acoustic signal. The electronic apparatus 100 can listen to music through the speaker 170A or listen to a handsfree call.
The receiver 170B, also called "earpiece", is used to convert the electrical audio signal into an acoustic signal. When the electronic apparatus 100 receives a call or voice information, it can receive voice by placing the receiver 170B close to the ear of the person.
The microphone 170C, also referred to as a "microphone," is used to convert sound signals into electrical signals. When making a call or transmitting voice information, the user can input a voice signal to the microphone 170C by speaking near the microphone 170C through the mouth. The electronic device 100 may be provided with at least one microphone 170C. In other embodiments, the electronic device 100 may be provided with two microphones 170C to achieve a noise reduction function in addition to collecting sound signals. In other embodiments, the electronic device 100 may further include three, four or more microphones 170C to collect sound signals, reduce noise, identify sound sources, perform directional recording, and so on.
The earphone interface 170D is used to connect a wired earphone. The headset interface 170D may be the USB interface 130, or may be an Open Mobile Terminal Platform (OMTP) standard interface of 3.5mm, or a cellular telecommunications industry association (cellular telecommunications industry association of the USA, CTIA) standard interface.
The pressure sensor 180A is used for sensing a pressure signal, and can convert the pressure signal into an electrical signal. In some embodiments, the pressure sensor 180A may be disposed on the display screen 194. The pressure sensor 180A can be of a wide variety, such as a resistive pressure sensor, an inductive pressure sensor, a capacitive pressure sensor, and the like. The capacitive pressure sensor may be a sensor comprising at least two parallel plates having an electrically conductive material. When a force acts on the pressure sensor 180A, the capacitance between the electrodes changes. The electronic device 100 determines the strength of the pressure from the change in capacitance. When a touch operation is applied to the display screen 194, the electronic apparatus 100 detects the intensity of the touch operation according to the pressure sensor 180A. The electronic apparatus 100 may also calculate the touched position from the detection signal of the pressure sensor 180A. In some embodiments, the touch operations that are applied to the same touch position but have different touch operation intensities may correspond to different operation instructions. For example: and when the touch operation with the touch operation intensity smaller than the first pressure threshold value acts on the short message application icon, executing an instruction for viewing the short message. And when the touch operation with the touch operation intensity larger than or equal to the first pressure threshold value acts on the short message application icon, executing an instruction of newly building the short message.
The gyroscope sensor 180B may be used to determine the motion posture of the electronic device 100. In some embodiments, the angular velocity of the electronic device 100 about three axes (that is, the x, y, and z axes) may be determined by the gyroscope sensor 180B. The gyroscope sensor 180B may be used for image stabilization during photographing. For example, when the shutter is pressed, the gyroscope sensor 180B detects the shake angle of the electronic device 100, calculates, based on the shake angle, the distance that the lens module needs to compensate for, and allows the lens to counteract the shake of the electronic device 100 through reverse movement, thereby implementing image stabilization. The gyroscope sensor 180B may also be used for navigation and motion-sensing game scenarios.
The barometric pressure sensor 180C is used to measure barometric pressure. In some embodiments, the electronic device 100 calculates an altitude based on the barometric pressure measured by the barometric pressure sensor 180C, to assist in positioning and navigation.
The magnetic sensor 180D includes a Hall effect sensor. The electronic device 100 may detect the opening and closing of a flip holster by using the magnetic sensor 180D. In some embodiments, when the electronic device 100 is a flip phone, the electronic device 100 may detect the opening and closing of the flip cover based on the magnetic sensor 180D. Features such as automatic unlocking upon flipping open can then be set based on the detected open or closed state of the holster or the flip cover.
The acceleration sensor 180E may detect the magnitude of the acceleration of the electronic device 100 in various directions (generally along three axes). The magnitude and direction of gravity may be detected when the electronic device 100 is stationary. The acceleration sensor 180E can also be used to identify the posture of the electronic device 100, and is applied to landscape/portrait switching, pedometers, and other applications.
The distance sensor 180F is used to measure a distance. The electronic device 100 may measure a distance by means of infrared or laser. In some embodiments, in a shooting scenario, the electronic device 100 may use the distance sensor 180F to measure a distance to implement fast focusing.
The proximity light sensor 180G may include, for example, a light emitting diode (LED) and a light detector such as a photodiode. The light emitting diode may be an infrared light emitting diode. The electronic device 100 emits infrared light outward through the light emitting diode, and detects infrared light reflected from a nearby object by using the photodiode. When sufficient reflected light is detected, the electronic device 100 can determine that there is an object nearby; when insufficient reflected light is detected, the electronic device 100 can determine that there is no object nearby. The electronic device 100 can use the proximity light sensor 180G to detect that the user is holding the electronic device 100 close to the ear for a call, so as to automatically turn off the screen and save power. The proximity light sensor 180G can also be used in holster mode and pocket mode to automatically unlock and lock the screen.
The ambient light sensor 180L is used to sense the ambient light level. The electronic device 100 may adaptively adjust the brightness of the display screen 194 based on the sensed ambient light level. The ambient light sensor 180L may also be used to automatically adjust the white balance during photographing. The ambient light sensor 180L may further cooperate with the proximity light sensor 180G to detect whether the electronic device 100 is in a pocket, so as to prevent accidental touches.
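As an illustrative sketch only, adaptive brightness adjustment based on the sensed ambient light level could be expressed as a simple mapping from illuminance to a brightness level; the lux breakpoints and the 0-255 brightness scale below are assumptions, not values from this application.

```python
# Illustrative sketch: map sensed ambient light (lux) to a display brightness level.
# The breakpoints and the 0-255 brightness scale are assumed for illustration.

def adaptive_brightness(ambient_lux: float) -> int:
    """Return a display brightness level (0-255) for the sensed ambient light."""
    if ambient_lux < 10:       # dark room
        return 30
    if ambient_lux < 200:      # typical indoor lighting
        return 120
    if ambient_lux < 1000:     # bright indoor / overcast outdoor
        return 200
    return 255                 # direct sunlight
```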
The fingerprint sensor 180H is used to collect a fingerprint. The electronic device 100 may use the collected fingerprint characteristics to implement fingerprint-based unlocking, application-lock access, fingerprint-based photographing, fingerprint-based call answering, and so on.
The temperature sensor 180J is used to detect temperature. In some embodiments, the electronic device 100 executes a temperature processing strategy based on the temperature detected by the temperature sensor 180J. For example, when the temperature reported by the temperature sensor 180J exceeds a threshold, the electronic device 100 reduces the performance of a processor located near the temperature sensor 180J, so as to reduce power consumption and implement thermal protection. In other embodiments, when the temperature is lower than another threshold, the electronic device 100 heats the battery 142 to avoid an abnormal shutdown of the electronic device 100 caused by low temperature. In still other embodiments, when the temperature is lower than a further threshold, the electronic device 100 boosts the output voltage of the battery 142 to avoid an abnormal shutdown caused by low temperature.
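A minimal sketch of such a temperature processing strategy is shown below. The threshold values and action names are hypothetical; the actual thresholds and actions are implementation-specific.

```python
# Illustrative sketch of a threshold-based temperature processing strategy.
# All threshold values and action names are hypothetical.

HIGH_TEMP_THRESHOLD_C = 45.0       # above this: throttle the nearby processor
LOW_TEMP_THRESHOLD_C = 0.0         # below this: heat the battery
VERY_LOW_TEMP_THRESHOLD_C = -10.0  # below this: also boost battery output voltage

def thermal_policy(temperature_c: float) -> list[str]:
    """Return the protective actions to take for the reported temperature."""
    actions = []
    if temperature_c > HIGH_TEMP_THRESHOLD_C:
        actions.append("reduce_processor_performance")
    if temperature_c < LOW_TEMP_THRESHOLD_C:
        actions.append("heat_battery")
    if temperature_c < VERY_LOW_TEMP_THRESHOLD_C:
        actions.append("boost_battery_output_voltage")
    return actions
```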
The touch sensor 180K is also referred to as a "touch panel". The touch sensor 180K may be disposed on the display screen 194; the touch sensor 180K and the display screen 194 form a touchscreen, also referred to as a "touch screen". The touch sensor 180K is used to detect a touch operation performed on or near it. The touch sensor may transfer the detected touch operation to the application processor to determine the type of the touch event. Visual output related to the touch operation may be provided through the display screen 194. In other embodiments, the touch sensor 180K may alternatively be disposed on a surface of the electronic device 100 at a position different from that of the display screen 194.
The bone conduction sensor 180M may acquire a vibration signal. In some embodiments, the bone conduction sensor 180M may acquire a vibration signal of a vibrating bone of the human vocal-cord part. The bone conduction sensor 180M may also be in contact with the human pulse to receive a blood pressure pulsation signal. In some embodiments, the bone conduction sensor 180M may also be disposed in a headset to form a bone conduction headset. The audio module 170 may parse out a voice signal based on the vibration signal, acquired by the bone conduction sensor 180M, of the vibrating bone of the vocal-cord part, to implement a voice function. The application processor may parse heart rate information based on the blood pressure pulsation signal acquired by the bone conduction sensor 180M, to implement a heart rate detection function.
The keys 190 include a power key, a volume key, and the like. The keys 190 may be mechanical keys or touch keys. The electronic device 100 may receive key input and generate key signal input related to user settings and function control of the electronic device 100.
The motor 191 may generate a vibration prompt. The motor 191 may be used for an incoming call vibration prompt and for touch vibration feedback. For example, touch operations performed on different applications (for example, photographing and audio playing) may correspond to different vibration feedback effects. Touch operations performed on different areas of the display screen 194 may also correspond to different vibration feedback effects of the motor 191. Different application scenarios (for example, a time reminder, receiving a message, an alarm clock, and a game) may also correspond to different vibration feedback effects. The touch vibration feedback effect may also be customized.
The indicator 192 may be an indicator light, and may be used to indicate a charging state and a battery level change, and may also be used to indicate a message, a missed call, a notification, and the like.
The SIM card interface 195 is used to connect a SIM card. The SIM card can be inserted into the SIM card interface 195 or removed from the SIM card interface 195 to come into contact with or be separated from the electronic device 100. The electronic device 100 may support one or N SIM card interfaces, where N is a positive integer greater than 1. The SIM card interface 195 may support a Nano SIM card, a Micro SIM card, a SIM card, and the like. Multiple cards may be inserted into the same SIM card interface 195 at the same time. The types of the multiple cards may be the same or different. The SIM card interface 195 is also compatible with different types of SIM cards, and may further be compatible with an external memory card. The electronic device 100 interacts with the network through the SIM card to implement functions such as calls and data communication. In some embodiments, the electronic device 100 uses an eSIM, that is, an embedded SIM card. The eSIM card can be embedded in the electronic device 100 and cannot be separated from the electronic device 100.
It should be understood that the electronic device 100 shown in fig. 7 is merely an example, and that the electronic device 100 may have more or fewer components than shown in fig. 7, may combine two or more components, or may have a different configuration of components. The various components shown in fig. 7 may be implemented in hardware, software, or a combination of hardware and software, including one or more signal processing and/or application specific integrated circuits.
The structure of the electronic device 200 may be partially or completely the same as that of the electronic device 100 shown in fig. 7, and is not described herein again.
In the foregoing embodiments, the implementation may be realized wholly or partially by software, hardware, firmware, or any combination thereof. When software is used for implementation, the embodiments may be implemented wholly or partially in the form of a computer program product. The computer program product includes one or more computer instructions. When the computer program instructions are loaded and executed on a computer, the procedures or functions described in accordance with the present application are generated wholly or partially. The computer may be a general-purpose computer, a special-purpose computer, a computer network, or another programmable apparatus. The computer instructions may be stored in a computer-readable storage medium or transmitted from one computer-readable storage medium to another; for example, the computer instructions may be transmitted from one website, computer, server, or data center to another website, computer, server, or data center in a wired manner (for example, coaxial cable, optical fiber, or digital subscriber line) or a wireless manner (for example, infrared, radio, or microwave). The computer-readable storage medium may be any usable medium accessible to a computer, or a data storage device, such as a server or a data center, that integrates one or more usable media. The usable medium may be a magnetic medium (for example, a floppy disk, a hard disk, or a magnetic tape), an optical medium (for example, a DVD), a semiconductor medium (for example, a solid state disk (SSD)), or the like.
A person of ordinary skill in the art can understand that all or some of the processes of the methods in the foregoing embodiments may be implemented by a computer program instructing related hardware. The program may be stored in a computer-readable storage medium, and when the program is executed, the processes of the foregoing method embodiments may be included. The foregoing storage medium includes various media capable of storing program code, such as a ROM, a RAM, a magnetic disk, or an optical disc.
The foregoing embodiments are merely intended to describe the technical solutions of the present application, rather than to limit them. Although the present application has been described in detail with reference to the foregoing embodiments, a person of ordinary skill in the art should understand that modifications may still be made to the technical solutions described in the foregoing embodiments, or equivalent replacements may be made to some of their technical features, and that such modifications or replacements do not cause the essence of the corresponding technical solutions to depart from the scope of the technical solutions of the embodiments of the present application.

Claims (13)

1. A picture book reading method, applied to an electronic device comprising a camera, wherein the method comprises:
the electronic device captures a picture of a picture book by using the camera, and identifies a picture book page based on the picture of the picture book;
the electronic device generates a page turning record when detecting that the page number of the picture book page changes;
the electronic device determines, based on the page turning record, whether a user has an abnormal behavior, and if so, the electronic device records the abnormal behavior; and
the electronic device determines a concentration rating of the user based on the abnormal behavior.
2. The method of claim 1, wherein the determining, by the electronic device, a concentration rating of the user based on the abnormal behavior specifically comprises:
the electronic device records the total duration of the user's entire picture book reading process, and divides the total duration into M time periods, wherein M is a positive integer;
the electronic device records the number of occurrences of the abnormal behavior in each of the M time periods, and determines, based on the number of occurrences of the abnormal behavior in each of the M time periods, a concentration rating of the user corresponding to each of the M time periods;
wherein the greater the number of occurrences of the abnormal behavior, the lower the concentration rating; and the smaller the number of occurrences of the abnormal behavior, the higher the concentration rating.
3. The method of claim 2, further comprising:
if the electronic device determines that the concentration rating of the user corresponding to the m-th time period of the M time periods is not the highest, and determines, based on the number of occurrences of the abnormal behavior in the (m+1)-th time period of the M time periods, that the concentration rating of the user corresponding to the (m+1)-th time period is a concentration rating A, wherein the concentration rating A is neither the highest nor the lowest, the electronic device adjusts the concentration rating of the user corresponding to the (m+1)-th time period from the concentration rating A to a concentration rating B, wherein the concentration rating B is one level lower than the concentration rating A, and m is a positive integer smaller than M.
4. The method of claim 2 or 3, wherein after the electronic device determines the concentration rating of the user based on the abnormal behavior, the method further comprises:
the electronic device determines a concentration rating of the user for the entire picture book reading process based on the durations of the M time periods and the concentration ratings of the user corresponding to the M time periods;
wherein the higher the concentration ratings of the user corresponding to the M time periods are, and/or the longer the durations of the time periods with higher concentration ratings are, the higher the concentration rating of the user for the entire picture book reading process is.
5. The method of any one of claims 1-4, wherein the abnormal behavior comprises one or more of the following: a page skipping behavior, a long-time non-page-turning behavior, a behavior of turning a page without finishing reading it, and a question non-response behavior.
6. The method of any one of claims 1-5, wherein the page turning record includes one or more of the following: a page number, a starting time, a page reading completion ratio, a playing content ending time, a question asking time, a question response, and an ending time.
7. The method according to claim 5 or 6, wherein the determining, by the electronic device, whether the user has an abnormal behavior based on the page-turning record specifically includes:
if the electronic device detects, in the page turning record, that the page number before the user turns the page and the page number the user turns to are not consecutive, the electronic device determines that the user has the page skipping behavior.
8. The method according to any one of claims 5 to 7, wherein the determining, by the electronic device, whether the user has an abnormal behavior based on the page-turning record specifically includes:
if the electronic device detects that the ending time in the page turning record lags behind the playing content ending time by more than a first time threshold, the electronic device determines that the user has the long-time non-page-turning behavior.
9. The method according to any one of claims 5 to 8, wherein the determining, by the electronic device, whether the user has an abnormal behavior based on the page-turning record specifically includes:
if the electronic device detects that the page reading completion ratio in the page turning record is less than 1, the electronic device determines that the user has the behavior of turning a page without finishing reading it.
10. The method according to any one of claims 5 to 9, wherein the determining, by the electronic device, whether the user has an abnormal behavior based on the page-turning record specifically includes:
the electronic device initiates a question to the user and records the question asking time; and if the electronic device does not receive a response from the user within a second time threshold after initiating the question, the electronic device determines that the user has the question non-response behavior.
11. The method of any of claims 1-9, wherein after the electronic device determines that the user has the abnormal behavior, the method further comprises:
the electronic device outputs a prompt corresponding to the abnormal behavior through an output device.
12. An electronic device, comprising one or more processors and one or more memories; wherein the one or more memories are coupled to the one or more processors for storing computer program code, the computer program code comprising computer instructions that, when executed by the one or more processors, cause the electronic device to perform the method of any of claims 1-11.
13. A computer storage medium, characterized in that the computer storage medium stores a computer program comprising program instructions that, when run on an electronic device, cause the electronic device to perform the method of any of claims 1-11.
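For readers who find pseudocode clearer, the following Python sketch illustrates the logic of claims 1-4 and 7-10: detecting abnormal behaviors from page turning records and deriving per-period and overall concentration ratings. The rating scale (1 = low, 2 = medium, 3 = high), the threshold values, and all field and function names are illustrative assumptions and are not part of the claims.

```python
# Illustrative sketch of the concentration-rating logic in claims 1-4 and 7-10.
# The rating scale, thresholds, and field names are assumptions for illustration.
from dataclasses import dataclass
from typing import Optional

@dataclass
class PageTurnRecord:
    page_number: int
    starting_time: float          # seconds since reading started
    completion_ratio: float       # fraction of the page's playing content played
    play_end_time: float          # when the page's playing content ended
    question_time: Optional[float] = None   # when a question was asked, if any
    response_time: Optional[float] = None   # when the user answered, if at all
    ending_time: float = 0.0      # when the user left this page

FIRST_TIME_THRESHOLD = 60.0    # s without turning the page after playback ends
SECOND_TIME_THRESHOLD = 15.0   # s allowed for answering a question

def abnormal_behaviors(prev: Optional[PageTurnRecord], rec: PageTurnRecord) -> list[str]:
    """Detect the abnormal behaviors of claims 7-10 from one page turning record."""
    found = []
    if prev is not None and rec.page_number != prev.page_number + 1:
        found.append("page_skipping")                       # claim 7
    if rec.ending_time - rec.play_end_time > FIRST_TIME_THRESHOLD:
        found.append("long_time_no_page_turn")              # claim 8
    if rec.completion_ratio < 1.0:
        found.append("turn_without_finishing_reading")      # claim 9
    if rec.question_time is not None and (
        rec.response_time is None
        or rec.response_time - rec.question_time > SECOND_TIME_THRESHOLD
    ):
        found.append("question_no_response")                # claim 10
    return found

def rate_period(abnormal_count: int) -> int:
    """More abnormal behaviors in a time period -> lower rating (claim 2)."""
    if abnormal_count == 0:
        return 3
    if abnormal_count == 1:
        return 2
    return 1

def rate_periods(counts_per_period: list[int]) -> list[int]:
    """Apply claim 3's adjustment: a middling period that follows a period rated
    below the maximum is lowered by one level."""
    ratings = [rate_period(c) for c in counts_per_period]
    for m in range(1, len(ratings)):
        if ratings[m - 1] < 3 and 1 < ratings[m] < 3:
            ratings[m] -= 1
    return ratings

def overall_rating(ratings: list[int], durations: list[float]) -> float:
    """Duration-weighted rating for the whole reading session (claim 4)."""
    total = sum(durations)
    return sum(r * d for r, d in zip(ratings, durations)) / total if total else 0.0
```

With the assumed three-level scale, claim 3's adjustment amounts to downgrading a "medium" period that immediately follows a period whose rating is below the maximum, and claim 4's overall rating rises as more of the total duration is spent in highly rated periods.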
CN202110860614.7A 2021-07-28 2021-07-28 Drawing book reading method and related equipment Pending CN115700847A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202110860614.7A CN115700847A (en) 2021-07-28 2021-07-28 Drawing book reading method and related equipment

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202110860614.7A CN115700847A (en) 2021-07-28 2021-07-28 Drawing book reading method and related equipment

Publications (1)

Publication Number Publication Date
CN115700847A true CN115700847A (en) 2023-02-07

Family

ID=85120641

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202110860614.7A Pending CN115700847A (en) 2021-07-28 2021-07-28 Drawing book reading method and related equipment

Country Status (1)

Country Link
CN (1) CN115700847A (en)

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN117217624A (en) * 2023-10-27 2023-12-12 广州道然信息科技有限公司 Child reading level prediction method and system based on drawing book reading record
CN117217624B (en) * 2023-10-27 2024-03-01 广州道然信息科技有限公司 Child reading level prediction method and system based on drawing book reading record

Similar Documents

Publication Publication Date Title
CN110035141B (en) Shooting method and equipment
US11889180B2 (en) Photographing method and electronic device
CN111061912A (en) Method for processing video file and electronic equipment
US20230421900A1 (en) Target User Focus Tracking Photographing Method, Electronic Device, and Storage Medium
CN110119684A (en) Image-recognizing method and electronic equipment
CN110851067A (en) Screen display mode switching method and device and electronic equipment
WO2021052139A1 (en) Gesture input method and electronic device
CN110012130A (en) A kind of control method and electronic equipment of the electronic equipment with Folding screen
CN112583957A (en) Display method of electronic device, electronic device and computer-readable storage medium
CN113467735A (en) Image adjusting method, electronic device and storage medium
EP4270245A1 (en) User determination method, electronic device, and computer-readable storage medium
CN111930335A (en) Sound adjusting method and device, computer readable medium and terminal equipment
CN113574525A (en) Media content recommendation method and equipment
CN114822525A (en) Voice control method and electronic equipment
CN113593567B (en) Method for converting video and sound into text and related equipment
CN111314763A (en) Streaming media playing method and device, storage medium and electronic equipment
CN115700847A (en) Drawing book reading method and related equipment
CN109285563B (en) Voice data processing method and device in online translation process
WO2022214004A1 (en) Target user determination method, electronic device and computer-readable storage medium
US20230402150A1 (en) Adaptive Action Evaluation Method, Electronic Device, and Storage Medium
CN114120987B (en) Voice wake-up method, electronic equipment and chip system
CN115393676A (en) Gesture control optimization method and device, terminal and storage medium
CN114466238A (en) Frame demultiplexing method, electronic device and storage medium
CN113918003A (en) Method and device for detecting time length of skin contacting screen and electronic equipment
CN113380374B (en) Auxiliary motion method based on motion state perception, electronic equipment and storage medium

Legal Events

Date Code Title Description
PB01 Publication
PB01 Publication
SE01 Entry into force of request for substantive examination
SE01 Entry into force of request for substantive examination