WO2023228391A1 - Portable information terminal and display control method for portable information terminal - Google Patents

Portable information terminal and display control method for portable information terminal Download PDF

Info

Publication number
WO2023228391A1
Authority
WO
WIPO (PCT)
Prior art keywords
user
distance
information terminal
display
mobile information
Prior art date
Application number
PCT/JP2022/021661
Other languages
French (fr)
Japanese (ja)
Inventor
眞弓 中出
達也 山本
仁 秋山
和之 滝澤
康宣 橋本
Original Assignee
Maxell, Ltd.
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Maxell, Ltd.
Priority to PCT/JP2022/021661 priority Critical patent/WO2023228391A1/en
Publication of WO2023228391A1 publication Critical patent/WO2023228391A1/en

Links

Images

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer

Definitions

  • the present invention relates to an information display device and a display control method, and more particularly to a mobile information terminal and a display control method for a mobile information terminal that calls attention to the effects of screen viewing on visual acuity.
  • Patent Document 1 discloses a screen proximity warning device comprising "a distance measuring means for detecting the distance to a human body present in a predetermined range in front of a display device, and an alarm control means for outputting an alarm when the measured distance detected by the distance measuring means becomes equal to or less than a preset limit distance" (summary excerpt).
  • The present invention has been made in view of the problems described above, and aims to provide a mobile information terminal, and a display control method for a mobile information terminal, that keep the distance between the eyes and the screen appropriate.
  • the present invention includes the configurations described in the claims.
  • The present invention is a mobile information terminal including a display, a camera, a distance sensor, a timer, and a processor. The processor recognizes the user's face or the user's eyes based on the image captured by the camera, acquires from the distance sensor the distance to the user's face or the user's eyes, acquires from the timer a first elapsed time from when that distance became equal to or less than a predetermined visual acuity deterioration determination distance, and performs visibility reduction control that lowers the visibility of the display when the first elapsed time becomes equal to or longer than a predetermined start determination time. The processor also acquires from the timer a second elapsed time from when the distance to the user's face or the user's eyes exceeds the visual acuity deterioration determination distance, and ends the visibility reduction control and returns the display to normal display when the second elapsed time becomes equal to or longer than a predetermined end determination time.
  • FIG. 1B is a rear view of the smartphone. FIG. 2 is a hardware configuration diagram of the smartphone.
  • FIG. 3 is a functional block diagram according to the present embodiment.
  • FIG. 4A is an explanatory diagram of the distance measured in the present embodiment.
  • FIG. 4B is a diagram showing an overview of the visual acuity deterioration prevention mode according to the present embodiment. FIG. 5 is a flowchart showing the flow of processing in the first embodiment. FIG. 6 is a flowchart showing the flow of the visual acuity deterioration prevention processing in the second embodiment. FIG. 7 is a diagram showing an example of a screen (size change) in which the visibility reduction intensity is changed according to the distance to the eyes.
  • FIG. 10 is a flowchart showing the flow of processing according to the third embodiment.
  • FIG. 11 is a diagram showing the relationship between the execution time of the visibility reduction processing and the visibility reduction intensity. FIG. 12 is a flowchart showing the flow of processing according to the fourth embodiment. FIG. 13 is a flowchart showing the flow of processing according to the fifth embodiment. FIG. 14 is a diagram showing another example of the screen during execution of the visual acuity deterioration prevention processing. FIG. 15 is a diagram showing another example of the hardware. FIG. 16 is an external view of the head-mounted display.
  • FIG. 17 is a hardware configuration diagram of the head-mounted display.
  • FIG. 18 is a diagram showing an example of visual acuity deterioration prevention processing using the head-mounted display.
  • FIG. 19 is a diagram showing a system for transmitting a log of a child's viewing history to a parent.
  • Since the present invention can be expected to contribute to preventing the deterioration of visual acuity caused by viewing mobile information terminals, it can be expected to contribute to Goal 3 (Good Health and Well-Being) of the Sustainable Development Goals (SDGs) advocated by the United Nations.
  • a smartphone will be described as a specific example of a mobile information terminal, but the mobile information terminal is not limited to a smartphone as long as it is a mobile information terminal that can be held in the hand and operated while viewing the screen.
  • a tablet is also an example of a portable information terminal to which the present invention can be applied.
  • the present invention can also be applied to wearable terminals that do not need to be held in the hand, such as head-mounted displays and smart glasses.
  • FIG. 1A is a front view of the smartphone.
  • FIG. 1B is a rear view of the smartphone.
  • the front of the smartphone 100 is equipped with an in-camera 131, a display 133, a speaker 141, and a distance sensor 153.
  • an out camera 132 is provided on the back of the smartphone 100.
  • FIG. 2 is a diagram of the hardware configuration of the smartphone.
  • The smartphone 100 includes a main processor 101, a storage device 110, an input interface (I/F) 120, an image processing device 130, an audio processing device 140, a sensor group 150, a communication I/F 160, an expansion I/F 172, and a timer 173, which are connected to each other via the bus 102.
  • the storage device 110 includes a RAM 111, a ROM 112, and a flash memory 113.
  • the input I/F 120 includes a button switch 121 and a touch panel 122.
  • the image processing device 130 includes an in-camera 131, an out-camera 132, and a display 133.
  • the audio processing device 140 includes a speaker 141 and a microphone 142.
  • the sensor group 150 includes a GPS receiver 151, a geomagnetic sensor 152, a distance sensor 153, an acceleration sensor 154, a gyro sensor 155, and a biological information acquisition sensor 156.
  • the communication I/F 160 includes a wireless communication I/F 161 and a telephone network communication I/F 162.
  • FIG. 3 is a functional block diagram according to this embodiment.
  • The smartphone 100 includes a face recognition section 101a, a visibility reduction processing section 101b, a timekeeping section 101c, a user authentication section 101d, and a mode setting section 101e. These are realized by the main processor 101 loading a visual acuity deterioration prevention program into the RAM 111 and executing it. Details of the functions of each section will be described later.
  • The flash memory 113 includes a various-table storage section 113a and a user registration information storage section 113b.
  • The various-table storage section 113a records the tables necessary for executing the visual acuity deterioration prevention program.
  • The user registration information stored in the user registration information storage section 113b is used when the visual acuity deterioration prevention program is executed on a shared terminal, which will be described later.
  • FIG. 4A is an explanatory diagram of distances measured in this embodiment.
  • FIG. 4B is a diagram showing an overview of the visual acuity reduction prevention mode according to the present embodiment.
  • the distance D from the display 133 of the smartphone 100 to the user's face or the user's eyes is measured, and the operation in the visual acuity reduction prevention mode is performed based on this distance D.
  • a mode may be adopted in which the subject is recognized as the user (a person) without performing face recognition or eye recognition of the user.
  • When the state in which the distance D is equal to or less than a predetermined distance (Dth: visual acuity deterioration determination distance), that is, the near state, continues for a certain period of time (TS: start determination time), the main processor 101 executes visibility reduction processing on the screen of the display 133 (also referred to as visibility reduction control). The visibility reduction processing blurs the screen or reduces the display size.
  • When the state in which the distance D is greater than the visual acuity deterioration determination distance Dth continues for a certain period of time (TE: end determination time), the main processor 101 ends the visibility reduction processing on the screen of the display 133 and returns the screen to the normal screen. As a result, the display 133 returns to the display state it had before the visibility reduction processing was performed.
  • the first embodiment is an embodiment in which visual acuity reduction prevention processing is performed on a smartphone operated by one user.
  • FIG. 5 is a flowchart showing the process flow of the first embodiment.
  • When the visual acuity deterioration prevention mode is turned on in the smartphone 100, the face recognition unit 101a executes face recognition processing on the image from the in-camera 131, recognizes the position of the user's face or eyes, and measures the distance D to the face or eyes with the distance sensor 153 (S01, see FIG. 4).
  • The visibility reduction processing unit 101b compares the distance D with the visual acuity deterioration determination distance Dth. If it determines that the distance D is greater than the visual acuity deterioration determination distance Dth, that is, that the eyes of user A are farther from the display 133 of the smartphone 100 than the visual acuity deterioration determination distance Dth (S02: NO), the process returns to step S01 and the measurement of the distance D continues.
  • On the other hand, if the visibility reduction processing unit 101b determines that the distance D is equal to or less than the visual acuity deterioration determination distance Dth, that is, that the eyes of user A are at or within the visual acuity deterioration determination distance Dth from the display 133 of the smartphone 100 (S02: YES), measurement of the first elapsed time T1 from when the distance D became equal to or less than the visual acuity deterioration determination distance Dth is started (S03).
  • A hardware timer 171 may be provided and activated to measure the elapsed times and the execution time described below, or the timer may be implemented in software.
  • In the latter case, the timekeeping section 101c may acquire the time at the start of measurement and the time at the end of measurement from the RTC and measure the elapsed time and the execution time from their difference.
  • If the first elapsed time T1 is shorter than the start determination time TS (S04: NO), the visibility reduction processing unit 101b returns to step S01 and continues measuring the distance and counting the first elapsed time T1.
  • When the first elapsed time T1 becomes equal to or longer than the start determination time TS (S04: YES), the main processor 101 initializes the first elapsed time T1 (S05) and starts the visibility reduction processing (S06).
  • the smartphone 100 continues measuring the distance D (S07), and the visibility reduction processing unit 101b compares the distance D with the visual acuity reduction determination distance Dth (S08).
  • If the visibility reduction processing unit 101b determines that the distance D is equal to or less than the visual acuity deterioration determination distance Dth (S08: YES), the process returns to step S06 and the visibility reduction processing continues.
  • If the visibility reduction processing unit 101b determines that the distance D is greater than the visual acuity deterioration determination distance Dth (S08: NO), measurement of the elapsed time from when the distance D exceeded the visual acuity deterioration determination distance Dth (second elapsed time T2) is started (S09).
  • If the second elapsed time T2 is shorter than the end determination time TE (S10: NO), the process returns to step S06 and the visibility reduction processing continues.
  • When the second elapsed time T2 becomes equal to or longer than the end determination time TE (S10: YES), the visibility reduction processing unit 101b stops the visibility reduction processing, switches to the normal display, and initializes the second elapsed time T2 (S11).
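As an illustration only, the following is a minimal runnable sketch of the first-embodiment loop (S01 to S11) described above. The callables measure_distance and set_visibility_reduction are hypothetical stand-ins for the face recognition/distance sensor and for the display control, and the values used for Dth, TS, and TE are assumptions for illustration, not values taken from the patent.

```python
import time

D_TH = 0.30     # visual acuity deterioration determination distance Dth [m] (assumed value)
T_START = 5.0   # start determination time TS [s] (assumed value)
T_END = 3.0     # end determination time TE [s] (assumed value)

def vision_protection_loop(measure_distance, set_visibility_reduction):
    """measure_distance(): returns the distance D to the user's face or eyes in meters.
    set_visibility_reduction(on): blurs/shrinks the screen (True) or restores normal display (False).
    Both callables are hypothetical stand-ins for the terminal's sensors and display control."""
    reducing = False
    t1_start = None  # reference point for the first elapsed time T1 (D <= Dth)
    t2_start = None  # reference point for the second elapsed time T2 (D > Dth while reducing)
    while True:
        d = measure_distance()                               # S01 / S07
        if not reducing:
            if d <= D_TH:                                    # S02: YES
                if t1_start is None:
                    t1_start = time.monotonic()              # S03: start counting T1
                if time.monotonic() - t1_start >= T_START:   # S04: YES
                    t1_start = None                          # S05: initialize T1
                    set_visibility_reduction(True)           # S06: start visibility reduction
                    reducing = True
            else:                                            # S02: NO, keep measuring
                t1_start = None
        else:
            if d > D_TH:                                     # S08: NO (far enough again)
                if t2_start is None:
                    t2_start = time.monotonic()              # S09: start counting T2
                if time.monotonic() - t2_start >= T_END:     # S10: YES
                    set_visibility_reduction(False)          # S11: back to normal display
                    reducing = False
                    t2_start = None
            else:                                            # S08: YES, stay in reduction
                t2_start = None
        time.sleep(0.1)                                      # polling interval (assumed)
```

A real implementation would be driven by camera frames or sensor callbacks rather than a blocking polling loop, but the state transitions would be the same.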
  • According to this embodiment, when the start determination time TS has elapsed after the distance between the eyes and the mobile information terminal becomes equal to or less than the visual acuity deterioration determination distance, which is a distance at which deterioration of visual acuity is a concern, the visibility reduction processing is performed and the user is alerted to the fact that the display is too close.
  • At this time, since the visibility of the display is reduced, the display must be moved away from the eyes in order to improve the display quality.
  • Merely notifying the user that the screen is close to the eyes offers little hope of improving the distance between the eyes and the display; according to this embodiment, however, an action of moving the display away is required in order to restore the visibility of the screen, so an appropriate distance between the user's eyes and the display can be maintained.
  • In addition, since a mobile information terminal is held in the hand and operated, the distance D to the user's eyes changes more easily than with a stationary display.
  • According to this embodiment, the visibility reduction processing is not performed unless the state in which the distance D has come within the visual acuity deterioration determination distance Dth continues for the start determination time TS or longer, and the display is not returned to the normal screen unless the state in which the distance D has moved beyond the visual acuity deterioration determination distance Dth continues for the end determination time TE or longer. This prevents the visibility reduction processing from being performed frequently due to hand shake or slight changes in posture, while keeping the distance between the eyes and the mobile information terminal appropriate.
  • the second embodiment is an embodiment in which the visibility reduction strength is varied according to the distance between the eyes and the mobile information terminal in the visual acuity reduction prevention process.
  • FIG. 6 is a flowchart showing the flow of visual acuity reduction prevention processing in the second embodiment. Steps that are the same as those in the process flow of the first embodiment shown in FIG. 5 are given the same reference numerals, and redundant explanation will be omitted.
  • When the first elapsed time T1 from when the distance D became equal to or less than the visual acuity deterioration determination distance Dth becomes equal to or longer than the start determination time TS, the visibility reduction processing unit 101b stops measuring the first elapsed time T1 (S05) and sets the visibility reduction intensity according to the distance D (S11). The visibility reduction intensity is set so that, below the visual acuity deterioration determination distance Dth, the smaller the distance D, the lower the visibility of the display 133, that is, the stronger the visibility reduction intensity.
  • the visibility reduction processing unit 101b starts visibility reduction processing according to the determined visibility reduction intensity (S06).
  • The distance D continues to be measured during the visibility reduction processing (S07), the visibility reduction processing unit 101b resets the visibility reduction intensity according to the distance D (S12), and the visibility reduction processing is re-executed according to the reset visibility reduction intensity (S13).
  • The time required from measuring the distance to setting the visibility reduction intensity may exceed one frame, or the visibility reduction intensity may be set at predetermined frame intervals in order to reduce the processing load and power consumption; there is no particular limitation on the interval at which the visibility reduction intensity is updated.
  • For frames in which the intensity is not updated, the visibility reduction intensity of the previous frame is inherited.
  • The distance measurement interval may also differ between when the visual acuity deterioration prevention processing is being executed and when it is not.
  • For example, the interval may be every 10 frames when the visual acuity deterioration prevention processing is not being performed, and every frame during the visibility reduction processing.
  • When the processing is not being performed, high followability is not required, so priority may be given to reducing the processing load.
  • While the distance D remains equal to or less than the visual acuity deterioration determination distance Dth (S08: YES), the process returns to step S07, and the measurement of the distance D, the resetting of the visibility reduction intensity (S12), and the re-execution of the visibility reduction processing (S13) are repeated.
  • This repetition continues until the visibility reduction processing unit 101b determines that the distance D is greater than the visual acuity deterioration determination distance Dth (S08: NO). Thereafter, measurement of the second elapsed time T2 is started (S09), and the second elapsed time T2 is compared with the end determination time TE (S10).
  • If the second elapsed time T2 is shorter than the end determination time TE (S10: NO), the process returns to step S07. If the second elapsed time T2 is equal to or longer than the end determination time TE (S10: YES), the process advances to step S11, the display is switched to the normal display, and the measurement of the second elapsed time T2 is stopped (S11). After that, the process ends.
  • FIG. 7 is a diagram showing an example of a screen (change in size) in which the visibility reduction strength is changed depending on the distance to the eye.
  • In the example of FIG. 7, the size of the display screen is changed according to the visibility reduction intensity. That is, as shown in the distance/screen size table 1333, when the distance D to the eyes is equal to or less than the visual acuity deterioration determination distance Dth, the screen is changed to a small screen 1332 that becomes smaller than the standard-size normal screen 1331 as the distance D becomes smaller. When the distance D is greater than the visual acuity deterioration determination distance Dth, the normal screen 1331 is displayed regardless of the distance D.
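For illustration, a sketch of one way the distance/screen size table 1333 could be interpreted as a mapping from the measured distance to a display scale factor; the numeric thresholds and the linear shape of the mapping are assumptions, not values from the patent.

```python
D_TH = 0.30       # visual acuity deterioration determination distance Dth [m] (assumed)
MIN_SCALE = 0.4   # smallest screen scale used at very close range (assumed)

def screen_scale(distance_m: float) -> float:
    """Return a display scale factor: 1.0 keeps the normal screen 1331; values below 1.0
    correspond to the small screen 1332, shrinking further as the eyes come closer."""
    if distance_m > D_TH:
        return 1.0
    return max(MIN_SCALE, distance_m / D_TH)
```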
  • the visibility reduction process may include lowering the brightness of the display 133 as the distance D becomes smaller, or increasing the amplitude (intensity) of the vibrator and shaking the smartphone 100 to lower the visibility.
  • Alternatively, without changing the screen size, image processing based on a central projection method that uses the center of the screen as a reference point may be performed on the display 133, and the overall visibility may be reduced by lowering the visibility at the center of the screen. At this time, the center of the central projection may be set farther away as the distance D becomes smaller.
  • a tactile device using a piezoelectric element or the like may be placed in a place where the smartphone 100 is held, and the smartphone 100 may be guided in the direction opposite to the eye by giving a pulling sensation due to pressure to the finger or hand.
  • FIG. 8 is a diagram showing an example of a screen (change in degree of blur) in which the intensity of visibility reduction is changed depending on the distance to the eye.
  • According to the present embodiment, by reducing the visibility further as the distance D becomes shorter during the visibility reduction processing, the user can be urged to secure a distance D that is equal to or greater than the visual acuity deterioration determination distance Dth.
  • the third embodiment is an embodiment in which the start determination time TS until the visibility reduction process is started is changed according to the distance D.
  • FIG. 9 is a diagram showing a distance start determination time table.
  • When the distance D is equal to or less than a close distance at which the strain on the eyes is large, the start determination time TS may be set to 0 seconds so that the visibility reduction processing starts immediately. From beyond that close distance up to the visual acuity deterioration determination distance Dth, the start determination time TS may be configured to become longer as the distance D becomes longer.
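A sketch of how such a distance/start determination time table 1336 might look; the close-distance threshold, the maximum TS, and the linear interpolation between them are assumptions for illustration only.

```python
CLOSE_DISTANCE = 0.10   # distance below which reduction starts immediately [m] (assumed)
D_TH = 0.30             # visual acuity deterioration determination distance Dth [m] (assumed)
TS_MAX = 5.0            # start determination time TS used near Dth [s] (assumed)

def start_determination_time(distance_m: float) -> float:
    """Shorter distances get a shorter start determination time TS, down to zero."""
    if distance_m <= CLOSE_DISTANCE:
        return 0.0                       # start the visibility reduction immediately
    if distance_m >= D_TH:
        return TS_MAX                    # at or beyond Dth the mode is not triggered anyway
    frac = (distance_m - CLOSE_DISTANCE) / (D_TH - CLOSE_DISTANCE)
    return frac * TS_MAX                 # TS grows as D approaches Dth
```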
  • FIG. 10 is a flowchart showing the flow of processing according to the third embodiment.
  • the visibility reduction processing unit 101b measures the distance D (S01), compares it with the visual acuity reduction determination distance Dth (S02), and starts measuring the first elapsed time T1 ( S03).
  • the visibility reduction processing unit 101b refers to the distance start determination time table 1336 and determines the TS corresponding to the distance D measured in step S01 (S21).
  • the visibility reduction processing unit 101b compares the start determination time TS with the first elapsed time T1 (S04), and determines whether or not it is necessary to execute the visibility reduction processing. Thereafter, the process flow of the first embodiment continues to S05, and the process flow of the second embodiment continues to S11.
  • According to this embodiment, when the distance D is so close that the strain on the eyes is large, the visibility reduction processing can be started immediately. Furthermore, even when the distance D is smaller than the visual acuity deterioration determination distance Dth, the closer it is to the visual acuity deterioration determination distance Dth, the smaller the strain on the eyes, so the start determination time TS can be made longer to delay the start of the visibility reduction processing. As a result, even if the user's eyes momentarily come close to the display, the visibility reduction processing is prevented from being executed frequently, and operability is improved.
  • The fourth embodiment is an embodiment in which the visibility reduction intensity is increased as the execution time of the visibility reduction processing becomes longer.
  • FIG. 11 is a diagram showing the relationship between the execution time of the visibility reduction process and the visibility reduction intensity.
  • As shown in FIG. 11, the longer the execution time T3 after the visibility reduction processing is started, the stronger the visibility reduction intensity.
  • Once the visibility reduction intensity reaches its maximum value, the visibility of the screen no longer changes even if the execution time T3 of the visibility reduction processing becomes longer.
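A sketch of the saturating relationship of FIG. 11 between the execution time T3 and the visibility reduction intensity; the ramp duration and the 0-to-1 intensity scale are assumptions for illustration.

```python
T_RAMP = 30.0        # time over which the intensity ramps up to its maximum [s] (assumed)
MAX_INTENSITY = 1.0  # maximum visibility reduction intensity on an assumed 0..1 scale

def reduction_intensity(execution_time_s: float) -> float:
    """Intensity grows with the execution time T3 and saturates at MAX_INTENSITY."""
    return min(MAX_INTENSITY, MAX_INTENSITY * execution_time_s / T_RAMP)
```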
  • FIG. 12 is a flowchart showing the flow of processing according to the fourth embodiment. Steps that are the same as those in the process flow of the second embodiment are given the same reference numerals and redundant explanations will be omitted.
  • When the first elapsed time T1 from when the distance D became equal to or less than the visual acuity deterioration determination distance Dth becomes equal to or longer than the start determination time TS, the visibility reduction processing unit 101b stops measuring the first elapsed time T1 (S05) and starts the visibility reduction processing (S06).
  • the visibility reduction intensity may be a visibility reduction intensity depending on the distance D as in S11, or a predetermined visibility reduction intensity.
  • Measurement of the execution time T3 of the visibility reduction processing is then started (S31), and the visibility reduction processing unit 101b resets the visibility reduction intensity according to the execution time T3 (S32) and re-executes the visibility reduction processing according to the reset visibility reduction intensity (S33).
  • If it is determined that the distance D is equal to or less than the visual acuity deterioration determination distance Dth (S08: YES), the process returns to step S31, the measurement of the execution time T3 continues, and the subsequent steps are repeated.
  • This repetition continues until the visibility reduction processing unit 101b determines that the distance D is greater than the visual acuity deterioration determination distance Dth (S08: NO).
  • the measurement of the second elapsed time T2 is started (S09), and the visibility reduction processing unit 101b compares the second elapsed time T2 and the end determination time TE (S10). If the second elapsed time T2 is less than the end determination time TE (S10: NO), the process returns to step S31.
  • If the second elapsed time T2 is equal to or longer than the end determination time TE (S10: YES), the display is switched to the normal display and the measurement of the second elapsed time T2 and the execution time T3 is stopped (S34). After that, the process ends.
  • According to the present embodiment, the visibility reduction intensity is increased according to the execution time T3 of the visibility reduction processing. Therefore, even if the user continues viewing with the distance D smaller than the visual acuity deterioration determination distance Dth, the visibility is progressively reduced, and the user can be made aware of the need to restore an appropriate distance D.
  • the fifth embodiment is an embodiment in which, in a shared terminal, on/off of the visual acuity reduction prevention mode is controlled according to the age of the user.
  • FIG. 13 is a flowchart showing the flow of processing according to the fifth embodiment.
  • the user's face is photographed with the in-camera of the smartphone 100 (S41), and the face recognition unit 101a performs face recognition processing (S42).
  • The user authentication unit 101d compares the recognized user with the user registration information, which associates each registered user with whether the visual acuity deterioration prevention mode is to be turned on or off for that user (S43).
  • The registered mode here means registered content such that, for example, the visual acuity deterioration prevention mode is turned on for user A (for example, a child) and turned off for user B (for example, a parent).
  • If the recognized user is not found in the user registration information, the mode setting unit 101e performs processing to estimate the user's age from the recognized face image (S44).
  • If the mode setting unit 101e determines that the estimated age is equal to or less than the designated age (S45: YES), it sets the visual acuity deterioration prevention mode to ON (S46).
  • At this time, a release button or the like may be displayed to make it easier to cancel the mode. For example, a message may be displayed that says, "The visual acuity deterioration prevention mode has been set. To release it, please enter the passcode from the release button." The passcode is set in advance by a parent or the like in order to cancel the mode; if no passcode is set, the mode can simply be canceled.
  • If the mode setting unit 101e determines that the estimated age is higher than the designated age (S45: NO), it sets the visual acuity deterioration prevention mode to OFF (S47).
  • The designated age is set in consideration of the ages of infants and school children, who are susceptible to deterioration of visual acuity and to esotropia caused by screen viewing.
  • According to the present embodiment, the visual acuity deterioration prevention mode can be turned on and off according to the registered content set for each user and, for unregistered users, according to the estimated age, so that even on a shared terminal the visual acuity deterioration prevention processing can be performed appropriately for each user.
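For illustration, a sketch of the shared-terminal mode decision described above (S43 to S47); the dictionary of registered users, the designated age value, and the estimate_age helper are assumptions, not part of the patent.

```python
DESIGNATED_AGE = 12  # designated age for infants and school children (assumed value)

def prevention_mode_on(user_id, registered_modes, estimate_age) -> bool:
    """registered_modes: mapping of recognized users to their registered mode,
    e.g. {"user_A": True, "user_B": False}.
    estimate_age(user_id): hypothetical age estimation from the recognized face image (S44)."""
    if user_id in registered_modes:                   # S43: registered user, use registration
        return registered_modes[user_id]
    return estimate_age(user_id) <= DESIGNATED_AGE    # S45-S47: unregistered, decide by age
```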
  • FIG. 14 is a diagram showing another example of the screen during execution of the visual acuity reduction prevention process.
  • a warning message 1337a may be displayed as shown in a screen 1337.
  • a gauge 1338a indicating the current distance D relative to the visual acuity decrease determination distance Dth may be displayed as shown in a screen 1338.
  • a screen 1339 may be used in which the optimal position of the smartphone 100 is shown as an AR display 1339b on a background 1339a taken by the out-camera 132.
  • The arrangement may be such that, when the user moves the smartphone 100 away from the eyes until the actual smartphone 100 overlaps the AR display 1339b, the visual acuity deterioration determination distance Dth is secured.
  • FIG. 15 is a diagram showing another example of hardware.
  • the smartphone 100 shown in FIG. 15 includes an embedded in-camera 131 and a distance sensor 153 on the back side of the display 133.
  • With this configuration, the in-camera 131 and the distance sensor 153 are not obstructed by the hands or the like while the display 133 is being viewed, so the visual acuity deterioration prevention processing can be executed stably.
  • FIG. 16 is an external view of the head-mounted display (HMD).
  • An out-camera 232 and a distance sensor 253 are provided on the front outer surface of the HMD 200.
  • FIG. 17 is a hardware configuration diagram of the head mounted display.
  • The HMD 200 includes a main processor 201, a storage device 210, an input interface (I/F) 220, an image processing device 230, an audio processing device 240, a sensor group 250, a communication I/F 260, an expansion I/F 272, and a timer 273, which are connected to each other via the bus 202.
  • the storage device 210 includes a RAM 211, a ROM 212, and a flash memory 213.
  • the input I/F 220 includes a button switch 221. Furthermore, when using gesture input and voice input, the out camera 232 and the microphone 242 also function as the input I/F 220.
  • the image processing device 230 includes an out camera 232 and a transmissive display 233.
  • the audio processing device 240 includes a speaker 241 and a microphone 242.
  • the sensor group 250 includes a GPS receiver 251, a geomagnetic sensor 252, a distance sensor 253, an acceleration sensor 254, a gyro sensor 255, and a biological information acquisition sensor 256.
  • the communication I/F 260 includes a wireless communication I/F 261.
  • FIG. 18 is a diagram illustrating an example of visual acuity reduction prevention processing using a head-mounted display.
  • The main processor 201 of the HMD 200 performs image recognition processing on the image from the out-camera 232 and recognizes whether the object being viewed is the smartphone 100 or a book.
  • the distance sensor 253 measures the distance D to the smartphone 100 or the book.
  • the same processing as in each of the above embodiments may be performed using the distance D.
  • In this case, the change of the screen display size is executed on the smartphone 100 side through cooperation between the smartphone 100 and the HMD 200.
  • a process of reducing the brightness or transparency of the transmissive display 233 may be used instead. That is, the process of lowering the visibility of the book or the smartphone 100 may be performed by the HMD 200 or a smartphone linked to the HMD 200.
  • When the HMD 200 recognizes an action of bringing the screen of the smartphone 100 closer, it can be predicted that the user wants to enlarge the screen. Therefore, an enlarged screen of the smartphone 100 may be displayed on the transmissive display 233 of the HMD 200. This makes it possible to view the screen of the smartphone 100 while keeping the smartphone 100 at a distance. Furthermore, if the HMD 200 and the smartphone 100 are linked, the screen may instead be enlarged on the smartphone 100 side.
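As an illustration of the cooperation just described, a sketch of which device could apply the visibility reduction; the callables and the decision shape are assumptions, not APIs of any real HMD or smartphone.

```python
def apply_reduction(target: str, linked_to_smartphone: bool,
                    reduce_on_hmd, shrink_on_smartphone) -> None:
    """target: "smartphone" or "book", as recognized from the out-camera 232 image.
    reduce_on_hmd(): lower the brightness/transparency of the transmissive display 233.
    shrink_on_smartphone(): change the display size on the linked smartphone 100.
    Both callables are hypothetical stand-ins."""
    if target == "smartphone" and linked_to_smartphone:
        shrink_on_smartphone()   # cooperative case: the smartphone changes its own screen
    else:
        reduce_on_hmd()          # the HMD alone lowers the visibility of the viewed object
```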
  • FIG. 19 is a diagram showing a system for transmitting a log of a child's viewing history to a parent.
  • a smartphone 100A viewed by user A is communicatively connected to an HMD 200 worn by user B (parent), or a smartphone 100B operated by user B.
  • a communication network 300 is connected to a server 310 and wireless routers 321 and 322.
  • the smartphone 100A connects to the communication network 300 via a wireless router 321, and the HMD 200 and smartphone 100B connect to the communication network 300 via a wireless router 322.
  • the viewing log of the smartphone 100A is recorded on the server 310.
  • This viewing log includes a viewing start time, a viewing end time, and a visibility reduction process execution log during that time.
  • the viewing log is sent from the server 310 to the HMD 200 or the smartphone 100B.
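For illustration, a sketch of one possible shape for a viewing-log entry recorded on the server 310; all field names are assumptions.

```python
from dataclasses import dataclass, field
from typing import List, Tuple

@dataclass
class ViewingLogEntry:
    """One viewing session of the child's smartphone 100A (field names assumed)."""
    viewing_start: str                                  # e.g. "2022-05-27T15:00:00"
    viewing_end: str                                    # e.g. "2022-05-27T15:45:00"
    # intervals during which the visibility reduction processing was executed
    reduction_intervals: List[Tuple[str, str]] = field(default_factory=list)
```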
  • the present invention is not limited to the embodiments described above, and includes various modifications.
  • the above-described embodiments have been described in detail to explain the present invention in an easy-to-understand manner, and the present invention is not necessarily limited to having all the configurations described.
  • each of the above-mentioned configurations, functions, processing units, processing means, etc. may be partially or entirely realized by hardware, for example, by designing an integrated circuit.
  • each of the above configurations, functions, etc. may be realized by software by a processor interpreting and executing a program for realizing each function.
  • Information such as the programs, tables, and files that realize each function may be stored in a memory, in a recording device such as a hard disk or SSD (Solid State Drive), or on a recording medium such as an IC card, SD card, or DVD, or may be stored in a device on a communication network.
  • control lines and information lines are shown that are considered necessary for explanation, and not all control lines and information lines are necessarily shown in the product. In reality, almost all components may be considered to be interconnected.
  • the embodiment includes the following embodiments.
  • In the mobile information terminal, the processor recognizes the user's face or the user's eyes based on the image captured by the camera, acquires from the distance sensor the distance to the user's face or the user's eyes, acquires from the timer a first elapsed time from when the distance to the user's face or the user's eyes became equal to or less than a predetermined visual acuity deterioration determination distance, and, when the first elapsed time becomes equal to or greater than a predetermined start determination time, performs visibility reduction control that reduces the visibility of the display.
  • A mobile information terminal in which the processor recognizes the user based on the image captured by the camera, acquires from the distance sensor the distance to the user, acquires from the timer a first elapsed time from when the distance to the user became equal to or less than a predetermined visual acuity deterioration determination distance, performs visibility reduction control that reduces the visibility of the display when the first elapsed time becomes equal to or greater than a predetermined start determination time, acquires from the timer a second elapsed time from when the distance to the user exceeds the visual acuity deterioration determination distance, and, when the second elapsed time becomes equal to or greater than a predetermined end determination time, ends the visibility reduction control to return the display to normal display.
  • A display control method in a mobile information terminal, the mobile information terminal including a display, a camera, a distance sensor, a timer, and a processor, in which the processor recognizes the user's face or the user's eyes based on the image captured by the camera, acquires from the distance sensor the distance to the user's face or the user's eyes, acquires from the timer a first elapsed time from when the distance to the user's face or the user's eyes became equal to or less than a predetermined visual acuity deterioration determination distance, and, when the first elapsed time becomes equal to or greater than a predetermined start determination time, performs visibility reduction control that reduces the visibility of the display.

Abstract

This portable information terminal recognizes the face of a user or an eye of the user on the basis of an image captured by a camera mounted thereto, acquires the distance from a distance sensor to a face of the user or to the eye of the user, and performs visibility reduction control for reducing the visibility of a display if a first elapsed time, from when the distance had become equal to or less than a predetermined eyesight reduction criterion distance, has become equal to or more than a start criterion time. In addition, if a second elapsed time, from when the distance to the face of the user or to the eye of the user had become more than the eyesight reduction criterion distance, has become equal to or more than an end criterion time, the visibility reduction control is ended and a normal display is restored.

Description

Mobile information terminal and display control method for mobile information terminal

 The present invention relates to an information display device and a display control method, and more particularly to a mobile information terminal, and a display control method for a mobile information terminal, that call attention to the effect of screen viewing on visual acuity.

 Viewing the screen of a mobile information terminal such as a smartphone for a long time raises concerns about deterioration of visual acuity. In particular, young people such as children can easily focus their eyes even at very short distances, so they become absorbed in games, videos, and the like, strain their eyes, and suffer a loss of visual acuity before they notice it. The shorter the distance, the greater the effect.

 Children watching television from too close has long been regarded as a problem, and as an example of a countermeasure, Patent Document 1 discloses a screen proximity warning device comprising "a distance measuring means for detecting the distance to a human body present in a predetermined range in front of a display device, and an alarm control means for outputting an alarm when the measured distance detected by the distance measuring means becomes equal to or less than a preset limit distance" (summary excerpt).

 Japanese Patent Application Publication No. 2010-44607
 Since a television set is large and heavy, its position does not move during viewing, and viewers rarely move around while watching television. Therefore, as in Patent Document 1, the distance between the viewer's position and the television screen does not change very often, so it is possible to judge whether an alarm should be output by using the distance between the viewer's position and the television screen.

 In contrast, with a mobile information terminal such as a smartphone, which is held in the hand and operated while the screen is viewed, the distance between the screen and the eyes changes easily simply by moving the hand holding the smartphone, even if the person's position does not change. The screen proximity warning device of Patent Document 1, which performs control based on the distance between a person's position and the screen, therefore cannot be applied to mobile information terminals, and a technique for preventing visual acuity deterioration that is suitable for mobile information terminals is desired.

 The present invention has been made in view of the above problems, and an object thereof is to provide a mobile information terminal, and a display control method for a mobile information terminal, that keep the distance between the eyes and the screen appropriate.

 In order to achieve the above object, the present invention includes the configurations described in the claims. To give one example, the present invention is a mobile information terminal including a display, a camera, a distance sensor, a timer, and a processor. The processor recognizes the user's face or the user's eyes based on an image captured by the camera, acquires from the distance sensor the distance to the user's face or the user's eyes, acquires from the timer a first elapsed time from when the distance to the user's face or the user's eyes became equal to or less than a predetermined visual acuity deterioration determination distance, and performs visibility reduction control that lowers the visibility of the display when the first elapsed time becomes equal to or longer than a predetermined start determination time. The processor also acquires from the timer a second elapsed time from when the distance to the user's face or the user's eyes exceeds the visual acuity deterioration determination distance, and ends the visibility reduction control and returns the display to normal display when the second elapsed time becomes equal to or longer than a predetermined end determination time.

 According to the present invention, it is possible to provide a mobile information terminal, and a display control method for a mobile information terminal, that keep the distance between the eyes and the screen appropriate. Objects, configurations, and effects other than those described above will become clear from the following description of the embodiments.
 FIG. 1A is a front view of the smartphone. FIG. 1B is a rear view of the smartphone. FIG. 2 is a hardware configuration diagram of the smartphone. FIG. 3 is a functional block diagram according to the present embodiment. FIG. 4A is an explanatory diagram of the distance measured in the present embodiment. FIG. 4B is a diagram showing an overview of the visual acuity deterioration prevention mode according to the present embodiment. FIG. 5 is a flowchart showing the flow of processing in the first embodiment. FIG. 6 is a flowchart showing the flow of the visual acuity deterioration prevention processing in the second embodiment. FIG. 7 is a diagram showing an example of a screen (size change) in which the visibility reduction intensity is changed according to the distance to the eyes. FIG. 8 is a diagram showing an example of a screen (change in degree of blur) in which the visibility reduction intensity is changed according to the distance to the eyes. FIG. 9 is a diagram showing a distance/start determination time table. FIG. 10 is a flowchart showing the flow of processing according to the third embodiment. FIG. 11 is a diagram showing the relationship between the execution time of the visibility reduction processing and the visibility reduction intensity. FIG. 12 is a flowchart showing the flow of processing according to the fourth embodiment. FIG. 13 is a flowchart showing the flow of processing according to the fifth embodiment. FIG. 14 is a diagram showing another example of the screen during execution of the visual acuity deterioration prevention processing. FIG. 15 is a diagram showing another example of the hardware. FIG. 16 is an external view of the head-mounted display. FIG. 17 is a hardware configuration diagram of the head-mounted display. FIG. 18 is a diagram showing an example of visual acuity deterioration prevention processing using the head-mounted display. FIG. 19 is a diagram showing a system for transmitting a log of a child's viewing history to a parent.
 Hereinafter, embodiments of a mobile information terminal and a display control method therefor according to the present invention will be described with reference to the drawings. The same elements are denoted by the same reference numerals throughout the drawings, and redundant description is omitted.

 Since the present invention can be expected to contribute to preventing the deterioration of visual acuity caused by viewing mobile information terminals, it can be expected to contribute to Goal 3 (Good Health and Well-Being) of the Sustainable Development Goals (SDGs) advocated by the United Nations.

 In the present embodiment, a smartphone is described as a specific example of the mobile information terminal, but the mobile information terminal is not limited to a smartphone as long as it is a terminal whose screen is viewed while it is held in the hand and operated. For example, a tablet is also an example of a mobile information terminal to which the present invention can be applied. The present invention can also be applied to wearable terminals that do not need to be held in the hand, such as head-mounted displays and smart glasses.
 FIG. 1A is a front view of the smartphone. FIG. 1B is a rear view of the smartphone.

 As shown in FIG. 1A, the front of the smartphone 100 is provided with an in-camera 131, a display 133, a speaker 141, and a distance sensor 153.

 As shown in FIG. 1B, an out-camera 132 is provided on the back of the smartphone 100.

 FIG. 2 is a hardware configuration diagram of the smartphone.

 As shown in FIG. 2, the smartphone 100 includes a main processor 101, a storage device 110, an input interface (I/F) 120, an image processing device 130, an audio processing device 140, a sensor group 150, a communication I/F 160, an expansion I/F 172, and a timer 173, which are connected to each other via the bus 102.

 The storage device 110 includes a RAM 111, a ROM 112, and a flash memory 113.

 The input I/F 120 includes a button switch 121 and a touch panel 122.

 The image processing device 130 includes the in-camera 131, the out-camera 132, and the display 133.

 The audio processing device 140 includes the speaker 141 and a microphone 142.

 The sensor group 150 includes a GPS receiver 151, a geomagnetic sensor 152, the distance sensor 153, an acceleration sensor 154, a gyro sensor 155, and a biological information acquisition sensor 156.

 The communication I/F 160 includes a wireless communication I/F 161 and a telephone network communication I/F 162.
 FIG. 3 is a functional block diagram according to the present embodiment.

 The smartphone 100 includes a face recognition section 101a, a visibility reduction processing section 101b, a timekeeping section 101c, a user authentication section 101d, and a mode setting section 101e. These are realized by the main processor 101 loading a visual acuity deterioration prevention program into the RAM 111 and executing it. Details of the functions of each section will be described later.

 The flash memory 113 includes a various-table storage section 113a and a user registration information storage section 113b. The various-table storage section 113a records the tables necessary for executing the visual acuity deterioration prevention program. The user registration information stored in the user registration information storage section 113b is used when the visual acuity deterioration prevention program is executed on a shared terminal, which will be described later.

 FIG. 4A is an explanatory diagram of the distance measured in the present embodiment. FIG. 4B is a diagram showing an overview of the visual acuity deterioration prevention mode according to the present embodiment.

 As shown in FIG. 4A, in the present embodiment, the distance D from the display 133 of the smartphone 100 to the user's face or the user's eyes is measured, and the operation in the visual acuity deterioration prevention mode is performed based on this distance D. A mode may also be adopted in which the subject is simply recognized as the user (a person) without performing recognition of the user's face or eyes.

 As shown in FIG. 4B, when the state in which the distance D is equal to or less than a predetermined distance (Dth: visual acuity deterioration determination distance), that is, the near state, continues for a certain period of time (TS: start determination time), the main processor 101 executes visibility reduction processing on the screen of the display 133 (also referred to as visibility reduction control). The visibility reduction processing blurs the screen or reduces the display size.

 When the state in which the distance D is greater than the visual acuity deterioration determination distance Dth continues for a certain period of time (TE: end determination time), the main processor 101 ends the visibility reduction processing on the screen of the display 133 and returns the screen to the normal screen. As a result, the display 133 returns to the display state it had before the visibility reduction processing was performed.
<第1実施形態>
 第1実施形態は、一人のユーザが操作するスマートフォンにおいて視力低下防止処理を実行する実施形態である。図5は、第1実施形態の処理の流れを示すフローチャートである。
<First embodiment>
The first embodiment is an embodiment in which visual acuity reduction prevention processing is performed on a smartphone operated by one user. FIG. 5 is a flowchart showing the process flow of the first embodiment.
 スマートフォン100において視認性低下防止モードがオンになると、インカメラ131の画像に対して顔認識部101aが顔認識処理を実行し、ユーザの顔あるいは目の位置を認識し、距離センサ153で顔あるいは目までの距離Dを計測する(S01、図4参照)。 When the visibility reduction prevention mode is turned on in the smartphone 100, the face recognition unit 101a executes face recognition processing on the image of the in-camera 131, recognizes the user's face or eye position, and uses the distance sensor 153 to detect the face or Measure the distance D to the eye (S01, see FIG. 4).
 視認性低下処理部101bは、距離Dと視力低下判定距離Dthとを比較し、距離Dが視力低下判定距離Dthよりも大きい、即ち、ユーザAの目がスマートフォン100のディスプレイ133よりも視力低下判定距離Dthより遠いと判断すると(S02:NO)ステップS01へ戻り、距離Dの計測を続ける。 The visibility reduction processing unit 101b compares the distance D and the visual acuity deterioration determination distance Dth, and determines that the distance D is larger than the visual acuity deterioration determination distance Dth, that is, the eyesight of the user A is lower than the visual acuity deterioration determination distance Dth. If it is determined that it is farther than the distance Dth (S02: NO), the process returns to step S01 and continues measuring the distance D.
 一方視認性低下処理部101bは、距離Dが視力低下判定距離Dth以下、即ち、ユーザAの目がスマートフォン100のディスプレイ133と視力低下判定距離Dthと同じ又はそれよりも近いと判断すると(S02:YES)、距離Dが視力低下判定距離Dth以下となってからの第1経過時間T1の計測を開始する(S03)。 On the other hand, when the visibility reduction processing unit 101b determines that the distance D is less than or equal to the visual acuity deterioration determination distance Dth, that is, the user A's eyes are the same as or closer than the visual acuity deterioration determination distance Dth to the display 133 of the smartphone 100 (S02: YES), measurement of the first elapsed time T1 after the distance D becomes equal to or less than the visual acuity deterioration determination distance Dth is started (S03).
 ハードウェアとしてのタイマー171を備えて、それを起動して経過時間及び後述する実行時間を計測してもよいし、タイマー171をソフトウェアとして構成してもよい。この場合は、RTCから計時部101cが計測開始時の時刻と計測終了時の時刻とを取得し、その差分から経過時間及び実行時間を計測すればよい。 A timer 171 as hardware may be provided, and the timer 171 may be activated to measure the elapsed time and the execution time described below, or the timer 171 may be configured as software. In this case, the timer 101c may acquire the time at the start of measurement and the time at the end of measurement from the RTC, and measure the elapsed time and execution time from the difference between them.
 視認性低下処理部101bは、第1経過時間T1が開始判定時間TSより短い場合は(S04:NO)、ステップS01へ戻り距離の計測と第1経過時間T1のカウントを継続する。 If the first elapsed time T1 is shorter than the start determination time TS (S04: NO), the visibility reduction processing unit 101b returns to step S01 and continues measuring the distance and counting the first elapsed time T1.
 視認性低下処理部101bは、第1経過時間T1が開始判定時間TSと同じかそれよりも長くなると(S04:YES)、メインプロセッサ101は、第1経過時間T1を初期化し(S05)、視認性低下処理を開始する(S06)。 When the first elapsed time T1 is equal to or longer than the start determination time TS (S04: YES), the main processor 101 initializes the first elapsed time T1 (S05) and The sex reduction process is started (S06).
 スマートフォン100は、距離Dの計測を継続し(S07)、視認性低下処理部101bは、距離Dと視力低下判定距離Dthとの比較を行う(S08)。 The smartphone 100 continues measuring the distance D (S07), and the visibility reduction processing unit 101b compares the distance D with the visual acuity reduction determination distance Dth (S08).
 視認性低下処理部101bは、距離Dが視力低下判定距離Dth以下であると判断すると(S08:YES)、ステップS06へ戻り視認性低下処理を継続する。 When the visibility reduction processing unit 101b determines that the distance D is less than or equal to the visual acuity reduction determination distance Dth (S08: YES), the process returns to step S06 and continues the visibility reduction processing.
 視認性低下処理部101bは、距離Dが視力低下判定距離Dthよりも遠いと判断すると(S08:NO)、距離Dが視力低下判定距離Dth以下となってからの経過時間(第2経過時間T2)の計測を開始する(S09)。 When the visibility reduction processing unit 101b determines that the distance D is farther than the visual acuity reduction determination distance Dth (S08: NO), the visibility reduction processing unit 101b determines the elapsed time (second elapsed time T2) after the distance D becomes equal to or less than the visual acuity reduction determination distance Dth. ) measurement is started (S09).
 視認性低下処理部101bは、第2経過時間T2が終了判定時間TEより短いと判断すると(S10:NO)、ステップS06へ戻り視認性低下処理を継続する。 When the visibility reduction processing unit 101b determines that the second elapsed time T2 is shorter than the end determination time TE (S10: NO), the process returns to step S06 and continues the visibility reduction processing.
 視認性低下処理部101bは、第2経過時間T2が終了判定時間TE以上となると(S10:YES)、視認性低下処理を停止して通常表示に切り替えると共に第2経過時間T2を初期化する(S11)。 When the second elapsed time T2 becomes equal to or greater than the end determination time TE (S10: YES), the visibility reduction processing unit 101b stops the visibility reduction processing, switches to normal display, and initializes the second elapsed time T2 ( S11).
According to the present embodiment, when the start determination time TS has elapsed after the distance between the eyes and the mobile information terminal falls to or below the visual acuity deterioration determination distance (the distance at which deterioration of visual acuity becomes a concern), the visibility reduction process is performed and the user is alerted that the display is too close. Since the visibility of the display is reduced, the user must move the display away from the eyes to restore display quality. Merely notifying the user that the screen is close to the eyes offers little prospect of improving the eye-to-display distance; according to the present embodiment, however, restoring the visibility of the screen requires moving the terminal farther away, so an appropriate distance between the user's eyes and the display can be maintained.

In addition, because a mobile information terminal is held in the hand during operation, the distance D to the eyes changes more easily than with a stationary display. According to the present embodiment, the visibility reduction process is not performed unless the state in which the distance D is at or below the visual acuity deterioration determination distance Dth lasts for the start determination time TS or longer, and the display is not returned to the normal screen unless the state in which the distance D is beyond the visual acuity deterioration determination distance Dth lasts for the end determination time TE or longer. This prevents the visibility reduction process from being triggered frequently by hand shake or slight changes in posture, while still keeping the distance between the eyes and the mobile information terminal appropriate.
<Second embodiment>
The second embodiment varies the visibility reduction strength according to the distance between the eyes and the mobile information terminal during the visual acuity deterioration prevention process. FIG. 6 is a flowchart showing the flow of the visual acuity deterioration prevention process in the second embodiment. Steps identical to those in the process flow of the first embodiment shown in FIG. 5 are given the same reference numerals, and redundant explanation is omitted.
When the first elapsed time T1 from the point at which the distance D became equal to or less than the visual acuity deterioration determination distance Dth becomes equal to or longer than the start determination time TS, the visibility reduction processing unit 101b stops measuring the first elapsed time T1 (S05) and sets a visibility reduction strength according to the distance D (S11). Within the range at or below the visual acuity deterioration determination distance Dth, the smaller the distance D, the more the visibility of the display 133 is reduced, that is, the stronger the visibility reduction strength is set.

The visibility reduction processing unit 101b starts the visibility reduction process according to the determined visibility reduction strength (S06).

The distance D is measured during the visibility reduction process (S07); the visibility reduction processing unit 101b resets the visibility reduction strength according to the distance D (S12) and re-executes the visibility reduction process according to the reset visibility reduction strength (S13). Depending on the performance and processing load of the control unit, the time required from the distance measurement to the setting of the visibility reduction strength may exceed one frame, and the visibility reduction strength may instead be updated at predetermined frame intervals in order to reduce the processing load and power consumption. The update interval of the visibility reduction strength is not particularly limited.

If the visibility reduction strength is not updated, the visibility reduction strength of the previous frame is carried over.

The distance measurement interval may differ between the period during the visual acuity deterioration prevention process and other periods. For example, the interval may be every 10 frames outside the visual acuity deterioration prevention process and every frame during the visibility reduction process. Before the visibility reduction process starts, high responsiveness is not required, so priority may be given to reducing the processing load.

When the visibility reduction processing unit 101b determines that the distance D is equal to or less than the visual acuity deterioration determination distance Dth (S08: YES), the process returns to step S07 and repeats the measurement of the distance D, the resetting of the visibility reduction strength (S12), and the re-execution of the visibility reduction process (S13). This loop continues until the visibility reduction processing unit 101b determines that the distance D is greater than the visual acuity deterioration determination distance Dth (S08: NO). Thereafter, measurement of the second elapsed time T2 is started (S09), and the second elapsed time T2 is compared with the end determination time TE (S10). If the second elapsed time T2 is shorter than the end determination time TE (S10: NO), the process returns to step S07. If the second elapsed time T2 is equal to or longer than the end determination time TE (S10: YES), the process advances to step S11, switches to normal display, and stops measuring the second elapsed time T2 (S11). The process then ends.
FIG. 7 is a diagram showing an example screen (size change) in which the visibility reduction strength is varied according to the distance to the eyes.

In FIG. 7, the size of the display screen is changed according to the visibility reduction strength. That is, as shown in the distance-screen-size table 1333, when the distance D to the eyes is equal to or less than the visual acuity deterioration determination distance Dth, the screen is displayed as a small screen 1332 that is smaller than the standard-size normal screen 1331, and the screen size becomes smaller as the distance D decreases. When the distance D is equal to or greater than the visual acuity deterioration determination distance Dth, the normal screen 1331 is displayed regardless of the distance D.
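As an illustration of the kind of mapping the distance-screen-size table 1333 could hold, the following Kotlin sketch scales the display area linearly between a minimum size and the normal size. The specific threshold, minimum scale, and linear interpolation are assumptions for illustration only, not values taken from the embodiment.

```kotlin
// Hypothetical mapping from eye-to-screen distance D (cm) to a screen scale factor.
// At or above Dth the normal screen (scale 1.0) is used; below Dth the screen shrinks
// linearly toward minScale as D approaches zero.
fun screenScaleFor(distanceCm: Double, dthCm: Double = 30.0, minScale: Double = 0.4): Double {
    if (distanceCm >= dthCm) return 1.0                // normal screen 1331
    val ratio = (distanceCm / dthCm).coerceIn(0.0, 1.0)
    return minScale + (1.0 - minScale) * ratio         // small screen 1332, smaller as D decreases
}
```

Under these assumed values, screenScaleFor(15.0) returns about 0.7, i.e., the screen shrinks to roughly 70 percent of its normal size when the eyes are at half the determination distance.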
The visibility reduction process may, in addition to changing the screen size, lower the brightness of the display 133 as the distance D decreases, or reduce visibility by increasing the amplitude (strength) of the vibrator and shaking the smartphone 100. Image processing using a central projection with the screen center as the reference point may also be applied to the display 133 without changing the screen size, lowering the visibility at the center of the screen and thereby reducing overall visibility. In that case, the center of the central projection may be set farther away as the distance D decreases. Furthermore, a tactile device using a piezoelectric element or the like may be placed where the smartphone 100 is held, and by giving the fingers or hand a pulling sensation through pressure, the smartphone 100 may be guided in the direction away from the eyes.

FIG. 8 is a diagram showing an example screen (change in degree of blur) in which the visibility reduction strength is varied according to the distance to the eyes.

In FIG. 8, when the visibility reduction strength is 0 (the minimum value), the normal screen 1331 with no blur is displayed. When the visibility reduction strength is greater than 0 (the minimum value) and less than the maximum value, for example at a medium level, a slightly blurred medium-blur screen 1334 is displayed. As the visibility reduction strength increases further, a strongly blurred screen 1335 with a high degree of blur is displayed.

According to the present embodiment, by reducing visibility more as the distance D becomes shorter during the visibility reduction process, the user can be prompted to keep the distance D at or above the visual acuity deterioration determination distance Dth.
<Third embodiment>
The third embodiment changes the start determination time TS, which must elapse before the visibility reduction process is started, according to the distance D. FIG. 9 is a diagram showing a distance/start-determination-time table.
As shown in the distance/start-determination-time table 1336 in FIG. 9, when the distance D is within the very close range, the start determination time TS may be set to 0 seconds so that the visibility reduction process starts immediately. From beyond the very close range up to the visual acuity deterioration determination distance Dth, the start determination time TS may be configured to become longer as the distance D increases.
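A distance/start-determination-time table of this kind could be expressed, for example, as a small lookup such as the following Kotlin sketch. The distance thresholds and the TS values are hypothetical and are meant only to show the shape of the mapping (TS is 0 at very close range and grows with D up to Dth).

```kotlin
// Hypothetical distance/start-determination-time table (cf. table 1336).
// Each entry maps an upper bound of a distance range (in cm) to a start
// determination time TS (in seconds). Values are illustrative only.
val distanceToStartTime = listOf(
    10.0 to 0L,   // very close range: start visibility reduction immediately
    20.0 to 3L,
    30.0 to 5L    // up to Dth: the longer the distance, the longer TS
)

fun startDeterminationTimeFor(distanceCm: Double): Long {
    for ((upperBound, ts) in distanceToStartTime) {
        if (distanceCm <= upperBound) return ts
    }
    return Long.MAX_VALUE // at or beyond Dth no visibility reduction is scheduled
}
```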
FIG. 10 is a flowchart showing the flow of processing according to the third embodiment.

As shown in FIG. 10, in the third embodiment, the distance D is measured (S01), the visibility reduction processing unit 101b compares it with the visual acuity deterioration determination distance Dth (S02), and measurement of the first elapsed time T1 is started (S03).

The visibility reduction processing unit 101b refers to the distance/start-determination-time table 1336 and determines the start determination time TS corresponding to the distance D measured in step S01 (S21).

Then, the visibility reduction processing unit 101b compares this start determination time TS with the first elapsed time T1 (S04) and determines whether the visibility reduction process needs to be executed. Thereafter, the process continues to S05 in the process flow of the first embodiment or to S11 in the process flow of the second embodiment.

According to the present embodiment, the visibility reduction process can be started immediately when the distance D is very close. In addition, even when the distance D is smaller than the visual acuity deterioration determination distance Dth, the closer the distance D is to the visual acuity deterioration determination distance Dth, the smaller the strain on the eyes, so the start determination time TS is made longer and the start of the visibility reduction process is delayed. This prevents the visibility reduction process from being executed frequently when the eyes only incidentally come close to the display, and improves operability.
<Fourth embodiment>
The fourth embodiment increases the visibility reduction strength as the execution time of the visibility reduction process becomes longer. FIG. 11 is a diagram showing the relationship between the execution time of the visibility reduction process and the visibility reduction strength.
As shown in FIG. 11, the visibility reduction strength is increased as the execution time T3 from the start of the visibility reduction process becomes longer. However, once the visibility reduction strength reaches its maximum value, the visibility of the screen no longer changes even if the execution time T3 of the visibility reduction process increases further.
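The relationship in FIG. 11, where the strength grows with the execution time T3 and saturates at a maximum, can be sketched in Kotlin as follows. The ramp duration and the maximum value are assumptions chosen only to illustrate the clamp.

```kotlin
// Hypothetical mapping from execution time T3 (seconds) to a visibility reduction
// strength in the range 0.0..maxStrength; the strength saturates at maxStrength.
fun strengthForExecutionTime(
    executionTimeSec: Long,
    rampSeconds: Long = 30L,   // assumed time to reach full strength
    maxStrength: Double = 1.0
): Double {
    val raw = executionTimeSec.toDouble() / rampSeconds.toDouble()
    return (raw * maxStrength).coerceAtMost(maxStrength)
}
```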
FIG. 12 is a flowchart showing the flow of processing according to the fourth embodiment. Steps identical to those in the process flow of the second embodiment are given the same reference numerals, and redundant explanation is omitted.

When the first elapsed time T1 from the point at which the distance D became equal to or less than the visual acuity deterioration determination distance Dth becomes equal to or longer than the start determination time TS, the visibility reduction processing unit 101b stops measuring the first elapsed time T1 (S05) and starts the visibility reduction process (S06). The visibility reduction strength at this point may be a strength determined according to the distance D as in S11, or a predetermined strength.

When the visibility reduction process starts, measurement of the execution time T3 (S31) and measurement of the distance D (S07) are started.

The visibility reduction processing unit 101b resets the visibility reduction strength according to the execution time T3 (S32) and re-executes the visibility reduction process according to the reset visibility reduction strength (S33).

When the visibility reduction processing unit 101b determines that the distance D is equal to or less than the visual acuity deterioration determination distance Dth (S08: YES), the process returns to step S31, continues measuring the execution time T3, and repeats the subsequent steps. This loop continues until the visibility reduction processing unit 101b determines that the distance D is greater than the visual acuity deterioration determination distance Dth (S08: NO). Thereafter, measurement of the second elapsed time T2 is started (S09), and the visibility reduction processing unit 101b compares the second elapsed time T2 with the end determination time TE (S10). If the second elapsed time T2 is shorter than the end determination time TE (S10: NO), the process returns to step S31. If the second elapsed time T2 is equal to or longer than the end determination time TE (S10: YES), the display is switched to normal display, and measurement of the second elapsed time T2 and the execution time T3 is stopped (S34). The process then ends.

According to the present embodiment, the visibility reduction strength is increased according to the execution time T3 of the visibility reduction process. Therefore, when the distance D does not recover to the visual acuity deterioration determination distance Dth or more even after the user has been alerted that the distance D is smaller than the visual acuity deterioration determination distance Dth, progressively strengthening the reduction in visibility makes the user more conscious of the posture needed to restore the distance D.
<Fifth embodiment>
The fifth embodiment controls, on a shared terminal, whether the visual acuity deterioration prevention mode is on or off according to the age of the user.
FIG. 13 is a flowchart showing the flow of processing according to the fifth embodiment.

The user's face is photographed with the in-camera of the smartphone 100 (S41), and the face recognition unit 101a performs face recognition processing (S42). The user authentication unit 101d checks the recognized user against user registration information that associates each user with the on/off setting of the visual acuity deterioration prevention mode for that user (S43).

If the user shown in the face image is registered in the user registration information (S43: YES), the registered mode is set (S44) and the process ends. The registered mode here refers to registered content such as turning the visual acuity deterioration prevention mode on for user A (for example, a child) and turning it off for user B (for example, a parent).

If the user is not registered (S43: NO), the mode setting unit 101e estimates the user's age based on the recognized face image (S44). When the mode setting unit 101e determines that the estimated age is equal to or less than the designated age (S45: YES), it turns the visual acuity deterioration prevention mode on (S46).

When the visual acuity deterioration prevention mode is turned on automatically, a button or the like for easily canceling the mode may be displayed. For example, a message such as "The visual acuity deterioration prevention mode has been set. To cancel it, press the cancel button and enter the passcode." may be displayed. The passcode is set in advance by a parent or guardian for canceling the mode. If no passcode has been set, the mode may simply be canceled.

On the other hand, when the mode setting unit 101e determines that the estimated age is higher than the designated age (S45: NO), it turns the visual acuity deterioration prevention mode off (S47). The designated age here is set in consideration of the ages of infants and school-age children, who are particularly susceptible to effects on visual acuity and esotropia from screen viewing.

According to the present embodiment, the visual acuity deterioration prevention mode can be turned on or off according to the content registered for each user, or, for unregistered users, according to the estimated age, so visual acuity deterioration prevention processing suited to the user can be performed even on a shared terminal.
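The decision in steps S43 to S47 could be organized as in the following Kotlin sketch. The data class, the estimateAge() placeholder, and the designated age of 12 are illustrative assumptions; the actual embodiment relies on the face recognition unit 101a, the user authentication unit 101d, and the mode setting unit 101e described above.

```kotlin
// Hypothetical registration record associating a user with the on/off setting
// of the visual acuity deterioration prevention mode.
data class RegisteredUser(val userId: String, val preventionModeOn: Boolean)

fun decidePreventionMode(
    recognizedUserId: String?,
    registeredUsers: List<RegisteredUser>,
    estimateAge: () -> Int,      // placeholder for age estimation from the face image
    designatedAge: Int = 12      // assumed threshold for illustration
): Boolean {
    // S43: check the recognized user against the registration information.
    val registered = registeredUsers.firstOrNull { it.userId == recognizedUserId }
    if (registered != null) return registered.preventionModeOn   // S44: use the registered mode
    // S44 to S47: unregistered user, so estimate the age and compare with the designated age.
    return estimateAge() <= designatedAge
}
```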
<Other modification examples>
FIG. 14 is a diagram showing another example of a screen displayed during execution of the visual acuity deterioration prevention process.
As a visual acuity deterioration prevention process, a warning message 1337a may be displayed as shown on screen 1337.

As another visual acuity deterioration prevention process, a gauge 1338a indicating the current distance D relative to the visual acuity deterioration determination distance Dth may be displayed as shown on screen 1338.

As yet another visual acuity deterioration prevention process, a screen 1339 may be used in which the optimal position of the smartphone 100 is shown as an AR display 1339b superimposed on a background 1339a captured by the out-camera 132. The configuration may be such that when the user moves the smartphone 100 away from the eyes and the real smartphone 100 overlaps the AR display 1339b, the visual acuity deterioration determination distance Dth is secured.
FIG. 15 is a diagram showing another example of the hardware.

The smartphone 100 shown in FIG. 15 includes an in-camera 131 and a distance sensor 153 embedded behind the display 133. Because the in-camera 131 and the distance sensor 153 are therefore not obstructed by a hand or the like while the display 133 is being viewed, the visual acuity deterioration prevention process can be executed stably.
Although the smartphone 100 has been described above as an example of a mobile information terminal, the present invention can also be applied to a head-mounted display. FIG. 16 is an external view of a head-mounted display, and FIG. 17 is a hardware configuration diagram of the head-mounted display.

As shown in FIG. 16, an out-camera 232 and a distance sensor 253 are provided on the outer front surface of the HMD 200.

FIG. 17 is a hardware configuration diagram of the head-mounted display.

As shown in FIG. 17, the HMD 200 includes a main processor 201, a storage device 210, an input interface (I/F) 220, an image processing device 230, an audio processing device 240, a sensor group 250, a communication I/F 260, an expansion I/F 272, and a timer 273, and these are connected to one another via a bus 202.
The storage device 210 includes a RAM 211, a ROM 212, and a flash memory 213.

The input I/F 220 includes a button switch 221. When gesture input or voice input is used, the out-camera 232 and the microphone 242 also function as the input I/F 220.

The image processing device 230 includes the out-camera 232 and a transmissive display 233.

The audio processing device 240 includes a speaker 241 and a microphone 242.

The sensor group 250 includes a GPS receiver 251, a geomagnetic sensor 252, the distance sensor 253, an acceleration sensor 254, a gyro sensor 255, and a biological information acquisition sensor 256.

The communication I/F 260 includes a wireless communication I/F 261.
FIG. 18 is a diagram showing an example of visual acuity deterioration prevention processing using the head-mounted display.

The main processor 201 of the HMD 200 performs image recognition processing on the image from the out-camera 232 and recognizes whether the user is viewing the smartphone 100 or reading a book.

Next, the distance sensor 253 measures the distance D to the smartphone 100 or the book. Processing similar to that of the above embodiments may then be performed using this distance D. In that case, the change of the screen display size is executed by the smartphone 100 in cooperation between the smartphone 100 and the HMD 200. When the display screen is blurred as the visibility reduction process in the above embodiments, processing that lowers the brightness or the transparency of the transmissive display 233 may be used instead. In other words, the process of lowering the visibility of the book or the smartphone 100 may be performed by the HMD 200 or by a smartphone linked to the HMD 200.

As another example, when the HMD 200 recognizes a motion of bringing the screen of the smartphone 100 closer, it can be expected that the user wants to view the screen enlarged. Therefore, an enlarged screen of the smartphone 100 may be displayed on the transmissive display 233 of the HMD 200. This makes it possible to view the screen of the smartphone 100 while keeping the smartphone 100 at a distance. If the HMD 200 and the smartphone 100 are linked, the screen may instead be enlarged on the smartphone 100 side.
FIG. 19 is a diagram showing a system that transmits a log of a child's viewing history to a parent.

The smartphone 100A viewed by user A (a child) is communicatively connected, via a communication network 300, to the HMD 200 worn by user B (a parent) or to the smartphone 100B operated by user B. A server 310 and wireless routers 321 and 322 are connected to the communication network 300. The smartphone 100A connects to the communication network 300 via the wireless router 321, and the HMD 200 and the smartphone 100B connect to the communication network 300 via the wireless router 322.

The viewing log of the smartphone 100A is recorded on the server 310. This viewing log includes the viewing start time, the viewing end time, and the execution log of the visibility reduction process during that period.

When the HMD 200 or the smartphone 100B sends a viewing-log transmission request to the server 310, the server 310 transmits the viewing log to the HMD 200 or the smartphone 100B.

This allows user B (the parent) to know not only the viewing time of user A (the child) but also the execution history of the visibility reduction process during that time, and thereby to infer the child's viewing posture and the effect on the child's eyesight, which are not reflected in the viewing time alone.
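One way to carry the information described above in the viewing log would be a record such as the following Kotlin sketch. The field names and the summary function are hypothetical, and the transport between the smartphone 100A, the server 310, and the parent's device is not specified here.

```kotlin
import java.time.Duration
import java.time.Instant

// Hypothetical viewing-log entry recorded on the server 310 for smartphone 100A.
data class ViewingLogEntry(
    val viewingStart: Instant,
    val viewingEnd: Instant,
    val visibilityReductionEvents: List<Instant>  // times at which the reduction process ran
)

// On the parent's side (HMD 200 or smartphone 100B): total viewing time and how often
// the visibility reduction process was triggered, which viewing time alone does not show.
fun summarize(entries: List<ViewingLogEntry>): Pair<Duration, Int> {
    val totalViewing = entries.fold(Duration.ZERO) { acc, e ->
        acc + Duration.between(e.viewingStart, e.viewingEnd)
    }
    val reductionCount = entries.sumOf { it.visibilityReductionEvents.size }
    return totalViewing to reductionCount
}
```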
Note that the present invention is not limited to the embodiments described above and includes various modifications. For example, the above embodiments have been described in detail in order to explain the present invention in an easy-to-understand manner, and the invention is not necessarily limited to configurations having all of the described elements. It is also possible to replace part of the configuration of one embodiment with the configuration of another embodiment, and to add the configuration of another embodiment to the configuration of one embodiment. Furthermore, part of the configuration of each embodiment can be added to, deleted from, or replaced with another configuration.

Each of the above configurations, functions, processing units, processing means, and the like may be realized in hardware, in part or in whole, for example by designing them as an integrated circuit. Each of the above configurations, functions, and the like may also be realized in software by a processor interpreting and executing a program that implements each function. Information such as programs, tables, and files that realize each function may be stored in a memory, in a recording device such as a hard disk or SSD (Solid State Drive), in a recording medium such as an IC card, SD card, or DVD, or in a device on a communication network.

The control lines and information lines shown are those considered necessary for the explanation, and not all control lines and information lines in the product are necessarily shown. In practice, almost all components may be considered to be interconnected.

The embodiments described above include the following aspects.
(Additional note 1)
A mobile information terminal comprising:
a display;
a camera;
a distance sensor;
a timer; and
a processor,
wherein the processor:
recognizes a user's face or the user's eyes based on an image captured by the camera;
acquires a distance from the distance sensor to the user's face or the user's eyes;
acquires from the timer a first elapsed time from when the distance to the user's face or the user's eyes becomes equal to or less than a predetermined visual acuity deterioration determination distance, and performs visibility reduction control that reduces the visibility of the display when the first elapsed time becomes equal to or greater than a predetermined start determination time; and
acquires from the timer a second elapsed time from when the distance to the user's face or the user's eyes exceeds the visual acuity deterioration determination distance, and ends the visibility reduction control and returns the display to normal display when the second elapsed time becomes equal to or greater than a predetermined end determination time.
(Additional note 2)
A mobile information terminal comprising:
a display;
a camera;
a distance sensor;
a timer; and
a processor,
wherein the processor:
recognizes a user based on an image captured by the camera;
acquires a distance from the distance sensor to the user;
acquires from the timer a first elapsed time from when the distance to the user becomes equal to or less than a predetermined visual acuity deterioration determination distance, and performs visibility reduction control that reduces the visibility of the display when the first elapsed time becomes equal to or greater than a predetermined start determination time; and
acquires from the timer a second elapsed time from when the distance to the user exceeds the visual acuity deterioration determination distance, and ends the visibility reduction control and returns the display to normal display when the second elapsed time becomes equal to or greater than a predetermined end determination time.
(Additional note 3)
A display control method for a mobile information terminal, the mobile information terminal comprising a display, a camera, a distance sensor, a timer, and a processor, the method comprising causing the processor to execute:
a step of recognizing a user's face or the user's eyes based on an image captured by the camera, and acquiring a distance from the distance sensor to the user's face or the user's eyes;
a step of acquiring from the timer a first elapsed time from when the distance to the user's face or the user's eyes becomes equal to or less than a predetermined visual acuity deterioration determination distance, and starting visibility reduction control that reduces the visibility of the display when the first elapsed time becomes equal to or greater than a predetermined start determination time; and
a step of acquiring from the timer a second elapsed time from when the distance to the user's face or the user's eyes exceeds the visual acuity deterioration determination distance, and ending the visibility reduction control and returning the display to normal display when the second elapsed time becomes equal to or greater than a predetermined end determination time.
100: smartphone, 100A: smartphone, 100B: smartphone, 101: main processor, 101a: face recognition unit, 101b: visibility reduction processing unit, 101c: timekeeping unit, 101d: user authentication unit, 101e: mode setting unit, 102: bus, 110: storage device, 111: RAM, 112: ROM, 113: flash memory, 113a: table storage unit, 113b: user registration information storage unit, 120: input I/F, 121: button switch, 122: touch panel, 130: image processing device, 131: in-camera, 132: out-camera, 133: display, 140: audio processing device, 141: speaker, 142: microphone, 150: sensor group, 151: GPS receiver, 152: geomagnetic sensor, 153: distance sensor, 154: acceleration sensor, 155: gyro sensor, 156: biological information acquisition sensor, 160: communication I/F, 161: wireless communication I/F, 162: telephone network communication I/F, 171: timer, 172: expansion I/F, 173: timer, 200: HMD, 201: main processor, 202: bus, 210: storage device, 211: RAM, 212: ROM, 213: flash memory, 220: input I/F, 221: button switch, 230: image processing device, 232: out-camera, 233: transmissive display, 240: audio processing device, 241: speaker, 242: microphone, 250: sensor group, 251: GPS receiver, 252: geomagnetic sensor, 253: distance sensor, 254: acceleration sensor, 255: gyro sensor, 256: biological information acquisition sensor, 260: communication I/F, 261: wireless communication I/F, 272: expansion I/F, 273: timer, 300: communication network, 310: server, 321: wireless router, 322: wireless router, 1331: normal screen, 1332: small screen, 1333: distance-screen-size table, 1334: screen, 1335: screen, 1336: distance/start-determination-time table, 1337: screen, 1337a: warning message, 1338: screen, 1338a: gauge, 1339: screen, 1339a: background, 1339b: AR display, A: user, B: user, D: distance, Dth: visual acuity deterioration determination distance, T1: first elapsed time, T2: second elapsed time, T3: execution time, TE: end determination time, TS: start determination time

Claims (12)

1. A mobile information terminal comprising:
a display;
a camera;
a distance sensor;
a timer; and
a processor,
wherein the processor:
recognizes a user's face or the user's eyes based on an image captured by the camera;
acquires a distance from the distance sensor to the user's face or the user's eyes;
acquires from the timer a first elapsed time from when the distance to the user's face or the user's eyes becomes equal to or less than a predetermined visual acuity deterioration determination distance, and performs visibility reduction control that reduces the visibility of the display when the first elapsed time becomes equal to or greater than a predetermined start determination time; and
acquires from the timer a second elapsed time from when the distance to the user's face or the user's eyes exceeds the visual acuity deterioration determination distance, and ends the visibility reduction control and returns the display to normal display when the second elapsed time becomes equal to or greater than a predetermined end determination time.
2. The mobile information terminal according to claim 1, wherein the processor changes a visibility reduction strength of the display according to the distance to the user's face or the user's eyes during the visibility reduction control.
3. The mobile information terminal according to claim 2, wherein the visibility reduction strength is changed by changing at least one of a screen size of the display, a degree of blur of a screen of the display, and an intensity of a vibrator.
4. The mobile information terminal according to claim 1, wherein the processor changes the start determination time according to the distance to the user's face or the user's eyes.
5. The mobile information terminal according to claim 1, wherein the processor acquires an execution time of the visibility reduction control from the timer and changes a visibility reduction strength of the display according to the execution time.
6. The mobile information terminal according to claim 1, wherein the processor:
checks the user recognized from the image against registration information that associates each user of the mobile information terminal with an on or off setting of the visibility reduction control;
controls the visibility reduction control on or off according to the registration information when the recognized user matches a user registered in the registration information; and
estimates the age of the recognized user when the recognized user does not match any user registered in the registration information, and sets the visibility reduction control to on if the estimated age is equal to or less than a predesignated age.
7. A mobile information terminal comprising:
a display;
a camera;
a distance sensor;
a timer; and
a processor,
wherein the processor:
recognizes a user based on an image captured by the camera;
acquires a distance from the distance sensor to the user;
acquires from the timer a first elapsed time from when the distance to the user becomes equal to or less than a predetermined visual acuity deterioration determination distance, and performs visibility reduction control that reduces the visibility of the display when the first elapsed time becomes equal to or greater than a predetermined start determination time; and
acquires from the timer a second elapsed time from when the distance to the user exceeds the visual acuity deterioration determination distance, and ends the visibility reduction control and returns the display to normal display when the second elapsed time becomes equal to or greater than a predetermined end determination time.
8. The mobile information terminal according to claim 7, wherein the recognition is of the user's face or eyes, and the distance to the user is a distance from the distance sensor to the user's face or eyes.
9. The mobile information terminal according to claim 8, wherein a visibility reduction strength of the display is changed by changing at least one of a screen size of the display, a degree of blur of a screen of the display, and an intensity of a vibrator.
10. The mobile information terminal according to claim 7, wherein the processor changes the start determination time according to the distance to the user's face or the user's eyes.
11. The mobile information terminal according to claim 7, wherein the processor acquires from the timer an execution time of the visibility reduction control that reduces the visibility of the display, and changes a visibility reduction strength of the display according to the execution time.
12. A display control method for a mobile information terminal, the mobile information terminal comprising a display, a camera, a distance sensor, a timer, and a processor, the method comprising causing the processor to execute:
a step of recognizing a user's face or the user's eyes based on an image captured by the camera, and acquiring a distance from the distance sensor to the user's face or the user's eyes;
a step of acquiring from the timer a first elapsed time from when the distance to the user's face or the user's eyes becomes equal to or less than a predetermined visual acuity deterioration determination distance, and starting visibility reduction control that reduces the visibility of the display when the first elapsed time becomes equal to or greater than a predetermined start determination time; and
a step of acquiring from the timer a second elapsed time from when the distance to the user's face or the user's eyes exceeds the visual acuity deterioration determination distance, and ending the visibility reduction control and returning the display to normal display when the second elapsed time becomes equal to or greater than a predetermined end determination time.
PCT/JP2022/021661 2022-05-27 2022-05-27 Portable information terminal and display control method for portable information terminal WO2023228391A1 (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
PCT/JP2022/021661 WO2023228391A1 (en) 2022-05-27 2022-05-27 Portable information terminal and display control method for portable information terminal

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
PCT/JP2022/021661 WO2023228391A1 (en) 2022-05-27 2022-05-27 Portable information terminal and display control method for portable information terminal

Publications (1)

Publication Number Publication Date
WO2023228391A1 true WO2023228391A1 (en) 2023-11-30

Family

ID=88918785

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2022/021661 WO2023228391A1 (en) 2022-05-27 2022-05-27 Portable information terminal and display control method for portable information terminal

Country Status (1)

Country Link
WO (1) WO2023228391A1 (en)

Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN107480497A (en) * 2017-07-31 2017-12-15 广东欧珀移动通信有限公司 Mobile terminal and child mode implementation method, computer-readable recording medium
JP2019523514A (en) * 2016-07-21 2019-08-22 ビジョンアップ ソルーションズ エス.エル. System and method for preventing a reduction in visual acuity caused by proximity work in an apparatus having an electronic screen
JP2019530044A (en) * 2016-07-04 2019-10-17 シンガポール ヘルス サービシーズ ピーティーイー リミテッド Apparatus and method for monitoring device usage
KR20200046526A (en) * 2018-10-24 2020-05-07 이차주 Screen control device and method for sight protection

Patent Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2019530044A (en) * 2016-07-04 2019-10-17 シンガポール ヘルス サービシーズ ピーティーイー リミテッド Apparatus and method for monitoring device usage
JP2019523514A (en) * 2016-07-21 2019-08-22 ビジョンアップ ソルーションズ エス.エル. System and method for preventing a reduction in visual acuity caused by proximity work in an apparatus having an electronic screen
CN107480497A (en) * 2017-07-31 2017-12-15 广东欧珀移动通信有限公司 Mobile terminal and child mode implementation method, computer-readable recording medium
KR20200046526A (en) * 2018-10-24 2020-05-07 이차주 Screen control device and method for sight protection

Similar Documents

Publication Publication Date Title
US10891953B2 (en) Multi-mode guard for voice commands
US10324294B2 (en) Display control device, display control method, and computer program
US11340560B2 (en) Information processing apparatus, control method, and program
JP6549693B2 (en) Wearable device, control method and control program
WO2014084224A1 (en) Electronic device and line-of-sight input method
JP6750697B2 (en) Information processing apparatus, information processing method, and program
US20170163866A1 (en) Input System
US20240077725A1 (en) Glasses-type wearable information device, method for glasses-type wearable information device, and storage medium
CN113301247A (en) Voice input device, control method thereof and storage medium
US11394862B2 (en) Voice input apparatus, control method thereof, and storage medium for executing processing corresponding to voice instruction
US20240062583A1 (en) Electronic apparatus and method for controlling the same
WO2023228391A1 (en) Portable information terminal and display control method for portable information terminal
US20220334648A1 (en) Wearable information terminal, control method thereof, and storage medium
US20230018866A1 (en) Electronic device that displays a plurality of display items on a display and method for controlling electronic device
JP2013115633A (en) Information terminal and control program for the same
KR20170084443A (en) Method for controlling head mount display device
US11735181B2 (en) Voice input apparatus, control method thereof, and storage medium for executing processing corresponding to voice instruction
US11442244B2 (en) Imaging device
CN111176601B (en) Processing method and device
JP2015136030A (en) Imaging apparatus and electronic apparatus
US11556173B2 (en) Electronic apparatus, method for controlling the same, and storage medium
WO2022158280A1 (en) Imaging device, imaging control method, and program
CN115268627A (en) Intelligent glasses information processing method, intelligent glasses and computer readable storage medium
CN111556242A (en) Screen providing method and electronic device supporting the same
US20170160910A1 (en) Method and electronic device for screen adjustment

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 22943791

Country of ref document: EP

Kind code of ref document: A1