WO2014162570A1 - Display control device, display control method, display control program, and computer-readable recording medium - Google Patents

Display control device, display control method, display control program, and computer-readable recording medium

Info

Publication number
WO2014162570A1
WO2014162570A1 · PCT/JP2013/060388 · JP2013060388W
Authority
WO
WIPO (PCT)
Prior art keywords
display
display control
user
movement state
unit
Prior art date
Application number
PCT/JP2013/060388
Other languages
English (en)
Japanese (ja)
Inventor
直明 堀内
Original Assignee
パイオニア株式会社
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by パイオニア株式会社 (Pioneer Corporation)
Priority to JP2015509819A (granted as patent JP6023874B2)
Priority to PCT/JP2013/060388
Publication of WO2014162570A1

Classifications

    • G - PHYSICS
    • G01 - MEASURING; TESTING
    • G01C - MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
    • G01C 21/00 - Navigation; Navigational instruments not provided for in groups G01C 1/00 - G01C 19/00
    • G01C 21/26 - Navigation; Navigational instruments specially adapted for navigation in a road network
    • G01C 21/34 - Route searching; Route guidance
    • G01C 21/36 - Input/output arrangements for on-board computers
    • G01C 21/3664 - Details of the user input interface, e.g. buttons, knobs or sliders, including those provided on a touch screen; remote controllers; input using gestures

Definitions

  • the present invention relates to a display control device, a display control method, a display control program, and a computer-readable recording medium.
  • a navigation device is conventionally known that displays a map around the user's current location on a display and guides the user.
  • a touch panel display is adopted as the display of such a navigation device, and the navigation device also displays operation buttons for accepting operations from the user on the display.
  • as prior art, there is a technique in which a user's destination is predicted from the user's movement history and an advertisement related to the predicted destination is presented to the user (see, for example, Patent Document 1 below).
  • the display control device according to the invention of claim 1 comprises: a display unit that displays the moving state of a moving body and functions as an operation receiving unit that receives user operations; and display control means that, based on the current moving state of the moving body and the user's operation history from past occasions when the moving state was the same as the current one, causes the display unit to display operation buttons for receiving future operations from the user.
  • the display control method according to the invention of claim 5 is a display control method performed by a display control apparatus comprising a display unit that displays the moving state of a moving body and functions as an operation receiving unit that receives user operations. The method includes a display control step of causing the display unit to display operation buttons for accepting future operations from the user, based on the current moving state of the moving body and the user's operation history from past occasions when the moving state was the same as the current one.
  • the display control program according to the invention of claim 6 causes a computer to execute the display control method according to claim 5.
  • a computer-readable recording medium stores the display control program according to claim 6.
  • FIG. 1 is a block diagram showing a functional configuration of a display control apparatus according to an embodiment of the present invention.
  • FIG. 2 is a flowchart illustrating an example of processing performed by the display control apparatus according to the present embodiment.
  • FIG. 3 is a block diagram illustrating a hardware configuration of the navigation apparatus according to the present embodiment.
  • FIG. 4 is an explanatory diagram illustrating an example of operation history information according to the present embodiment.
  • FIG. 5 is an explanatory diagram (part 1) illustrating an example of processing performed by the navigation device according to the present embodiment.
  • FIG. 6 is an explanatory diagram (part 2) illustrating an example of processing performed by the navigation device of the present embodiment.
  • FIG. 7 is an explanatory diagram showing a display example of operation buttons by the navigation device of the present embodiment.
  • FIG. 1 is a block diagram showing a functional configuration of a display control apparatus according to an embodiment of the present invention.
  • the display control apparatus 100 shown in FIG. 1 is used by being mounted on a moving body such as a vehicle, for example.
  • hereinafter, the moving body on which the display control device 100 is mounted is simply referred to as the "moving body", and the user of the display control device 100 is simply referred to as the "user".
  • the display control apparatus 100 may be carried and used by the user, and in this case, the moving body is the user himself.
  • the display control apparatus 100 includes a display unit 101 and a display control unit 102.
  • the display unit 101 displays the moving state of the moving body and functions as an operation receiving unit that receives user operations.
  • the display unit 101 is realized by a touch panel display.
  • the display unit 101 is controlled by the display control unit 102.
  • the display control unit 102 controls the display unit 101 to display the moving state of the moving body on the display unit 101.
  • the moving state is a state including the position of the moving body, the date and time when the moving body is moving, the attribute of the road on which the moving body is moving, and the like.
  • the display control apparatus 100 includes an acquisition unit 103.
  • the acquisition unit 103 acquires information for specifying the moving state of the moving object.
  • the acquisition unit 103 acquires information indicating the current location of the moving object, information indicating the current date and time, and map data.
  • the display control unit 102 causes the display unit 101 to display a map around the current location of the moving body based on the map data and the information indicating the current location acquired by the acquisition unit 103. The display control unit 102 then displays, at the position on the map corresponding to the current location, an icon indicating that the moving body is located there. As a result, the display control unit 102 can cause the display unit 101 to show the current location of the moving body and the attributes of the road on which it is moving (for example, information indicating whether the road is a toll road or a general road). Further, the display control unit 102 causes the display unit 101 to display the current date and time based on the information indicating the current date and time acquired by the acquisition unit 103.
  • the display control unit 102 controls the display unit 101 to cause the display unit 101 to display an operation button for accepting an operation from the user. Specifically, the display control unit 102 receives a future operation from the user based on the current movement state of the moving object and the user's operation history when the movement state is the same as the current movement state in the past. Operation buttons are displayed on the display unit 101.
  • the display control apparatus 100 includes a prediction unit 104. Based on the current movement state of the moving body and the user's operation history from past occasions when the movement state was the same as the current one, the prediction unit 104 predicts future operations from the user (operations the user is likely to desire). For example, the prediction unit 104 calculates the probability with which each operation was performed by the user on past occasions when the movement state was the same as the current one, and predicts the user's future operations based on these probabilities.
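  • as a concrete illustration of this prediction step, the following Python sketch (hypothetical; the document discloses no code) counts how often each operation appears in the stored history for occasions matching the current movement state and ranks the operations by relative frequency:

```python
from collections import Counter

def predict_operations(history, current_state):
    """Rank operations by how often the user performed them on past
    occasions when the movement state was the same as the current one.

    history: list of (movement_state, operation) pairs.
    Returns a list of (operation, probability), highest probability first.
    """
    past_ops = [op for state, op in history if state == current_state]
    if not past_ops:
        return []  # no history for this movement state; nothing to predict
    total = len(past_ops)
    return [(op, count / total) for op, count in Counter(past_ops).most_common()]
```

  • for example, a history in which 4 of 10 operations in the matching state were "display route to home" yields a probability of 0.4 for that operation, and so on down the ranking.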
  • the display control unit 102 determines an operation button to be displayed and a display mode of the operation button based on a prediction result by the prediction unit 104.
  • the display mode of the operation buttons can include a display position, a display size, a display color, and the like.
  • for example, the display control unit 102 determines the display position of each operation button so that operation buttons corresponding to operations predicted by the prediction unit 104 with higher probability are positioned higher on the display screen. Likewise, the display control unit 102 determines the display size of each operation button so that buttons for higher-probability operations are larger, and determines the display color of each operation button so that buttons for higher-probability operations are darker.
  • the display control unit 102 displays the operation buttons on the display unit 101 in the determined display mode.
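  • one plausible way to turn the ranked predictions into a display mode is sketched below; the slot layout, size steps, and opacity formula are illustrative assumptions, not values taken from this document:

```python
def decide_display_modes(ranked_ops, max_buttons=3):
    """Assign a display position, size, and color intensity to each
    predicted operation, more prominent for higher probabilities.

    ranked_ops: list of (operation, probability), highest probability first,
                e.g. the output of predict_operations above.
    """
    buttons = []
    for rank, (operation, probability) in enumerate(ranked_ops[:max_buttons]):
        buttons.append({
            "label": operation,
            "slot": rank,                             # slot 0 is highest on the screen
            "scale": 1.0 - 0.2 * rank,                # higher probability -> larger button
            "opacity": min(1.0, 0.5 + probability),   # higher probability -> darker, less transparent
        })
    return buttons
```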
  • when the user touches an operation button displayed on the display unit 101, the display control device 100 receives the operation and performs the process corresponding to the touched operation button.
  • FIG. 2 is a flowchart illustrating an example of processing performed by the display control apparatus according to the present embodiment.
  • the display control apparatus 100 performs the process shown in FIG. 2 at a predetermined cycle (for example, every second) while running.
  • the display control apparatus 100 first acquires the current moving state of the moving body (step S201). Subsequently, the display control apparatus 100 causes the display unit 101 to display operation buttons based on the acquired current movement state and the user's operation history from past occasions when the movement state was the same as the current one (step S202), and ends the process shown in FIG. 2.
  • the display control apparatus 100 executes various processes corresponding to the operation button touched by the user.
  • the display control apparatus 100 causes the display unit 101 to display operation buttons based on the user's operation history when the movement state is the same as the current movement state in the past. For this reason, the display control apparatus 100 can display the operation button corresponding to the operation desired by the user on the display unit 101, and can simplify the user's operation input using the operation button. Thereby, the display control apparatus 100 can improve the user operability by the operation buttons displayed on the display unit 101.
  • the display control device 100 can display operation buttons corresponding to operations predicted to have a high probability of being performed by the user in a display mode that is conspicuous for the user. For this reason, the user can easily find an operation button corresponding to a desired operation on the display screen of the display unit 101. Therefore, the display control apparatus 100 can improve user operability by the operation buttons displayed on the display unit 101.
  • the display control apparatus 100 can also refrain from displaying operation buttons corresponding to operations predicted to have a low probability of being performed by the user, or can display them smaller than the operation buttons corresponding to operations predicted to have a high probability.
  • in this way, the display control apparatus 100 uses the limited display area of the display unit 101 efficiently and can show the moving state of the moving body (for example, a map around its current location) in a wider display area. Therefore, the display control apparatus 100 can make it easy for the user to see the moving state of the moving body displayed on the display unit 101.
  • the embodiment of the present invention described below is an example when the present invention is applied to a navigation device mounted on a vehicle.
  • the user of the navigation device of the present embodiment is simply abbreviated as “user”, and the vehicle equipped with the navigation device of the present embodiment is abbreviated as “own vehicle”.
  • FIG. 3 is a block diagram illustrating a hardware configuration of the navigation apparatus according to the present embodiment.
  • the navigation device 300 includes a CPU (Central Processing Unit) 301, a ROM (Read Only Memory) 302, a RAM (Random Access Memory) 303, a magnetic disk drive 304, a magnetic disk 305, an optical disk drive 306, an optical disk 307, an audio I / F (Interface) 308, a speaker 309, a microphone 310, an input device 311, a video I / F 312, a display 313, a camera 314, a communication I / F 315, a GPS unit 316, and various sensors 317.
  • the components 301 to 317 are each connected by a bus 320.
  • CPU 301 governs overall control of navigation device 300.
  • Various programs such as a boot program and an operation button display control program are recorded in the ROM 302. Note that these programs are not limited to the ROM 302, and may be recorded on a non-volatile recording medium such as the magnetic disk 305 or the optical disk 307.
  • the RAM 303 is used as a work area for the CPU 301.
  • the CPU 301 controls the entire navigation device 300 by executing various programs recorded in the ROM 302 and the like while using the RAM 303 as a work area. Note that the processing results obtained by executing various programs are temporarily recorded in the RAM 303, for example, and read out as necessary. Further, the above processing result may be recorded in a nonvolatile memory such as the magnetic disk 305 or the optical disk 307.
  • the magnetic disk drive 304 controls reading and writing of data with respect to the magnetic disk 305 according to the control of the CPU 301. Data written under the control of the magnetic disk drive 304 is recorded on the magnetic disk 305.
  • for example, an HD (Hard Disk) or an FD (Flexible Disk) can be used as the magnetic disk 305.
  • the optical disc drive 306 controls reading and writing of data with respect to the optical disc 307 according to the control of the CPU 301.
  • the optical disk 307 is a detachable recording medium from which data is read according to the control of the optical disk drive 306.
  • for example, a CD (Compact Disc) or a DVD can be used as the optical disk 307.
  • a writable recording medium can also be used as the optical disk 307.
  • other removable recording media include an MO (Magneto-Optical disk), a memory card, and the like.
  • examples of information recorded on these recording media include map data used for specifying the current location, route search, route guidance, and the like.
  • the map data includes road data, composed of nodes and links, representing roads on which a moving body (for example, a vehicle) can move, and image data in which features such as facilities and terrain (mountains, rivers, land) are drawn.
  • the map data may include character data indicating the name and address of the facility. An image represented by these data is drawn two-dimensionally or three-dimensionally on the display screen of the display 313.
  • for each link, the road data may include information indicating the attributes of the corresponding road, such as its length (distance), road width, and road type (highway, toll road, general road, or private road).
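  • for illustration, one link of such road data might be modeled as follows; the field names are assumptions for this sketch, not the document's data format:

```python
from dataclasses import dataclass

@dataclass
class Link:
    """One road segment between two nodes of the map data."""
    start_node: int
    end_node: int
    length_m: float   # length (distance) of the link
    width_m: float    # road width
    road_type: str    # "highway", "toll road", "general road", or "private road"

# A toll-road link; whether the vehicle is on a toll road can later be
# checked from the road_type of the link it is matched to.
example_link = Link(start_node=101, end_node=102,
                    length_m=850.0, width_m=7.5, road_type="toll road")
```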
  • the map data is recorded on the magnetic disk 305 or the optical disk 307.
  • the map data is not limited to the one provided integrally with the hardware of the navigation device 300, and may be provided outside the navigation device 300.
  • for example, the navigation apparatus 300 acquires map data from the outside via a network through the communication I / F 315, records the acquired map data in the RAM 303, the magnetic disk 305, or the like, and reads it out as necessary.
  • the audio I / F 308 is connected to a speaker 309 for audio output and a microphone 310 that converts the input audio into an audio signal.
  • the audio I / F 308 outputs a predetermined audio (for example, an alarm sound) from the speaker 309 or outputs an audio signal input from the microphone 310 to the CPU 301 under the control of the CPU 301.
  • Examples of the input device 311 include a remote controller having a plurality of keys for inputting characters, numerical values, and various instructions, as well as a keyboard, a mouse, a touch panel, and the like. The input device 311 inputs a signal corresponding to the key selected by the user into the apparatus.
  • a display 313 is connected to the video I / F 312.
  • the video I / F 312 includes, for example, a graphic controller that controls the entire display 313, a buffer memory such as a VRAM (Video RAM) that temporarily records immediately displayable image information, and a control IC that controls display on the display 313 based on image data output from the graphic controller.
  • the display 313 displays various data such as icons (for example, a host vehicle icon indicating the current location of the host vehicle), operation buttons for instructing the navigation device 300 to perform predetermined operations, cursors, menus, windows, and characters and images (for example, maps).
  • for example, a CRT, a TFT liquid crystal display, a plasma display, an organic EL display, or the like can be used as the display 313.
  • the video I / F 312 is configured such that the operation buttons are displayed in front of the map image on the display screen by using a hidden-surface removal method such as the Z-buffer method. Since the Z-buffer method is a known technique, a detailed description thereof is omitted here.
  • the camera 314 captures the inside of the host vehicle or the vicinity of the host vehicle, and inputs this shooting data to the video I / F 312.
  • the communication I / F 315 is wirelessly connected to a network and functions as an interface between the navigation device 300 and the CPU 301.
  • the communication I / F 315 is further wirelessly connected to a communication network such as the Internet, and also functions as an interface between the communication network and the CPU 301.
  • the communication I / F 315 receives a television broadcast or a radio broadcast.
  • Communication networks include LAN, WAN, public line network and mobile phone network.
  • the communication I / F 315 includes, for example, an FM tuner, a VICS/beacon receiver, and the like, and acquires road traffic information, such as congestion and traffic regulations, distributed from the VICS center.
  • VICS is a registered trademark.
  • the GPS unit 316 receives GPS signals (radio waves) from GPS satellites and measures the current position of the vehicle.
  • the current position measured by the GPS unit 316 is used when the CPU 301 specifies the current position of the vehicle together with output values of various sensors 317 described later.
  • Various sensors 317 output information for measuring the behavior of the host vehicle (or the navigation device 300).
  • the various sensors 317 include an acceleration sensor, an angular velocity sensor, a vehicle speed pulse sensor, and the like.
  • the output values of the various sensors 317 are used by the CPU 301 for specifying the current position of the vehicle (or the navigation device 300), detecting the acceleration, detecting the angular velocity, calculating the azimuth change amount, and the like.
  • the display unit 101 illustrated in FIG. 1 can realize its function by the input device 311, the video I / F 312, the display 313, and the like. Further, the display control unit 102 can realize its function by the CPU 301, the ROM 302, the RAM 303, and the like.
  • FIG. 4 is an explanatory diagram illustrating an example of operation history information according to the present embodiment.
  • the operation history information illustrated in FIG. 4 is stored in the magnetic disk 305 of the navigation device 300, for example.
  • the operation history information 400 stores a user's operation history for each predetermined movement state.
  • the predetermined movement states include a movement state α, a movement state β, a movement state γ, and a movement state δ.
  • the movement state ⁇ is a movement state when the own vehicle is separated from the user's home by a predetermined distance or more.
  • the navigation device 300 acquires information indicating the current location of the host vehicle at a predetermined cycle (for example, every second) while running. The navigation device 300 then determines that it is in the movement state α when the current location of the host vehicle is separated from the user's home by the predetermined distance or more.
  • the user's home position is set in the navigation device 300 by the user at a predetermined timing (for example, when the navigation device 300 is activated for the first time).
  • the moving state ⁇ is a moving state when the user leaves the home between 7:00 and 9:00 on weekdays.
  • the navigation device 300 acquires information indicating the current date and time and the current location of the host vehicle at startup. The navigation device 300 then determines that it is in the movement state β when the current date and time is between 7:00 and 9:00 on a weekday and the current location of the host vehicle is the user's home.
  • the moving state ⁇ is a moving state when the user leaves home from 17:00 to 19:00 on weekdays.
  • the navigation device 300 acquires information indicating the current date and time and the current location of the host vehicle at startup. The navigation device 300 then determines that it is in the movement state γ when the current date and time is between 17:00 and 19:00 on a weekday and the current location of the host vehicle is the user's home.
  • the movement state ⁇ is a movement state when the host vehicle is traveling on a toll road.
  • the navigation device 300 acquires information indicating the current location of the host vehicle at a predetermined cycle (for example, every second) while running. The navigation device 300 then determines that it is in the movement state δ when the current location of the host vehicle is a position on a road defined as a toll road in the map data.
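  • gathering the four conditions above, a movement-state classifier could look like the following sketch; the distance threshold value and the helper inputs (distance to home, toll-road flag) are assumptions, while the time windows and conditions are the ones stated above:

```python
from datetime import datetime

HOME_DISTANCE_THRESHOLD_M = 5_000  # the "predetermined distance"; value assumed

def classify_movement_states(now: datetime, at_home: bool,
                             distance_to_home_m: float,
                             on_toll_road: bool) -> set:
    """Return the set of predetermined movement states that currently apply."""
    states = set()
    weekday = now.weekday() < 5                        # Monday to Friday

    if distance_to_home_m >= HOME_DISTANCE_THRESHOLD_M:
        states.add("alpha")                            # away from home
    if weekday and 7 <= now.hour < 9 and at_home:
        states.add("beta")                             # weekday morning departure
    if weekday and 17 <= now.hour < 19 and at_home:
        states.add("gamma")                            # weekday evening departure
    if on_toll_road:
        states.add("delta")                            # traveling on a toll road
    return states
```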
  • for example, the user's last ten operations are stored in association with the movement state α.
  • Reading from the top of FIG. 4, these operations are: "display route to home", "display route to home", "search for nearby restaurants", "display route to home", "search for nearby restaurants", "display route to home", "search for nearby restaurants", "search for nearby convenience stores", "search for nearby toilets", and "search for nearby convenience stores".
  • the probability of each operation by the user in the movement state α is therefore 4/10 for "display route to home", 3/10 for "search for nearby restaurants", 2/10 for "search for nearby convenience stores", and 1/10 for "search for nearby toilets". From these probabilities, the navigation apparatus 300 can predict that, in the movement state α, the operation the user performs with the highest probability is "display route to home", followed by "search for nearby restaurants" and then "search for nearby convenience stores".
  • in this way, the navigation apparatus 300 can predict which operations the user is likely, and unlikely, to perform in each movement state.
  • the operation history information 400 may store information indicating the date and time when the operation is performed for each operation.
  • the navigation apparatus 300 can sequentially overwrite the oldest operation history with new operation history.
  • in this way, the navigation device 300 can prevent the amount of data in the operation history information 400 from growing too large.
  • the navigation apparatus 300 can also avoid predicting the user's operations from stale history, and can instead make predictions that reflect the user's current behavior.
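  • this overwrite-oldest behavior amounts to a fixed-size ring buffer per movement state; a minimal sketch is shown below (the capacity of ten follows the example of FIG. 4):

```python
from collections import defaultdict, deque

HISTORY_CAPACITY = 10  # matches the ten operations stored per state in FIG. 4

# One bounded history per movement state; appending to a full deque silently
# discards the oldest entry, i.e. old operations are overwritten by new ones.
operation_history = defaultdict(lambda: deque(maxlen=HISTORY_CAPACITY))

def record_operation(state: str, operation: str) -> None:
    operation_history[state].append(operation)
```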
  • FIG. 5 is an explanatory diagram (part 1) illustrating an example of processing performed by the navigation device according to the present embodiment.
  • the process illustrated in FIG. 5 is a process performed by the navigation device 300 to store the operation history information 400.
  • the navigation device 300 performs the process shown in FIG. 5 at a predetermined cycle (for example, every second) while running.
  • the navigation apparatus 300 first determines whether or not there has been an operation from the user (step S501). If there has been no operation from the user (step S501: No), the navigation apparatus 300 ends the process shown in FIG. 5. If there has been an operation from the user (step S501: Yes), the navigation apparatus 300 acquires the movement state by acquiring information indicating the current date and time and the current location of the host vehicle (step S502).
  • the navigation device 300 determines whether or not the acquired movement state is a predetermined movement state (step S503).
  • the navigation apparatus 300 determines whether the moving state is the moving state ⁇ , the moving state ⁇ , the moving state ⁇ , or the moving state ⁇ described above.
  • the determination conditions for determining the movement state ⁇ , the movement state ⁇ , the movement state ⁇ , and the movement state ⁇ are as described above.
  • if the navigation device 300 determines that it is in a predetermined movement state (step S503: Yes), it stores the movement state in the operation history information 400 in association with the user operation detected in step S501 (step S504), and ends the process shown in FIG. 5. If it determines in step S503 that it is not in a predetermined movement state (step S503: No), the navigation device 300 ends the process shown in FIG. 5 without storing anything.
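  • steps S501 to S504 can be summarized as the following periodic routine, a sketch building on record_operation above; get_user_operation and acquire_movement_states are assumed helpers, not functions disclosed in this document:

```python
def history_storing_cycle():
    """One iteration of the FIG. 5 process, run at a predetermined cycle."""
    operation = get_user_operation()        # step S501: pending user operation, or None
    if operation is None:
        return                              # step S501: No -> end this cycle
    states = acquire_movement_states()      # step S502: from current date/time and location
    for state in states:                    # step S503: any predetermined state(s)?
        record_operation(state, operation)  # step S504: store state with the operation
```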
  • FIG. 6 is an explanatory diagram (part 2) illustrating an example of processing performed by the navigation device of the present embodiment.
  • the process illustrated in FIG. 6 is a process performed by the navigation device 300 to display operation buttons based on the user operation history on the display 313.
  • the navigation apparatus 300 performs the process shown in FIG. 6 at a predetermined cycle (for example, every second) while running.
  • the navigation device 300 first acquires the movement state by acquiring information indicating the current date and time and the current location of the host vehicle (step S601). Subsequently, the navigation device 300 determines whether or not the acquired movement state is a predetermined movement state (step S602). In step S602, the navigation apparatus 300 determines whether the acquired movement state is the movement state α, the movement state β, the movement state γ, or the movement state δ described above.
  • the determination conditions for determining the movement state ⁇ , the movement state ⁇ , the movement state ⁇ , and the movement state ⁇ are as described above.
  • if the navigation device 300 determines that it is not in a predetermined movement state (step S602: No), it ends the process shown in FIG. 6. On the other hand, if it determines that it is in a predetermined movement state (step S602: Yes), the navigation device 300 refers to the operation history information 400 and calculates the probability of each operation performed by the user on past occasions when the movement state was the same as the current one (step S603). The method of calculating this probability is as described above.
  • the navigation device 300 determines an operation button to be displayed on the display 313 based on the calculated probability of each operation (step S604).
  • the navigation device 300 determines to display operation buttons corresponding to a predetermined number (for example, three) of operations in descending order of the calculated probability.
  • the navigation device 300 then determines the display position, display size, and display color of each operation button to be displayed on the display 313 based on the calculated probability of each operation (step S605). For example, in step S605, the navigation device 300 determines the display position of each operation button so that the operation buttons are arranged on the display screen in descending order of calculated probability.
  • likewise, the navigation device 300 determines the display size of each operation button so that buttons for operations with higher calculated probabilities are larger. Furthermore, the navigation device 300 determines the display color of each operation button so that buttons with higher calculated probabilities are easier for the user to see (for example, a darker color). In addition, the navigation device 300 may determine the transmittance of each operation button so that it is easy for the user to see; in this case, for example, the navigation device 300 sets a lower transmittance for operation buttons with higher calculated probabilities.
  • the navigation device 300 then displays the operation buttons determined in step S604 on the display 313 in the display position, display size, and display color determined in step S605 (step S606), and ends the process shown in FIG. 6.
  • when the user touches an operation button displayed on the display 313, the navigation device 300 receives the operation and executes the process corresponding to the touched operation button. For example, when the operation button "display route to home" is touched by the user, the navigation device 300 searches for a route from the current location of the host vehicle to the user's home and displays the found route on the display 313.
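  • the dispatch from a touched button to its process can be expressed as a handler table; this sketch is illustrative, and the handler bodies are stubs rather than the device's actual processing:

```python
def display_route_to_home():
    """Search for a route from the current location to home and display it."""

def search_nearby_restaurants():
    """Search for restaurants around the current location and display them."""

OPERATION_HANDLERS = {
    "display route to home": display_route_to_home,
    "search for nearby restaurants": search_nearby_restaurants,
}

def on_button_touched(label: str) -> None:
    handler = OPERATION_HANDLERS.get(label)
    if handler is not None:
        handler()  # execute the process corresponding to the touched button
```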
  • FIG. 7 is an explanatory diagram showing a display example of operation buttons by the navigation device of the present embodiment.
  • as an example of the present embodiment, FIG. 7 shows the display screen of the display 313 when the host vehicle is in the movement state α.
  • the navigation device 300 displays a map 700 around the current location of the host vehicle on the display 313.
  • the navigation apparatus 300 displays a host vehicle icon 701 for indicating the current location of the host vehicle at a position corresponding to the current location of the host vehicle in the map 700.
  • the navigation device 300 can also display a home icon indicating the user's home position, in the same manner as the host vehicle icon 701.
  • in FIG. 7, however, the home icon is not displayed on the display 313 because the host vehicle is in the movement state α, that is, the host vehicle (host vehicle icon 701) is more than the predetermined distance away from the user's home.
  • the navigation device 300 displays the operation buttons on the display 313 based on the operation history information 400 when in the movement state ⁇ .
  • here, assume that, in the operation history information 400, the probability of each operation by the user in the movement state α is 4/10 for "display route to home", 3/10 for "search for nearby restaurants", 2/10 for "search for nearby convenience stores", and 1/10 for "search for nearby toilets".
  • since the navigation device 300 determines to display the operation buttons for the three operations with the highest calculated probabilities, the buttons for "display route to home", "search for nearby restaurants", and "search for nearby convenience stores" are determined as the operation buttons to be displayed. These operation buttons 711 to 713 are therefore displayed on the display 313 as shown in FIG. 7.
  • the navigation apparatus 300 may instead determine to display only the operation button with the highest calculated probability. Further, the navigation device 300 may determine to display operation buttons only for operations whose calculated probability is equal to or higher than a predetermined threshold. The fewer operation buttons are displayed, the smaller the display area of the display 313 occupied by them.
  • in FIG. 7, it can be seen that the operation button 711 for "display route to home", the operation button 712 for "search for nearby restaurants", and the operation button 713 for "search for nearby convenience stores" are arranged on the display screen in descending order of calculated probability.
  • likewise, the operation button 711 for "display route to home" is the largest, followed by the operation button 712 for "search for nearby restaurants", and the operation button 713 for "search for nearby convenience stores" is the smallest.
  • further, the navigation device 300 displays operation buttons for higher-probability operations in darker display colors so that they are easier for the user to see.
  • as described above, the navigation apparatus 300 predicts a desired operation based on the operations performed by the user on past occasions when the movement state was the same as the current one, and can display operation buttons corresponding to this operation on the display 313. Thereby, the navigation apparatus 300 can simplify the user's operation input using the operation buttons displayed on the display 313.
  • the navigation device 300 predicts a desired operation based on an operation performed by the user when the movement state is the same as the current movement state in the past, and provides an operation button corresponding to the operation to the user. It can be displayed on the display 313 so as to be conspicuous. For this reason, the user can easily find an operation button corresponding to a desired operation on the display screen of the display 313. Therefore, the navigation apparatus 300 can improve the operability of the user by the operation buttons displayed on the display 313.
  • the navigation device 300 can display only the operation buttons predicted to have a high probability of being performed by the user on the display 313, and the display area of the display 313 is occupied by the operation buttons predicted to be less necessary for the user. Can be prevented. Thereby, the navigation apparatus 300 can use the display area of the display 313 efficiently. In other words, the navigation device 300 can reduce the area of the portion shielded by the operation button in the map displayed on the display 313, and can show a wider range of the map to the user, improving the convenience for the user. Can do.
  • according to the navigation device 300, both the visibility of the moving state of the moving body displayed on the display unit and the operability of operations using the operation buttons displayed on the display unit can be achieved, which improves user convenience.
  • the display control method described in this embodiment can be realized by executing a program prepared in advance on a computer such as a personal computer, a workstation, or a smartphone.
  • This program is recorded on a computer-readable recording medium such as a hard disk, a memory card, a CD-ROM, and a DVD, and is executed by being read from the recording medium by the computer.
  • this program may be distributed via a network such as the Internet.

Abstract

A display control device (100) is provided with a display unit (101) and a display control unit (102). The display unit (101) displays the movement state of a moving body and also serves as an operation receiving means that accepts user operations. The display unit (101) is controlled by the display control unit (102). Based on the current movement state of the moving body and the history of user operations from past occasions on which the movement state was the same as the current one, the display control unit (102) causes the display unit (101) to display operation buttons for accepting subsequent user operations.
PCT/JP2013/060388 2013-04-04 2013-04-04 Dispositif de commande d'affichage, procédé de commande d'affichage, programme de commande d'affichage, et support d'enregistrement lisible par ordinateur WO2014162570A1 (fr)

Priority Applications (2)

Application Number Priority Date Filing Date Title
JP2015509819A JP6023874B2 (ja) 2013-04-04 2013-04-04 表示制御装置、表示制御方法、表示制御プログラムおよびコンピュータが読み取り可能な記録媒体
PCT/JP2013/060388 WO2014162570A1 (fr) 2013-04-04 2013-04-04 Dispositif de commande d'affichage, procédé de commande d'affichage, programme de commande d'affichage, et support d'enregistrement lisible par ordinateur

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
PCT/JP2013/060388 WO2014162570A1 (fr) 2013-04-04 2013-04-04 Dispositif de commande d'affichage, procédé de commande d'affichage, programme de commande d'affichage, et support d'enregistrement lisible par ordinateur

Publications (1)

Publication Number Publication Date
WO2014162570A1 (fr)

Family

ID=51657899

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2013/060388 WO2014162570A1 (fr) 2013-04-04 2013-04-04 Dispositif de commande d'affichage, procédé de commande d'affichage, programme de commande d'affichage, et support d'enregistrement lisible par ordinateur

Country Status (2)

Country Link
JP (1) JP6023874B2 (fr)
WO (1) WO2014162570A1 (fr)


Family Cites Families (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP5280778B2 (ja) * 2008-09-12 2013-09-04 富士通テン株式会社 情報処理装置、画像処理装置、及び情報処理方法

Patent Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2005315682A (ja) * 2004-04-28 2005-11-10 Nissan Motor Co Ltd 情報提供システム、情報センタ、車載端末および情報提供方法
JP2007062559A (ja) * 2005-08-31 2007-03-15 Denso Corp 車載スイッチシステム
JP2007102442A (ja) * 2005-10-04 2007-04-19 Xanavi Informatics Corp タッチパネル表示装置およびタッチパネル表示装置を備えたナビゲーション装置

Cited By (79)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US11671920B2 (en) 2007-04-03 2023-06-06 Apple Inc. Method and system for operating a multifunction portable electronic device using voice-activation
US11348582B2 (en) 2008-10-02 2022-05-31 Apple Inc. Electronic devices with voice command and contextual data processing capabilities
US11900936B2 (en) 2008-10-02 2024-02-13 Apple Inc. Electronic devices with voice command and contextual data processing capabilities
US11423886B2 (en) 2010-01-18 2022-08-23 Apple Inc. Task flow identification based on user intent
US11120372B2 (en) 2011-06-03 2021-09-14 Apple Inc. Performing actions associated with task items that represent tasks to perform
US11321116B2 (en) 2012-05-15 2022-05-03 Apple Inc. Systems and methods for integrating third party services with a digital assistant
US10978090B2 (en) 2013-02-07 2021-04-13 Apple Inc. Voice trigger for a digital assistant
US11636869B2 (en) 2013-02-07 2023-04-25 Apple Inc. Voice trigger for a digital assistant
US11557310B2 (en) 2013-02-07 2023-01-17 Apple Inc. Voice trigger for a digital assistant
US11862186B2 (en) 2013-02-07 2024-01-02 Apple Inc. Voice trigger for a digital assistant
US11388291B2 (en) 2013-03-14 2022-07-12 Apple Inc. System and method for processing voicemail
US11798547B2 (en) 2013-03-15 2023-10-24 Apple Inc. Voice activated device for use with a voice-based digital assistant
US11727219B2 (en) 2013-06-09 2023-08-15 Apple Inc. System and method for inferring user intent from speech inputs
US11810562B2 (en) 2014-05-30 2023-11-07 Apple Inc. Reducing the need for manual start/end-pointing and trigger phrases
US11670289B2 (en) 2014-05-30 2023-06-06 Apple Inc. Multi-command single utterance input method
US11133008B2 (en) 2014-05-30 2021-09-28 Apple Inc. Reducing the need for manual start/end-pointing and trigger phrases
US11257504B2 (en) 2014-05-30 2022-02-22 Apple Inc. Intelligent assistant for home automation
US11699448B2 (en) 2014-05-30 2023-07-11 Apple Inc. Intelligent assistant for home automation
US11838579B2 (en) 2014-06-30 2023-12-05 Apple Inc. Intelligent automated assistant for TV user interactions
US11516537B2 (en) 2014-06-30 2022-11-29 Apple Inc. Intelligent automated assistant for TV user interactions
US11842734B2 (en) 2015-03-08 2023-12-12 Apple Inc. Virtual assistant activation
US11087759B2 (en) 2015-03-08 2021-08-10 Apple Inc. Virtual assistant activation
US10827330B2 (en) 2015-05-27 2020-11-03 Apple Inc. Systems and methods for proactively identifying and surfacing relevant content on an electronic device with a touch-sensitive display
US11070949B2 (en) 2015-05-27 2021-07-20 Apple Inc. Systems and methods for proactively identifying and surfacing relevant content on an electronic device with a touch-sensitive display
JP2019057290A (ja) * 2015-05-27 2019-04-11 アップル インコーポレイテッドApple Inc. 関連コンテンツを先見的に特定し、タッチ感知デバイス上に表面化させるためのシステム及び方法
US11947873B2 (en) 2015-06-29 2024-04-02 Apple Inc. Virtual assistant for media playback
US11500672B2 (en) 2015-09-08 2022-11-15 Apple Inc. Distributed personal assistant
US11809483B2 (en) 2015-09-08 2023-11-07 Apple Inc. Intelligent automated assistant for media search and playback
US11853536B2 (en) 2015-09-08 2023-12-26 Apple Inc. Intelligent automated assistant in a media environment
US11550542B2 (en) 2015-09-08 2023-01-10 Apple Inc. Zero latency digital assistant
US11126400B2 (en) 2015-09-08 2021-09-21 Apple Inc. Zero latency digital assistant
US11954405B2 (en) 2015-09-08 2024-04-09 Apple Inc. Zero latency digital assistant
US11809886B2 (en) 2015-11-06 2023-11-07 Apple Inc. Intelligent automated assistant in a messaging environment
US11526368B2 (en) 2015-11-06 2022-12-13 Apple Inc. Intelligent automated assistant in a messaging environment
US11886805B2 (en) 2015-11-09 2024-01-30 Apple Inc. Unconventional virtual assistant interactions
US11853647B2 (en) 2015-12-23 2023-12-26 Apple Inc. Proactive assistance based on dialog communication between devices
CN105930061A (zh) * 2016-05-20 2016-09-07 深圳天珑无线科技有限公司 一种功能键的屏蔽方法和终端
US11657820B2 (en) 2016-06-10 2023-05-23 Apple Inc. Intelligent digital assistant in a multi-tasking environment
US11037565B2 (en) 2016-06-10 2021-06-15 Apple Inc. Intelligent digital assistant in a multi-tasking environment
US11152002B2 (en) 2016-06-11 2021-10-19 Apple Inc. Application integration with a digital assistant
US11809783B2 (en) 2016-06-11 2023-11-07 Apple Inc. Intelligent device arbitration and control
US11749275B2 (en) 2016-06-11 2023-09-05 Apple Inc. Application integration with a digital assistant
US11467802B2 (en) 2017-05-11 2022-10-11 Apple Inc. Maintaining privacy of personal information
US11599331B2 (en) 2017-05-11 2023-03-07 Apple Inc. Maintaining privacy of personal information
US11580990B2 (en) 2017-05-12 2023-02-14 Apple Inc. User-specific acoustic models
US11862151B2 (en) 2017-05-12 2024-01-02 Apple Inc. Low-latency intelligent automated assistant
US11380310B2 (en) 2017-05-12 2022-07-05 Apple Inc. Low-latency intelligent automated assistant
US11538469B2 (en) 2017-05-12 2022-12-27 Apple Inc. Low-latency intelligent automated assistant
US11837237B2 (en) 2017-05-12 2023-12-05 Apple Inc. User-specific acoustic models
US11405466B2 (en) 2017-05-12 2022-08-02 Apple Inc. Synchronization and task delegation of a digital assistant
US11532306B2 (en) 2017-05-16 2022-12-20 Apple Inc. Detecting a trigger of a digital assistant
US11675829B2 (en) 2017-05-16 2023-06-13 Apple Inc. Intelligent automated assistant for media exploration
US11710482B2 (en) 2018-03-26 2023-07-25 Apple Inc. Natural assistant interaction
US11900923B2 (en) 2018-05-07 2024-02-13 Apple Inc. Intelligent automated assistant for delivering content from user experiences
US11854539B2 (en) 2018-05-07 2023-12-26 Apple Inc. Intelligent automated assistant for delivering content from user experiences
US11169616B2 (en) 2018-05-07 2021-11-09 Apple Inc. Raise to speak
US11907436B2 (en) 2018-05-07 2024-02-20 Apple Inc. Raise to speak
US11487364B2 (en) 2018-05-07 2022-11-01 Apple Inc. Raise to speak
US10984798B2 (en) 2018-06-01 2021-04-20 Apple Inc. Voice interaction at a primary device to access call functionality of a companion device
US11009970B2 (en) 2018-06-01 2021-05-18 Apple Inc. Attention aware virtual assistant dismissal
US11431642B2 (en) 2018-06-01 2022-08-30 Apple Inc. Variable latency device coordination
US11630525B2 (en) 2018-06-01 2023-04-18 Apple Inc. Attention aware virtual assistant dismissal
US11360577B2 (en) 2018-06-01 2022-06-14 Apple Inc. Attention aware virtual assistant dismissal
US11893992B2 (en) 2018-09-28 2024-02-06 Apple Inc. Multi-modal inputs for voice commands
US11783815B2 (en) 2019-03-18 2023-10-10 Apple Inc. Multimodality in digital assistant systems
US11705130B2 (en) 2019-05-06 2023-07-18 Apple Inc. Spoken notifications
US11675491B2 (en) 2019-05-06 2023-06-13 Apple Inc. User configurable task triggers
US11888791B2 (en) 2019-05-21 2024-01-30 Apple Inc. Providing message response suggestions
US11657813B2 (en) 2019-05-31 2023-05-23 Apple Inc. Voice identification in digital assistant systems
US11237797B2 (en) 2019-05-31 2022-02-01 Apple Inc. User activity shortcut suggestions
US11790914B2 (en) 2019-06-01 2023-10-17 Apple Inc. Methods and user interfaces for voice-based control of electronic devices
WO2021157126A1 (fr) * 2020-02-05 2021-08-12 国立大学法人九州工業大学 Programme d'aide à l'entrée de données, dispositif d'aide à l'entrée de données et procédé d'aide à l'entrée de données
US11914848B2 (en) 2020-05-11 2024-02-27 Apple Inc. Providing relevant data items based on context
US11924254B2 (en) 2020-05-11 2024-03-05 Apple Inc. Digital assistant hardware abstraction
US11765209B2 (en) 2020-05-11 2023-09-19 Apple Inc. Digital assistant hardware abstraction
US11755276B2 (en) 2020-05-12 2023-09-12 Apple Inc. Reducing description length based on confidence
US11838734B2 (en) 2020-07-20 2023-12-05 Apple Inc. Multi-device audio adjustment coordination
US11696060B2 (en) 2020-07-21 2023-07-04 Apple Inc. User identification using headphones
US11750962B2 (en) 2020-07-21 2023-09-05 Apple Inc. User identification using headphones

Also Published As

Publication number Publication date
JP6023874B2 (ja) 2016-11-09
JPWO2014162570A1 (ja) 2017-02-16

Similar Documents

Publication Publication Date Title
JP6023874B2 (ja) 表示制御装置、表示制御方法、表示制御プログラムおよびコンピュータが読み取り可能な記録媒体
EP4258239A2 (fr) Notifications de circulation pendant la navigation
JP4821527B2 (ja) 情報機器
JP5605186B2 (ja) 制御装置、制御装置の制御方法及びコンピュータプログラム
JP5742314B2 (ja) 画像表示システム、画像表示装置、画像表示方法及びコンピュータプログラム
JP2009134105A (ja) 表示装置、表示制御方法、表示制御プログラム、および記録媒体
JP2011192231A (ja) 車載入力装置及び車載入力装置用入力プログラム
JP4276292B2 (ja) 音声案内装置、音声案内方法、音声案内プログラム、および記録媒体
CN101813487B (zh) 导航装置以及导航装置的显示方法
JP4575491B2 (ja) ナビゲーション装置及びナビゲーション方法
JP2009222409A (ja) 情報出力装置、情報出力方法、情報出力プログラムおよび記録媒体
JP2007232390A (ja) 情報機器、案内情報提供方法及びプログラム
JP4531819B2 (ja) ナビゲーション装置、処理制御方法、処理制御プログラムおよびコンピュータに読み取り可能な記録媒体
WO2007105500A1 (fr) Dispositif, procede et programme de navigation et support d'enregistrement correspondant lisible sur ordinateur
JP5059572B2 (ja) 情報通知装置、情報通知方法、情報通知プログラム、および記録媒体
JP4603621B2 (ja) 経路誘導装置、経路誘導方法、経路誘導プログラムおよび記録媒体
JP6333446B2 (ja) 地図表示装置、地図表示方法、及び地図表示プログラム
JP2009156697A (ja) 経路探索装置、経路探索方法、経路探索プログラム、および記録媒体
JP2009115718A (ja) ナビゲーション装置、ナビゲーション方法、ナビゲーションプログラム、および記録媒体
JP2008160445A (ja) 放送波情報表示装置、放送波情報表示方法、放送波情報表示プログラム、および記録媒体
JP2008064480A (ja) ナビゲーション装置、方法及びプログラム
JP5705946B2 (ja) 地図表示装置、地図表示方法、及び地図表示プログラム
JP2021192063A (ja) 地図表示装置、地図表示方法、及び地図表示プログラム
JP5405611B2 (ja) 地図表示装置、地図表示方法、及び地図表示プログラム
JPWO2015151154A1 (ja) 表示制御装置、表示制御方法および表示制御プログラム

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 13881240

Country of ref document: EP

Kind code of ref document: A1

ENP Entry into the national phase

Ref document number: 2015509819

Country of ref document: JP

Kind code of ref document: A

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 13881240

Country of ref document: EP

Kind code of ref document: A1