US20190250776A1 - Vehicular display apparatus - Google Patents
Vehicular display apparatus
- Publication number
- US20190250776A1 (U.S. application Ser. No. 16/394,403)
- Authority
- US
- United States
- Prior art keywords
- operation screen
- pointing device
- section
- grains
- object group
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Abandoned
Classifications
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/017—Gesture based interaction, e.g. based on a set of recognized hand gestures
- G06F3/03—Arrangements for converting the position or the displacement of a member into a coded form
- G06F3/033—Pointing devices displaced or positioned by the user, e.g. mice, trackballs, pens or joysticks; Accessories therefor
- G06F3/0354—Pointing devices displaced or positioned by the user, e.g. mice, trackballs, pens or joysticks; Accessories therefor with detection of 2D relative movements between the device, or an operating part thereof, and a plane or surface, e.g. 2D mice, trackballs, pens or pucks
- G06F3/03547—Touch pads, in which fingers can move on a surface
- G06F3/038—Control and interface arrangements therefor, e.g. drivers or device-embedded control circuitry
- G06F3/0383—Signal control means within the pointing device
- G06F3/041—Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means
- G06F3/0412—Digitisers structurally integrated in a display
- G06F3/048—Interaction techniques based on graphical user interfaces [GUI]
- G06F3/0481—Interaction techniques based on graphical user interfaces [GUI] based on specific properties of the displayed interaction object or a metaphor-based environment, e.g. interaction with desktop elements like windows or icons, or assisted by a cursor's changing behaviour or appearance
- G06F3/04815—Interaction with a metaphor-based environment or interaction object displayed as three-dimensional, e.g. changing the user viewpoint with respect to the environment or object
- G06F3/04817—Interaction techniques based on graphical user interfaces [GUI] based on specific properties of the displayed interaction object or a metaphor-based environment, e.g. interaction with desktop elements like windows or icons, or assisted by a cursor's changing behaviour or appearance using icons
- G06F3/0482—Interaction with lists of selectable items, e.g. menus
- G06F3/0484—Interaction techniques based on graphical user interfaces [GUI] for the control of specific functions or operations, e.g. selecting or manipulating an object, an image or a displayed text element, setting a parameter value or selecting a range
- G06F3/04842—Selection of displayed objects or displayed text elements
- G06F3/0485—Scrolling or panning
- G06F3/0487—Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
- G06F3/0488—Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures
- G06F3/04886—Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures by partitioning the display area of the touch-screen or the surface of the digitising tablet into independently controllable areas, e.g. virtual keyboards or menus
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T11/00—2D [Two Dimensional] image generation
- G06T11/60—Editing figures and text; Combining figures or text
- B—PERFORMING OPERATIONS; TRANSPORTING
- B60—VEHICLES IN GENERAL
- B60K—ARRANGEMENT OR MOUNTING OF PROPULSION UNITS OR OF TRANSMISSIONS IN VEHICLES; ARRANGEMENT OR MOUNTING OF PLURAL DIVERSE PRIME-MOVERS IN VEHICLES; AUXILIARY DRIVES FOR VEHICLES; INSTRUMENTATION OR DASHBOARDS FOR VEHICLES; ARRANGEMENTS IN CONNECTION WITH COOLING, AIR INTAKE, GAS EXHAUST OR FUEL SUPPLY OF PROPULSION UNITS IN VEHICLES
- B60K35/00—Instruments specially adapted for vehicles; Arrangement of instruments in or on vehicles
- B60K35/10—Input arrangements, i.e. from user to vehicle, associated with vehicle functions or specially adapted therefor
- B60K35/60—Instruments characterised by their location or relative disposition in or on vehicles
- B60K37/02
- B60K2360/00—Indexing scheme associated with groups B60K35/00 or B60K37/00 relating to details of instruments or dashboards
- B60K2360/11—Instrument graphical user interfaces or menu aspects
- B60K2360/111—Instrument graphical user interfaces or menu aspects for controlling multiple devices
- B60K2360/115—Selection of menu items
- B60K2360/143—Touch sensitive instrument input devices
- B60K2360/1438—Touch screens
- B60K2360/1442—Emulation of input devices
- B60K2370/143
- B60K2370/1442
Definitions
- the disclosure relates to a vehicular display apparatus that displays an operation screen.
- A vehicular display apparatus has been known that displays a screen (an operation screen) for performing various operations of in-vehicle equipment by using a pointing device that recognizes the position of a user's finger (hereinafter referred to as an "operation finger") on an operation section (see, for example, Japanese Patent Application Publication No. 2004-345549 (JP 2004-345549 A)).
- JP 2004-345549 A discloses a technique of recognizing the position of the operation finger on the operation section and moving a cursor that selects an icon on the operation screen on the basis of position information of the operation finger when the user contacts the operation section of the pointing device and performs a slide operation of a specified amount or larger.
- If the operation finger leaves the operation section even slightly, due to a vibration or the like of a vehicle, there is a possibility that the pointing device can no longer recognize the operation finger.
- In that case, the user cannot determine whether the pointing device recognizes the operation finger (and the operation screen can thus be operated) or does not recognize the operation finger (and the operation screen thus cannot be operated). That is, unless the user checks whether the cursor can be moved or whether a list or the like can be scrolled, the user cannot determine whether the pointing device can operate the operation screen.
- the disclosure provides a vehicular display apparatus that allows a user to easily determine whether a pointing device can operate an operation screen.
- a vehicular display apparatus includes: a display section; a pointing device that includes an operation section and a recognition section that recognizes an operation finger operating the operation section; and a display control section that is configured to display an operation screen on the display section, the operation screen including a plurality of selection targets, a selection operation of which is able to be performed by using the pointing device.
- the display control section is configured to: generate at least one of an image in which a cursor for selecting one selection target from the plurality of selection targets is moved and an image in which the plurality of selection targets that are arranged along a specified axis on the operation screen are scrolled in accordance with a position of the operation finger, which is recognized by the recognition section, on the operation section; generate an image that includes one of an object and an object group across a specified area on the operation screen; and execute processing of setting a part of the one of the object and the object group in a different display mode from the other part of the one of the object and the object group when the operation finger is recognized by the recognition section.
- the part of the object or the object group displayed on the operation screen is in a different display mode (for example, differing in color, luminance, shape, size, displacement, or the like) from the other part. Accordingly, the user visually recognizes the presence or absence of a partial change in the display mode of the object or the object group on the operation screen, and thus can easily determine whether the operation screen can be operated by the pointing device.
- the display control section may be configured to generate an image in which the part in the different display mode of the one of the object and the object group is moved along a moving direction of the operation finger when the operation finger, which is recognized by the recognition section, is moved with respect to the operation section.
- the part in the different display mode of the object or the object group is moved along the moving direction of the operation finger.
- the operation finger possibly leaves the operation section due to a vibration of a vehicle or the like while the operation finger is moved.
- the user visually recognizes whether the part in the different display mode is moved along the moving direction of the operation finger, and thus can easily determine whether a moving operation of the cursor or a scrolling operation of a list or the like by the pointing device can appropriately be continued.
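As a hedged illustration of the behavior above, the sketch below models a one-dimensional row of background grains whose emphasized part tracks the finger's position, so a movement of the finger moves the emphasized part in the same direction. The grain spacing, the highlight radius, and the function name are assumptions, not the patent's implementation.

```python
# Sketch: a row of background "grains"; the grains near the finger's
# x-position are drawn in a different display mode, so the emphasized
# part moves along the finger's moving direction.

GRAIN_XS = list(range(0, 100, 10))  # assumed x-positions of grains
HIGHLIGHT_RADIUS = 15               # assumed emphasis radius


def highlighted_grains(finger_x):
    """Return indices of grains drawn in the different display mode."""
    return [i for i, x in enumerate(GRAIN_XS)
            if abs(x - finger_x) <= HIGHLIGHT_RADIUS]
```

As the finger's x-position increases, the returned indices shift toward larger values, mirroring the finger's moving direction; if the finger is lost, no grains are emphasized and the user sees the change immediately.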
- the display control section may be configured to generate the operation screen by composing a foreground image that includes the plurality of selection targets and a background image that includes one of the object and the object group.
- the object or the object group is displayed as a background of the plurality of selection targets (a plurality of icons or the list configured by including a plurality of items) or the like that are displayed as a foreground.
- In this way, degradation in the visibility of the plurality of selection targets, the cursor indicative of the selection status of the plurality of selection targets, or the like, which would result from a change in the display mode of the object or the object group, can be suppressed.
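The foreground/background composition described above can be sketched as a simple overlay in which opaque foreground pixels (icons, cursor) take precedence and transparent ones reveal the background object group. The pixel representation and the `None`-means-transparent convention are assumptions made purely for illustration.

```python
# Sketch: compose an operation screen from a foreground (icons, cursor)
# over a background (object or object group). Pixels are (R, G, B)
# tuples; None in the foreground marks a fully transparent pixel.

def compose(foreground, background):
    """Overlay foreground pixels on background; None means transparent."""
    return [fg if fg is not None else bg
            for fg, bg in zip(foreground, background)]
```

Because the object group lives entirely in the background layer, changing its display mode never overwrites a foreground pixel, which is the visibility property the paragraph above describes.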
- the disclosure can provide the vehicular display apparatus that allows the user to easily determine whether the operation screen can be operated by the pointing device.
- FIG. 1 is a block diagram that schematically shows one example of a configuration of a display apparatus.
- FIG. 2 is a view of one example of an in-vehicle mode of the display apparatus (a pointing device and a display).
- FIG. 3A is a view of one example of a foreground image by a display apparatus according to a first embodiment.
- FIG. 3B is a view of one example of a background image by the display apparatus according to the first embodiment.
- FIG. 3C is a view of one example of an operation screen by the display apparatus according to the first embodiment.
- FIG. 4A is a view of the one example of the foreground image by the display apparatus according to the first embodiment.
- FIG. 4B is a view of the one example of the background image by the display apparatus according to the first embodiment.
- FIG. 4C is a view of the one example of the operation screen by the display apparatus according to the first embodiment.
- FIG. 5A is a view of the one example of the foreground image by the display apparatus according to the first embodiment.
- FIG. 5B is a view of the one example of the background image by the display apparatus according to the first embodiment.
- FIG. 5C is a view of the one example of the operation screen by the display apparatus according to the first embodiment.
- FIG. 6A is a view of another example of the foreground image by the display apparatus according to the first embodiment.
- FIG. 6B is a view of another example of the background image by the display apparatus according to the first embodiment.
- FIG. 6C is a view of another example of the operation screen by the display apparatus according to the first embodiment.
- FIG. 7A is a view of another example of the foreground image by the display apparatus according to the first embodiment.
- FIG. 7B is a view of another example of the background image by the display apparatus according to the first embodiment.
- FIG. 7C is a view of another example of the operation screen by the display apparatus according to the first embodiment.
- FIG. 8A is a view of another example of the foreground image by the display apparatus according to the first embodiment.
- FIG. 8B is a view of another example of the background image by the display apparatus according to the first embodiment.
- FIG. 8C is a view of another example of the operation screen by the display apparatus according to the first embodiment.
- FIG. 9A is a view of yet another example of the operation screen by the display apparatus according to the first embodiment.
- FIG. 9B is a view of yet another example of the operation screen by the display apparatus according to the first embodiment.
- FIG. 9C is a view of yet another example of the operation screen by the display apparatus according to the first embodiment.
- FIG. 10A is a view of one example of a foreground image by a display apparatus according to a second embodiment.
- FIG. 10B is a view of one example of a background image by the display apparatus according to the second embodiment.
- FIG. 10C is a view of one example of an operation screen by the display apparatus according to the second embodiment.
- FIG. 11A is a view of the one example of the foreground image by the display apparatus according to the second embodiment.
- FIG. 11B is a view of the one example of the background image by the display apparatus according to the second embodiment.
- FIG. 11C is a view of the one example of the operation screen by the display apparatus according to the second embodiment.
- FIG. 12A is a view of the one example of the foreground image by the display apparatus according to the second embodiment.
- FIG. 12B is a view of the one example of the background image by the display apparatus according to the second embodiment.
- FIG. 12C is a view of the one example of the operation screen by the display apparatus according to the second embodiment.
- FIG. 1 is a block diagram that schematically shows one example of a configuration of a display apparatus 1 according to this embodiment.
- the display apparatus 1 includes a pointing device 10 , an electronic control unit (ECU) (a display control section) 20 , and a display (a display section) 30 .
- the display apparatus 1 is mounted on a vehicle and displays a screen (an operation screen) that includes a plurality of selection targets (a plurality of icons or the like) on the display 30 , a selection operation of which can be performed by using the pointing device 10 .
- a “vehicle” means the vehicle on which the display apparatus 1 is mounted unless otherwise noted.
- the pointing device 10 is an input device that specifies a position on the operation screen, and is a touch pad, for example.
- the pointing device 10 includes an operation section 11 and a recognition section 12 .
- the operation section 11 is a portion of the pointing device 10 that is operated by a finger (an operation finger) of a user (for example, an occupant such as a driver of the vehicle).
- the operation section 11 is a touch operation screen of the touch pad.
- the recognition section 12 recognizes the operation finger on the operation section 11 .
- the recognition section 12 is an electrostatic pad of the touch pad.
- the electrostatic pad has such a structure that an electrode (an electrostatic sensor) extends linearly in each of an X-direction and a Y-direction on a plane with an insulator being interposed therebetween, and outputs a detection signal of these electrodes (a signal corresponding to a change amount of electric charges stored in the electrodes).
- the recognition section 12 successively (that is, in every specified cycle) outputs a signal related to an operation state by the user (for example, a detection signal output by the electrostatic pad; hereinafter referred to as a "state signal") to the ECU 20.
- the recognition section 12 outputs the state signal that corresponds to each coordinate (an x-coordinate and a y-coordinate) in a predetermined x-y-coordinate system by the operation section 11 .
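As an illustrative sketch only, the per-cycle state signal could be modeled as a small record. The field names and types below are assumptions; the text specifies only that the signal conveys the recognized coordinates (or the absence of a finger) in every specified cycle.

```python
from dataclasses import dataclass


# Sketch of the "state signal" the recognition section 12 might emit to
# the ECU 20 once per cycle. Field names are assumptions, not the
# patent's wire format.
@dataclass
class StateSignal:
    finger_detected: bool  # whether the recognition section sees a finger
    x: float = 0.0         # x-coordinate on the operation section 11
    y: float = 0.0         # y-coordinate on the operation section 11
```

A cycle in which the finger has left the pad would then be represented as `StateSignal(finger_detected=False)`, which is exactly the condition the background image generation section later branches on.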
- the pointing device 10 is not limited to a touch pad or the like, as long as it includes the operation section 11 and the recognition section 12 that recognizes the operation finger on the operation section 11.
- the pointing device 10 may be a gesture recognition device that includes: a predetermined operation space (one example of the operation section 11 ); and a processing device (one example of the recognition section 12 ) that recognizes a position of the operation finger on the operation space on the basis of an image of the operation space captured by a camera.
- the pointing device 10 is provided at an appropriate place in a vehicle cabin and is preferably arranged at a position where the driver can easily operate the pointing device 10 , more specifically, a position where the driver's hand can reach the pointing device 10 while keeping a driving posture.
- the pointing device 10 (the touch pad in this example) may be arranged in a console box or around the console box.
- the ECU 20 is one example of the display control section that displays the operation screen on the display 30 , and is an electronic control unit that executes processing of generating the operation screen.
- the ECU 20 moves a cursor that selects one selection target of the plurality of selection targets on the operation screen in accordance with the position of the operation finger, which is recognized by the recognition section 12 , on the operation section 11 .
- the ECU 20 includes a microcomputer as a central component and realizes various types of control processing by executing, on a CPU, various programs stored in ROM.
- the ECU 20 includes a state reception section 21 , a foreground image generation section 22 , a background image generation section 23 , and a composition processing section 24 .
- the ECU 20 includes a storage section 29 as a predetermined storage area of non-volatile internal memory.
- the state reception section 21 executes processing of receiving the state signal that is received from the pointing device 10 in every specified cycle.
- the state reception section 21 transmits the received state signal to the foreground image generation section 22 and the background image generation section 23 .
- the foreground image generation section 22 generates a foreground image of the operation screen that includes: the plurality of selection targets (a plurality of target components such as the icons), the selection operation of which can be performed by using the pointing device 10 ; and a cursor for selecting the one selection target from the plurality of selection targets.
- the cursor indicates the selected target component by emphatically displaying the target component such as the icon. For example, the cursor displays the selected target component in a mode of increasing luminance of the selected target component to be higher than that of the other target components, a mode of displaying the color of the selected target component in a different mode from the color of the other target components, a mode of surrounding the selected target component with a frame, or the like.
- the foreground image generation section 22 determines a cursor position (that is, what is selected from the plurality of target components) on the basis of the state signal received from the state reception section 21 . More specifically, the foreground image generation section 22 first computes the position on the operation screen that corresponds to the state signal. For example, the foreground image generation section 22 stores first corresponding relationship information in the storage section 29 in advance, the first corresponding relationship information making change amounts of the x-coordinate and the y-coordinate on the pointing device 10 , which are based on the state signal, correlate with change amounts of the x-coordinate and the y-coordinate on the operation screen.
- the foreground image generation section 22 can compute the change amounts of the x-coordinate and the y-coordinate on the operation screen that correspond to an operation of the pointing device 10 . Then, with initial values of the x-coordinate and the y-coordinate (representative coordinates indicative of an initial position of the cursor upon activation of the display apparatus 1 ) on the operation screen being references, the foreground image generation section 22 can successively update the x-coordinate and the y-coordinate on the operation screen in accordance with the successively received state signal from the pointing device 10 .
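A minimal sketch of this accumulation, assuming the first corresponding relationship information reduces to constant per-axis gains and that the screen position is clamped to the display bounds (both the gain values and the clamping are assumptions for illustration):

```python
# Sketch: map a touch-pad displacement to on-screen coordinates using a
# stored gain (standing in for the "first corresponding relationship
# information"), accumulating from the current screen position.

GAIN_X, GAIN_Y = 4.0, 4.0        # assumed pad-units -> screen-pixels gains
SCREEN_W, SCREEN_H = 800, 480    # assumed display resolution


def update_screen_pos(pos, pad_dx, pad_dy):
    """Accumulate a pad displacement (pad_dx, pad_dy) onto pos, clamped."""
    x = min(max(pos[0] + pad_dx * GAIN_X, 0), SCREEN_W - 1)
    y = min(max(pos[1] + pad_dy * GAIN_Y, 0), SCREEN_H - 1)
    return (x, y)
```

Starting from the initial cursor coordinates, calling this once per received state signal reproduces the successive update the paragraph describes.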
- the foreground image generation section 22 determines a position of the cursor in accordance with the computed position on the operation screen. For example, the foreground image generation section 22 stores second corresponding relationship information in the storage section 29 in advance, the second corresponding relationship information making the position of the cursor (for example, a centroid coordinate of the cursor) on the operation screen correlate with the position on the operation screen. In this way, on the basis of said second corresponding relationship information, the foreground image generation section 22 can determine the position (the x-coordinate and the y-coordinate) of the cursor on the operation screen.
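One plausible concrete form of the second corresponding relationship information, shown purely as an assumption, is a nearest-centroid lookup: the computed screen position selects the target component whose centroid is closest, and the cursor is drawn at that centroid.

```python
# Sketch: resolve a computed screen position to a cursor position via a
# nearest-centroid rule. The icon centroids are assumed example values.

ICON_CENTROIDS = [(100, 100), (300, 100), (500, 100)]


def cursor_position(screen_pos):
    """Return the centroid of the icon the cursor should select."""
    return min(ICON_CENTROIDS,
               key=lambda c: (c[0] - screen_pos[0]) ** 2
                           + (c[1] - screen_pos[1]) ** 2)
```

A table lookup, a grid snap, or any other stored correspondence would serve equally well; the point is only that the mapping from screen position to cursor position is fixed in advance in the storage section 29.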
- the foreground image generation section 22 generates a foreground image in a mode of arranging the plurality of target components such as the icons at predetermined positions and arranging the cursor at the determined position on the operation screen (that is, on the one target component that is arranged at the same position as the determined position on the operation screen).
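As one hedged illustration of the second corresponding relationship, the computed screen position can be snapped to the nearest target component so the cursor always frames one icon. The icon roles match the five icons of this example (air conditioner, phone, navigation, audio, settings); the centroid coordinates and the nearest-centroid rule are assumptions:

```python
# Hypothetical sketch of the "second corresponding relationship":
# the computed screen position is mapped to a cursor position by
# snapping to the nearest target-component centroid.

ICON_CENTROIDS = {                 # assumed icon layout (x, y)
    "air_conditioner": (80, 300),
    "phone": (240, 300),
    "navigation": (400, 300),
    "audio": (560, 300),
    "settings": (720, 300),
}

def cursor_position(screen_pos):
    """Return the icon the cursor should frame and its centroid."""
    def dist2(c):
        return (c[0] - screen_pos[0]) ** 2 + (c[1] - screen_pos[1]) ** 2
    name = min(ICON_CENTROIDS, key=lambda k: dist2(ICON_CENTROIDS[k]))
    return name, ICON_CENTROIDS[name]
```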
- the background image generation section 23 generates a background image of the operation screen that includes a graphic object or a graphic object group as a collection of a large number of graphic objects (individual objects).
- the graphic object or the graphic object group is arranged across a specified area (for example, an area that includes an area where the plurality of target components are arranged) on the operation screen.
- the graphic object may be a curved-surface object that is arranged across the specified area on the operation screen and imitates a water surface.
- the graphic object group may be a collection of a large number of granular graphic objects (the individual objects) that are arranged across the specified area on the operation screen.
- the background image generation section 23 computes the position on the operation screen that corresponds to the state signal received from the state reception section 21 .
- the background image generation section 23 determines whether the pointing device 10 (the recognition section 12 ) recognizes the operation finger. That is, in the case where the pointing device 10 is the touch pad, the background image generation section 23 determines whether the user's finger is in contact with the touch pad. In the case where the pointing device 10 is the gesture recognition device, the background image generation section 23 determines whether the user's finger is held on the operation space.
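A minimal sketch of this device-dependent check, with the state-signal field names as assumptions not taken from the source:

```python
# Hypothetical sketch of the operation-finger check: the touch pad
# reports finger contact, while the gesture recognition device
# reports a finger held over the operation space. All field names
# are assumed for illustration.

def finger_recognized(state):
    """Return True when the pointing device recognizes a finger."""
    if state["device"] == "touch_pad":
        return state["finger_in_contact"]
    # gesture recognition device
    return state["finger_over_operation_space"]

touch = {"device": "touch_pad", "finger_in_contact": True}
gesture = {"device": "gesture", "finger_over_operation_space": False}
```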
- when determining that the pointing device 10 (the recognition section 12 ) does not recognize the operation finger, the background image generation section 23 generates the background image that includes the graphic object or the graphic object group in a predetermined display mode. For example, the background image generation section 23 may generate the background image that includes the graphic object or the graphic object group in the predetermined display mode in which a difference that visually attracts the user's attention is not included in the entire graphic object or the entire graphic object group.
- the background image generation section 23 changes a display mode of a part of the graphic object to differ from a display mode of the other parts of the graphic object or changes a display mode of a part of the graphic object group to differ from a display mode of the other parts of the graphic object group. More specifically, the background image generation section 23 defines third corresponding relationship information in advance and stores the third corresponding relationship information in the storage section 29 , the third corresponding relationship information making the part of the graphic object or the part of the graphic object group in the background image correlate with the position on the operation screen.
- the background image generation section 23 determines the part of the graphic object or the part of the graphic object group in the background image that corresponds to the computed position on the operation screen. Then, the background image generation section 23 generates the background image for which the determined part of the graphic object has the different display mode from the other parts of the graphic object or for which the determined part of the graphic object group has the different display mode from the other parts of the graphic object group.
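One hedged way to realize the third corresponding relationship is a lookup from the screen position to the indices of the nearby individual objects. The grain spacing, group size, and highlight count below are assumed values:

```python
# Hypothetical sketch of the "third corresponding relationship":
# the position on the operation screen selects which individual
# objects (grains) of the group get the different display mode.

GRAIN_SPACING = 20          # assumed lateral spacing between grains
NUM_GRAINS = 40             # assumed size of the object group
HIGHLIGHT_COUNT = 5         # assumed grains shown in the other mode

def highlighted_grains(screen_x):
    """Indices of the grains centered on the screen x-position,
    clamped to the ends of the object group."""
    center = round(screen_x / GRAIN_SPACING)
    half = HIGHLIGHT_COUNT // 2
    lo = max(center - half, 0)
    hi = min(center + half, NUM_GRAINS - 1)
    return list(range(lo, hi + 1))
```

Because the lookup always follows the current screen position, the highlighted part shifts along with the operation finger.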
- the “different display mode” means a display mode with a difference that is easily recognizable by the user who looks at the operation screen.
- the “different display mode” possibly includes a display mode with a different color, a display mode with different luminance, a display mode with a different size or a different shape (more specifically, a display mode in which a shape or size of the individual object corresponding to the part of the graphic object group differs from a shape or size of each of the other individual objects), a display mode with a different amount of displacement (including an amount of displacement in a virtual three-dimensional space in the background image), and the like.
- the background image that is generated by the background image generation section 23 will be described in detail below.
- the composition processing section 24 executes processing of composing the foreground image, which is generated by the foreground image generation section 22 , and the background image, which is generated by the background image generation section 23 , to generate the operation screen. Then, the composition processing section 24 transmits a command signal that includes the generated operation screen to the display 30 . In this way, the operation screen can be displayed on the display 30 .
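The composition step can be sketched as drawing the foreground over the background, so the graphic object group remains behind the target components and the cursor. The pixel-dictionary representation is purely illustrative, not the actual image format:

```python
# Minimal sketch of composing the background and foreground images,
# assuming each image is a dict of {(x, y): pixel} with the
# foreground treated as transparent wherever it has no entry.

def compose(foreground, background):
    """Foreground pixels overwrite background pixels; elsewhere the
    background shows through, so grains stay behind the icons."""
    screen = dict(background)
    screen.update(foreground)   # foreground drawn on top
    return screen

bg = {(0, 0): "grain", (1, 0): "grain"}
fg = {(1, 0): "icon"}
screen = compose(fg, bg)
```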
- the storage section 29 stores the first corresponding relationship information in advance, the first corresponding relationship information defining a corresponding relationship between a change amount of an x-y coordinate (the x-coordinate and the y-coordinate) on the pointing device 10 and a change amount of an x-y coordinate (the x-coordinate and the y-coordinate) on the operation screen.
- the storage section 29 also stores the second corresponding relationship information in advance, the second corresponding relationship information defining a corresponding relationship between the position (the x-coordinate and the y-coordinate) on the operation screen and the position of the cursor on the operation screen.
- the storage section 29 further stores the third corresponding relationship information in advance, the third corresponding relationship information making the part of the graphic object or the part of the graphic object group in the background image correlate with the position on the operation screen.
- an x-axis direction is one example of the first direction
- a y-axis direction is one example of the second direction.
- the display 30 is arranged at a remote position from the pointing device 10 and displays the operation screen that can be operated by the pointing device 10 in accordance with the command signal from the ECU 20 (the composition processing section 24 ).
- the display 30 is arranged at an appropriate position in the vehicle cabin, that is, at a position where the display 30 can easily and visually be recognized by the user (the driver).
- the display 30 may be arranged in a central section of an instrument panel.
- the display 30 may be display means that directly displays the operation screen on eyesight of the user.
- FIG. 3A to FIG. 5C are views, each of which shows the one example of the operation screen by the display apparatus 1 (more specifically, the foreground image generation section 22 , the background image generation section 23 , and the composition processing section 24 ) according to this embodiment.
- FIG. 3A , FIG. 3B , and FIG. 3C respectively show a foreground image 40 a , a background image 40 b , and an operation screen 40 in a case where the pointing device 10 (the recognition section 12 ) does not recognize the operation finger.
- FIG. 4A to FIG. 4C and FIG. 5A to FIG. 5C respectively show the foreground image 40 a , the background image 40 b , and the operation screen 40 in a case where the pointing device 10 (the recognition section 12 ) recognizes the operation finger. More specifically, FIG. 4A , FIG. 4B , and FIG. 4C respectively show the foreground image 40 a , the background image 40 b , and the operation screen 40 at a time when a state shown in FIG. 3A , FIG. 3B , and FIG. 3C (a state where the pointing device 10 (the recognition section 12 ) does not recognize the operation finger) is shifted to a state where the pointing device 10 (the recognition section 12 ) recognizes the operation finger.
- FIG. 5A , FIG. 5B , and FIG. 5C respectively show the foreground image 40 a , the background image 40 b , and the operation screen 40 after the operation finger that is recognized by the recognition section 12 starts being moved from the state shown in FIG. 4A , FIG. 4B , and FIG. 4C .
- the foreground image 40 a includes: icons 41 to 45 as the selection targets; and a cursor 46 that selects either one of the icons 41 to 45 .
- the icon 41 is a switch component that shifts to an operation screen for an air conditioner mounted on the vehicle.
- the icon 42 is a switch component that shifts to an operation screen for making a phone call.
- the icon 43 is a switch component that shifts to an operation screen for a navigation device mounted on the vehicle.
- the icon 44 is a switch component that shifts to an operation screen for an audio device mounted on the vehicle.
- the icon 45 is a switch component that shifts to an operation screen for making various types of setting of the display apparatus 1 , for example.
- Positions of the icons 41 to 45 on the operation screen 40 are defined in advance, and the icons 41 to 45 are sequentially arranged from a left end to a right end at a slightly lower position from center in a vertical direction of the foreground image 40 a (that is, the operation screen 40 ).
- a mode of surrounding the selected one of the icons 41 to 45 with a frame is adopted for the cursor 46 in this example.
- the foreground image generation section 22 determines a position of the cursor 46 on the foreground image 40 a (that is, the operation screen 40 ). Then, the foreground image generation section 22 arranges the icons 41 to 45 at the predetermined positions and generates the foreground image 40 a of the operation screen 40 on which the cursor 46 is arranged at the determined position (more specifically, the same position as either one of the icons 41 to 45 ). Because the pointing device 10 (the recognition section 12 ) does not recognize the operation finger in the state shown in FIG. 3A to FIG. 3C , the position of the cursor 46 corresponds to a position at termination of the last moving operation of the cursor 46 or the predetermined initial position of the cursor 46 .
- the cursor 46 is arranged at the same position as the icon 41 (that is, on the icon 41 ) and is in a state of selecting the icon 41 .
- the background image 40 b includes a graphic object group (hereinafter referred to as an “object group”) 47 that is configured by including a large number of the granular graphic objects (hereinafter simply referred to as “grains”).
- the object group 47 is arranged in a mode of arranging the grains as components in a lateral line across a specified area on the operation screen 40 , more specifically, a moving area of the cursor 46 in a lateral direction of the operation screen 40 at a lower end of the background image 40 b (that is, the operation screen 40 ).
- when determining that the pointing device 10 (the recognition section 12 ) does not recognize the operation finger, the background image generation section 23 generates the background image 40 b that includes the object group 47 in the predetermined display mode, that is, the display mode in which the grains are arranged in the lateral line (the object group 47 is arranged along the lateral direction) in this example.
- the composition processing section 24 composes the foreground image 40 a in FIG. 3A and the background image 40 b in FIG. 3B to generate the operation screen 40 .
- the operation screen 40 includes: the icons 41 to 45 that are arranged in line from the left end to the right end at the slightly lower position from the center in the vertical direction; and the cursor 46 that is arranged at the same position as the icon 41 (that is, on the icon 41 ).
- the operation screen 40 also includes the object group 47 that is configured by including the grains arranged in line from the left end to the right end at a position further below the icons 41 to 45 and the cursor 46 .
- the foreground image generation section 22 generates the same foreground image 40 a as in FIG. 3A .
- when determining that the pointing device 10 (the recognition section 12 ) recognizes the operation finger on the basis of the state signal from the state reception section 21 , as shown in FIG. 4B , the background image generation section 23 generates the background image 40 b in which grains 47 a as a part of the object group 47 , which is correlated with the position on the operation screen 40 corresponding to the state signal, are in a different display mode from the grains of the other part of the object group 47 . More specifically, the grains 47 a are significantly displaced in the vertical direction with respect to the other grains in the object group 47 .
- the composition processing section 24 composes the foreground image 40 a in FIG. 4A and the background image 40 b in FIG. 4B to generate the operation screen 40 .
- the operation screen 40 includes: the icons 41 to 45 that are arranged in line from the left end to the right end at the slightly lower position from the center in the vertical direction; and the cursor 46 that is arranged at the same position as the icon 41 (that is, on the icon 41 ).
- the operation screen 40 also includes the object group 47 that is arranged from the left end to the right end in the mode in which the grains 47 a are significantly displaced in the vertical direction with respect to the other grains in the object group 47 at the position further below the icons 41 to 45 and the cursor 46 .
- the grains 47 a are configured by including five grains centered on the lateral position on the operation screen 40 that corresponds to the state signal from the state reception section 21 (the pointing device 10 ) and are displayed in a mode of partially overlapping the icon 41 , which is selected by the cursor 46 , in the background.
- the amount of displacement in the vertical direction differs among the grains included in the grains 47 a .
- the grain closest to the lateral position on the operation screen 40 , which corresponds to the state signal, has the largest amount of displacement in the vertical direction, and the amount of displacement in the vertical direction is gradually reduced as the grain separates from that lateral position.
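The displacement profile just described (largest rise at the grain nearest the signal position, gradually reduced with distance) might be sketched as follows; the peak rise and falloff rate are assumed values, not taken from the source:

```python
# Hypothetical sketch of the grain-displacement profile: the grain
# nearest the lateral position given by the state signal is raised
# the most, and the rise falls off with lateral distance.

MAX_RISE = 30.0        # assumed peak vertical displacement (pixels)
FALLOFF = 10.0         # assumed decay per grain of lateral distance

def grain_rise(grain_x, signal_x, spacing=20):
    """Vertical displacement of one grain, decreasing with its
    distance from the position corresponding to the state signal."""
    steps = abs(grain_x - signal_x) / spacing
    return max(MAX_RISE - FALLOFF * steps, 0.0)
```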
- the part (the grains 47 a ) of the object group 47 is set in the different display mode from the other part of the object group 47 . More specifically, the grains 47 a of the object group 47 correlated with the position on the operation screen 40 , which corresponds to the state signal from the pointing device 10 , (in other words, the position of the cursor 46 ) are set in the different display mode from the other grains in the object group 47 . In this way, the user can easily comprehend the operation state on the operation screen 40 . For example, there is a case where the operation finger of the user possibly leaves the operation section 11 due to a vibration or the like of the vehicle.
- the user recognizes that the grains 47 a are displaced (raised) in the vertical direction, and thus can easily determine whether the operation screen 40 can be operated by the pointing device 10 (more specifically, whether an operable state continues).
- the user visually recognizes that the grains 47 a are displaced (raised) in the vertical direction at substantially the same lateral position as the cursor 46 on the operation screen 40 , and thus can easily comprehend (an indication of) the position of the cursor 46 at a start of the operation by using the pointing device 10 .
- the user cannot gaze at the display 30 .
- the grains 47 a as the part of the object group 47 are raised in the vertical direction.
- the user can easily comprehend whether the operation screen 40 can be operated by the pointing device 10 , the position of the cursor 46 , and the like.
- the grains 47 a are displayed as the background of the icons 41 to 45 and the cursor 46 .
- occurrence of a situation where visibility of the icons 41 to 45 and the cursor 46 worsens can be suppressed.
- the position of the cursor 46 is updated from the state shown in FIG. 4A along with the movement of the operation finger on the operation section 11 . More specifically, based on the state signal from the state reception section 21 , the foreground image generation section 22 determines (changes) the position of the cursor 46 to the same position as the icon 42 (that is, on the icon 42 ) on an immediate right side of the icon 41 and generates the foreground image 40 a shown in FIG. 5A .
- when determining that the pointing device 10 (the recognition section 12 ) recognizes the operation finger on the basis of the state signal from the state reception section 21 , as shown in FIG. 5B , the background image generation section 23 generates the background image 40 b in which grains 47 b as a part of the object group 47 , which is correlated with the position on the operation screen 40 corresponding to the state signal, are in a different display mode from the other grains of the object group 47 . At this time, the grains constituting the part of the object group 47 that is raised in the vertical direction are shifted from the grains 47 a to the grains 47 b in accordance with the movement of the operation finger on the operation section 11 .
- the grains 47 b are arranged on a right side of the grains 47 a , and the grains constituting the part of the object group 47 that is raised in the vertical direction are shifted to the right direction along a moving direction of the operation finger on the operation section 11 . That is, the third corresponding relationship information stored in the above-described storage section 29 is set such that the grains constituting the part of the object group 47 that is raised in the vertical direction are shifted in the same direction by following the movement of the position on the operation screen 40 that corresponds to the state signal from the state reception section 21 .
- the composition processing section 24 composes the foreground image 40 a in FIG. 5A and the background image 40 b in FIG. 5B to generate the operation screen 40 .
- the operation screen 40 includes: the icons 41 to 45 that are arranged in line from the left end to the right end (in the lateral direction) at the slightly lower position from the center in the vertical direction; and the cursor 46 that is arranged at the same position as the icon 42 (that is, on the icon 42 ).
- the operation screen 40 also includes the object group 47 that is arranged from the left end to the right end in the mode in which the grains 47 b are significantly displaced in the vertical direction with respect to the other grains at the position further below the icons 41 to 45 and the cursor 46 .
- the grains 47 b are configured by including the five grains with the lateral position on the operation screen 40 , which corresponds to the state signal from the state reception section 21 (the pointing device 10 ), being the center and are displayed in a mode of partially overlapping the icon 42 , which is selected by the cursor 46 , in the background.
- the part of the object group 47 in the different display mode is moved along the moving direction of the operation finger. More specifically, the grains (the grains 47 a , 47 b ) as the part of the object group 47 that are displayed in the different display mode (the mode of being raised in the vertical direction) are shifted in the same direction (an outlined arrow in FIG. 5C ) by following the movement of the position on the operation screen 40 that corresponds to the state signal from the state reception section 21 (the pointing device 10 ).
- the operation finger possibly leaves the operation section 11 due to the vibration or the like of the vehicle while the operation finger is moved on the operation section 11 .
- the user visually recognizes whether the part of the object group 47 in the different display mode is moved (shifted) along the moving direction of the operation finger, and thus can easily determine whether the moving operation of the cursor 46 by the pointing device 10 can appropriately be continued.
- the user recognizes the direction in which the grains, which constitute the part of the object group 47 and are displayed in the different display mode, are shifted, thereby easily comprehending the moving direction of the cursor 46 on the operation screen 40 .
- the object group 47 configured by arranging a large number of the grains in the lateral direction is used in this example; however, the grains of the object group 47 may be joined to create a single graphic object.
- when the state where the pointing device 10 (the recognition section 12 ) does not recognize the operation finger is shifted to the state where the pointing device 10 (the recognition section 12 ) recognizes the operation finger, a part of the graphic object that corresponds to the grains 47 a is raised in the vertical direction.
- the part of the graphic object raised in the vertical direction is shifted from the part that corresponds to the grains 47 a to a part that corresponds to the grains 47 b (that is, moved along the moving direction of the operation finger). Also, in such a modified example, the same operational effects as those in this example can be realized.
- the mode of significantly displacing (raising in the vertical direction) the part (the grains 47 a , 47 b ) of the graphic object group with respect to the other part of the graphic object group is adopted as the “different display mode” in this example.
- the disclosure is not limited to said mode.
- with reference to FIG. 6A to FIG. 8C , a description will hereinafter be made on an operation screen 50 that includes another example of the “different display mode.”
- FIG. 6A to FIG. 8C are views that show another example of the operation screen by the display apparatus 1 (more specifically, the foreground image generation section 22 , the background image generation section 23 , and the composition processing section 24 ) according to this embodiment.
- FIG. 6A , FIG. 6B , and FIG. 6C respectively show a foreground image 50 a , a background image 50 b , and the operation screen 50 in the case where the pointing device 10 (the recognition section 12 ) does not recognize the operation finger.
- FIG. 7A to FIG. 7C and FIG. 8A to FIG. 8C respectively show the foreground image 50 a , the background image 50 b , and the operation screen 50 in the case where the pointing device 10 (the recognition section 12 ) recognizes the operation finger. More specifically, FIG. 7A , FIG. 7B , and FIG. 7C respectively show the foreground image 50 a , the background image 50 b , and the operation screen 50 at a time when the state shown in FIG. 6A , FIG. 6B , and FIG. 6C (the state where the pointing device 10 (the recognition section 12 ) does not recognize the operation finger) is shifted to the state where the pointing device 10 (the recognition section 12 ) recognizes the operation finger.
- FIG. 8A , FIG. 8B , and FIG. 8C respectively show the foreground image 50 a , the background image 50 b , and the operation screen 50 after the operation finger that is recognized by the recognition section 12 starts being moved from the state shown in FIG. 7A , FIG. 7B , and FIG. 7C .
- the foreground image 50 a includes: songs 51 to 55 as the selection targets; and the cursor 56 that selects either one of the songs 51 to 55 .
- when either one of the songs 51 to 55 is selected with the cursor 56 by using the pointing device 10 , the selected song can be played from a specified audio source (for example, a CD or the like) in audio equipment that is mounted on the vehicle.
- Positions of the songs 51 to 55 on the operation screen 50 are defined in advance, and the songs 51 to 55 are sequentially arranged from an upper end to a lower end within an area in the lateral direction that excludes a left end of the foreground image 50 a (that is, the operation screen 50 ).
- a mode of surrounding the selected one of the songs 51 to 55 with the frame is adopted for the cursor 56 in this example.
- the foreground image generation section 22 determines a position of the cursor 56 on the foreground image 50 a (that is, the operation screen 50 ). Then, the foreground image generation section 22 arranges the songs 51 to 55 at the predetermined positions and generates the foreground image 50 a of the operation screen 50 on which the cursor 56 is arranged at the determined position (more specifically, the same position as either one of the songs 51 to 55 ). Because the pointing device 10 (the recognition section 12 ) does not recognize the operation finger in the state shown in FIG. 6A to FIG. 6C , the position of the cursor 56 corresponds to a position at termination of the last moving operation of the cursor 56 or a predetermined initial position of the cursor 56 .
- the cursor 56 is arranged at the same position as the song 52 (that is, on the song 52 ) and is in a state of selecting the song 52 .
- the background image 50 b includes a graphic object group (hereinafter referred to as an “object group”) 57 that is configured by including a large number of the granular graphic objects (hereinafter simply referred to as “grains”).
- the object group 57 is arranged such that the grains as components are arranged at equally spaced intervals from right to left and up to down across a specified area on the background image 50 b (that is, the operation screen 50 ), more specifically, an entire area on the operation screen 50 in the vertical direction and the lateral direction.
- when determining that the pointing device 10 (the recognition section 12 ) does not recognize the operation finger, the background image generation section 23 generates the background image 50 b that includes the object group 57 in the predetermined display mode, that is, the display mode in which the grains in the same color are arranged at the equally spaced intervals from right to left and up to down in this example.
- the foreground image generation section 22 generates the same foreground image 50 a as in FIG. 6A .
- when determining that the pointing device 10 (the recognition section 12 ) recognizes the operation finger on the basis of the state signal from the state reception section 21 , as shown in FIG. 7B , the background image generation section 23 generates the background image 50 b in which grains 57 a as a part of the object group 57 , which is correlated with the position on the operation screen 50 corresponding to the state signal, are in a different display mode from the grains of the other part of the object group 57 . More specifically, of the grains included in the object group 57 , the grains 57 a are grains in two rows near a vertical position on the operation screen 50 , which corresponds to the state signal, and have a different shape from the other grains in the object group 57 .
- the color of the grains 57 a which constitute the part of the object group 57 , is displayed in the different mode from the color of the grains constituting the other part of the object group 57 .
- the grains 57 a are configured by including the grains in the two rows centered on the vertical position on the operation screen 50 that corresponds to the state signal from the state reception section 21 (the pointing device 10 ) and are displayed in a mode of partially overlapping the song 52 , which is selected by the cursor 56 , in the background.
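As a sketch of this color-based display mode, the two rows of grains nearest the vertical position corresponding to the state signal can be selected for recoloring. The row spacing and row count below are assumptions for illustration:

```python
# Hypothetical sketch of the second display mode: in a 2-D grid of
# grains, the two rows nearest the vertical position given by the
# state signal are drawn in a different color from the rest.

def highlighted_rows(signal_y, spacing=40, num_rows=12):
    """Indices of the two rows nearest the vertical position.
    Python's sort is stable, so ties resolve to the upper row."""
    rows = sorted(range(num_rows),
                  key=lambda r: abs(r * spacing + spacing / 2 - signal_y))
    return sorted(rows[:2])
```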
- the part (the grains 57 a ) of the object group 57 is set in the different display mode from the other part of the object group 57 . More specifically, the grains 57 a of the object group 57 correlated with the position on the operation screen 50 , which corresponds to the state signal from the pointing device 10 , are set in the different display mode from the other grains in the object group 57 . In this way, the same operational effects as those in the example shown in FIG. 3A to FIG. 5C can be realized.
- the user recognizes a color change of the grains 57 a as the part of the object group 57 , and thus can easily determine whether the operation screen 50 can be operated by the pointing device 10 (more specifically, whether the operable state continues).
- the user can easily comprehend (an indication of) the position of the cursor 56 at the start of the operation by using the pointing device 10 .
- the grains 57 a are displayed as the background of the songs 51 to 55 and the cursor 56 . Thus, even when the color of the grains 57 a is changed, occurrence of a situation where visibility of the songs 51 to 55 and the cursor 56 worsens can be suppressed.
- the position of the cursor 56 is updated from the state shown in FIG. 7A along with the movement of the operation finger on the operation section 11 . More specifically, based on the state signal from the state reception section 21 , the foreground image generation section 22 determines (changes) the position of the cursor 56 to the same position as the song 53 (that is, on the song 53 ) on an immediate lower side of the song 52 and generates the foreground image 50 a shown in FIG. 8A .
- when determining that the pointing device 10 (the recognition section 12 ) recognizes the operation finger on the basis of the state signal from the state reception section 21 , as shown in FIG. 8B , the background image generation section 23 generates the background image 50 b in which grains 57 b as a part of the object group 57 , which is correlated with the position on the operation screen 50 corresponding to the state signal, are in a different display mode from the other grains of the object group 57 . At this time, the grains that constitute the part in the different color of the object group 57 are shifted from the grains 57 a to the grains 57 b in accordance with the movement of the operation finger on the operation section 11 .
- the grains 57 b are arranged on a lower side of the grains 57 a , and the grains that constitute the part in the different color of the object group 57 are shifted downward along the moving direction of the operation finger on the operation section 11 . That is, the third corresponding relationship information stored in the above-described storage section 29 is set such that the grains constituting the part in the different color of the object group 57 are shifted in the same direction by following the movement of the position on the operation screen 50 that corresponds to the state signal from the state reception section 21 .
- the composition processing section 24 composes the foreground image 50 a in FIG. 8A and the background image 50 b in FIG. 8B to generate the operation screen 50 .
- the operation screen 50 includes: the songs 51 to 55 that are arranged in line in the vertical direction; and the cursor 56 that is arranged at the same position as the song 53 (that is, on the song 53 ).
- the operation screen 50 also includes, in the background, the object group 57 configured by including the grains, which are arranged at the equally spaced intervals from the right to the left and the up to the down across the entire area, and the color of the grains 57 b as the part of the object group 57 is displayed in the different mode from the color of the other grains.
- the grains 57 b are configured by including the grains in the two rows with the vertical position on the operation screen 50 , which corresponds to the state signal from the state reception section 21 (the pointing device 10 ), being the center and are displayed in a mode of partially overlapping the song 53 , which is selected by the cursor 56 , in the background.
- the part of the object group 57 in the different display mode is moved along the moving direction of the operation finger. More specifically, the grains (the grains 57 a , 57 b ) that constitute the part of the object group 57 displayed in the different display mode (the mode of the different color from the other grains) are shifted in the same direction (an outlined arrow in FIG. 8C ) by following the movement of the position on the operation screen 50 that corresponds to the state signal from the state reception section 21 (the pointing device 10 ).
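The shifting of the highlighted grain rows described above can be sketched as a mapping from the vertical position reported by the pointing device to the indices of the two grain rows drawn in the different color. The function name and the row-pitch parameter are hypothetical:

```python
def highlighted_rows(pointer_y, row_pitch, num_rows):
    """Return the indices of the two grain rows to render in the
    highlight color, centered on the pointer's vertical position."""
    center = int(pointer_y // row_pitch)
    center = max(0, min(center, num_rows - 1))
    # Two adjacent rows around the center, clamped to the grid,
    # so the highlight follows the finger's movement downward.
    second = min(center + 1, num_rows - 1)
    return {center, second}
```

As the position corresponding to the state signal moves down, the returned indices increase, which reproduces the downward shift from the grains 57 a to the grains 57 b.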
- the same operational effects as those in the example shown in FIG. 3A to FIG. 5C can be realized.
- the user visually recognizes whether the part of the object group 57 in the different display mode is moved (shifted) along the moving direction of the operation finger, and thus can easily determine whether the moving operation of the cursor 56 by the pointing device 10 can appropriately be continued.
- the user recognizes the direction in which the part of the object group 57 in the different display mode are shifted, and can thereby easily comprehend the moving direction of the cursor 56 on the operation screen 50 .
- the object group 57 configured by arranging a number of the grains at the equally spaced intervals from the right to the left and the up to the down is used in this example; however, a planar graphic object (a plane object) that covers the area where the grains of the object group 57 are arranged may be used, for example.
- the part in the different color of the plane object is shifted from the part that corresponds to the grains 57 a to a part that corresponds to the grains 57 b . Also, in such a modified example, the same operational effects as those in this example can be realized.
- the graphic object group that is arranged in two dimensions is used in this example.
- a virtual three-dimensional space may be set in the background image 50 b , and the graphic object group that is arranged in said three-dimensional space may be used.
- a description will hereinafter be made on the operation screen that includes the graphic object group arranged in the virtual three-dimensional space with reference to FIGS. 9A to 9C .
- FIGS. 9A to 9C show yet another example of the operation screen by the display apparatus 1 (more specifically, the foreground image generation section 22 , the background image generation section 23 , and the composition processing section 24 ) according to this embodiment.
- FIG. 9A shows an operation screen 60 in the case where the pointing device 10 (the recognition section 12 ) does not recognize the operation finger.
- FIGS. 9B and 9C show the operation screen 60 in the case where the pointing device 10 (the recognition section 12 ) recognizes the operation finger. More specifically, FIG. 9B shows the operation screen 60 at a time when the state shown in FIG. 9A (the state where the pointing device 10 (the recognition section 12 ) does not recognize the operation finger) is shifted to a state where the pointing device 10 (the recognition section 12 ) recognizes the operation finger.
- FIG. 9C shows the operation screen 60 after the operation finger, which is recognized by the recognition section 12 , starts being moved from the state shown in FIG. 9B .
- the operation screen 60 includes, as components of a foreground image: icons 61 to 67 as the selection targets; and a cursor 68 that selects either one of the icons 61 to 67 .
- the icons 61 to 67 are sequentially arranged from the left to the right at a lower end of the operation screen 60 .
- a position of the cursor 68 corresponds to a position at termination of the last moving operation of the cursor 68 or a predetermined initial position of the cursor 68 .
- the cursor 68 is arranged at the same position as the icon 64 (that is, on the icon 64 ) and is in a state of selecting the icon 64 .
- the operation screen 60 includes, as components of a background image, a graphic object group (hereinafter simply referred to as an “object group”) 69 that is arranged in the virtual three-dimensional space on the operation screen 60 .
- the object group 69 is configured by including a large number of the granular graphic objects (hereinafter simply referred to as the “grains”) arranged in the virtual three-dimensional space in accordance with a specified rule, and is arranged (along the lateral direction) across a left end to a right end (an area including a moving area of the cursor 68 that moves in the lateral direction) of the operation screen 60 .
- grains 69 a constituting a part of the object group 69 are displayed in a different display mode from the other grains of the object group 69 . More specifically, the grains 69 a as the part of the object group 69 are displaced (raised) in a specified direction (the vertical direction) in the virtual three-dimensional space with respect to the grains as the other part of the object group 69 , and are displayed in a mode of partially overlapping the icon 64 , which is selected by the cursor 68 , in a background.
- the user can further easily comprehend the operation state on the operation screen 60 . That is, the user recognizes a raise in the grains 69 a , which constitute the part of the object group 69 , and thus can easily comprehend whether the operation screen 60 can be operated by the pointing device 10 , the position of the cursor 68 , and the like.
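The raised part of the object group 69 can be sketched as a displacement field over the grains: grains near the cursor's lateral position are lifted in the virtual three-dimensional space, with a smooth falloff toward the base plane. The cosine falloff and the `radius` and `max_raise` parameters are illustrative choices, not taken from the patent:

```python
import math

def grain_heights(grain_xs, cursor_x, max_raise=1.0, radius=3.0):
    """Vertical displacement for each grain in the virtual 3D space:
    grains near the cursor's lateral position are raised, the rest
    stay on the base plane."""
    heights = []
    for x in grain_xs:
        d = abs(x - cursor_x)
        if d < radius:
            # Smooth falloff: full raise at the cursor, zero at the edge.
            heights.append(max_raise * 0.5 * (1 + math.cos(math.pi * d / radius)))
        else:
            heights.append(0.0)
    return heights
```

Re-evaluating the heights for a new `cursor_x` as the operation finger moves would shift the raised region, as described for the grains 69 a and 69 b.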
- the cursor 68 is moved in the right direction along with the movement of the operation finger on the operation section 11 and is arranged at the same position as the icon 65 (that is, on the icon 65 ).
- when the position on the operation screen 60 that corresponds to the state signal is moved in the right direction in accordance with the movement of the operation finger on the operation section 11 , the grains that constitute the part in the different display mode of the object group 69 are shifted from the grains 69 a to the grains 69 b .
- the grains that constitute the raised part of the object group 69 are shifted in the right direction along the moving direction of the operation finger on the operation section 11 .
- operational effects similar to those in the examples shown in FIG. 3A to FIG. 5C and FIG. 6A to FIG. 8C can be realized.
- the user visually recognizes whether the part in the different display mode of the object group 69 is moved (shifted) along the moving direction of the operation finger, and thus can easily determine whether the moving operation of the cursor 68 by the pointing device 10 can appropriately be continued.
- the user recognizes a direction in which the grains constituting the part in the different display mode are shifted, and can thereby easily comprehend the moving direction of the cursor 68 on the operation screen 60 .
- the object group 69 is arranged in the virtual three-dimensional space, and the design quality of the operation screen 60 can thereby be improved.
- the object group 69 , which is configured by arranging a large number of the grains in the virtual three-dimensional space, is used in this example; however, one graphic object (a surface) that is formed by joining the grains of the object group 69 may be used, for example.
- a part of the surface that corresponds to the grains 69 a is raised in the virtual three-dimensional space.
- the part of the surface raised in the virtual three-dimensional space is shifted from the part that corresponds to the grains 69 a to a part that corresponds to the grains 69 b . Also, in such a modified example, the same operational effects as those in this example can be realized.
- a display apparatus 1 uses a pointing device 10 to display an operation screen that includes a plurality of selection targets (a plurality of items constituting a scrollable list or the like), a selection operation of which can be performed, on a display 30 . More specifically, the display apparatus 1 according to this embodiment differs from the display apparatus 1 according to the first embodiment in a point that the operation screen including the plurality of selection targets (the plurality of items constituting the list or the like) that are arranged along a specified axis on the operation screen, and a scrolling operation of which can be performed in a direction of said specified axis by using the pointing device 10 , is displayed on the display 30 .
- FIG. 1 A configuration of the display apparatus 1 according to this embodiment is shown in FIG. 1 as in the first embodiment.
- An ECU 20 is another example of the display control section that displays the operation screen on the display 30 , and is an electronic control unit that executes processing of generating the operation screen.
- the ECU 20 scrolls the plurality of selection targets (the plurality of items constituting the list or the like) that are arranged along the specified axis on the operation screen.
- the ECU 20 includes a state reception section 21 , a foreground image generation section 22 , a background image generation section 23 , a composition processing section 24 , and a storage section 29 .
- the foreground image generation section 22 determines arrangement of the plurality of selection targets (the plurality of items constituting the list or the like) on the operation screen. Then, the foreground image generation section 22 generates a foreground image of the operation screen that includes at least a part of the selection targets of the plurality of selection targets. This is because the number of the plurality of selection targets (the number of the items constituting the list) that can be selected by the scrolling operation using the pointing device 10 is usually larger than the number of the selection targets that can be displayed on the operation screen.
- the foreground image generation section 22 determines a scrolling amount, a scrolling direction, and the like of (the plurality of selection targets constituting) the list or the like. For example, the foreground image generation section 22 confirms a content of the scrolling operation (a type or the like of the scrolling operation) on the basis of the state signal. For example, in the case where the pointing device 10 is a touch pad, a scrolling operation by “dragging (tracing)” and a scrolling operation by “flicking” are available as the types of the scrolling operation.
- the scrolling operation by “dragging” is an operation of moving a finger at a relatively low speed while the finger remains in contact with an operation surface of the touch pad.
- the scrolling operation by “flicking” is an operation of moving the finger at a relatively high speed in a mode of snapping the operation surface of the touch pad with the finger in a direction of the scrolling operation. For example, based on a determination on whether change amounts of the x-coordinate and the y-coordinate on the touch pad, which are based on the state signal received from the state reception section 21 , are each equal to or larger than a specified threshold, the foreground image generation section 22 can determine whether the scrolling operation is by “dragging” or by “flicking”.
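The drag/flick determination described above, based on whether the per-update coordinate changes reach a specified threshold, could be sketched as follows; the threshold value and function name are assumptions:

```python
def classify_scroll(dx, dy, threshold=30):
    """Classify a touch-pad gesture: 'flick' when either coordinate
    change reaches the threshold, otherwise 'drag'."""
    if abs(dx) >= threshold or abs(dy) >= threshold:
        return "flick"
    return "drag"
```

A small, slow movement of the finger is thus treated as dragging, while a fast snapping movement that produces a large per-update change is treated as flicking.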
- the foreground image generation section 22 determines the scrolling amount and the scrolling direction on the basis of the change amounts of the x-coordinate and the y-coordinate on the touch pad based on the state signal. For example, the foreground image generation section 22 stores fourth corresponding relationship information in the storage section 29 in advance, the fourth corresponding relationship information making the scrolling amount and the scrolling direction correlate with the change amounts of the x-coordinate and the y-coordinate on the touch pad. In this way, based on the fourth corresponding relationship information, the foreground image generation section 22 can determine the scrolling amount and the scrolling direction.
- the foreground image generation section 22 determines a specified value, which is defined in advance for the scrolling operation by “flicking”, as the scrolling amount, and determines the scrolling direction on the basis of the change amounts of the x-coordinate and the y-coordinate on the touch pad based on the state signal.
- the foreground image generation section 22 determines the arrangement of the selection targets in accordance with the determined scrolling amount and the determined scrolling direction, and generates the foreground image of the operation screen that includes at least the part of the selection targets of the plurality of selection targets (the plurality of items constituting the list).
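Determining the scrolling amount and direction from the coordinate changes, with a proportional amount for dragging (per the fourth corresponding relationship information) and a predetermined amount for flicking, might look like this sketch; the `gain` and `flick_amount` values are placeholders for the stored corresponding relationship information:

```python
def scroll_params(dx, dy, gesture, gain=0.5, flick_amount=5):
    """Scroll amount (in list items) and direction from the coordinate
    changes: proportional for a drag, a fixed preset for a flick."""
    # The dominant axis decides the scrolling direction.
    if abs(dy) >= abs(dx):
        direction = "down" if dy > 0 else "up"
        amount = abs(dy) * gain
    else:
        direction = "right" if dx > 0 else "left"
        amount = abs(dx) * gain
    if gesture == "flick":
        amount = flick_amount  # predetermined value for flicking
    return int(amount), direction
```

The foreground image generation section would then rearrange the visible selection targets by this amount in this direction.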
- types corresponding to the scrolling operation by “dragging” and the scrolling operation by “flicking” on the touch pad can be provided in accordance with a speed of a gesture.
- the background image generation section 23 generates a background image of the operation screen that includes a graphic object arranged across a specified area on the operation screen or a graphic object group as collection of a large number of the graphic objects (individual objects).
- the background image generation section 23 determines whether the pointing device 10 (the recognition section 12 ) recognizes the operation finger.
- when determining that the pointing device 10 (the recognition section 12 ) does not recognize the operation finger, the background image generation section 23 generates the background image that includes the graphic object or the graphic object group in a predetermined display mode.
- the background image generation section 23 may generate the background image that includes the graphic object or the graphic object group in a predetermined display mode in which a difference that visually attracts the user's attention is not included in the entire graphic object or the entire graphic object group.
- when determining that the pointing device 10 (the recognition section 12 ) recognizes the operation finger, the background image generation section 23 changes a display mode of a part of the graphic object to differ from that of the other parts of the graphic object, or changes a display mode of a part of the graphic object group to differ from that of the other parts of the graphic object group. Then, in accordance with the content of the scrolling operation (the type or the like of the scrolling operation), the background image generation section 23 shifts (moves) the part in the different display mode of the graphic object or the graphic object group in the same direction as the scrolling operation.
- the background image generation section 23 confirms the content of the scrolling operation (the type or the like of the scrolling operation) on the basis of the state signal received from the state reception section 21 . Then, when determining that the scrolling operation is by “dragging”, the background image generation section 23 determines the scrolling amount and the scrolling direction on the basis of the fourth corresponding relationship information.
- the background image generation section 23 determines the specified value that is defined in advance for the scrolling operation by “flicking” as the scrolling amount, and determines the scrolling direction on the basis of the change amounts of the x-coordinate and the y-coordinate on the touch pad based on the state signal. Then, in accordance with the determined scrolling amount and the determined scrolling direction, the background image generation section 23 generates the background image in a mode in which the part in the different display mode of the graphic object or the graphic object group is shifted (moved) in the same direction as the scrolling operation in a period that corresponds to the content of the scrolling operation.
- the background image generated by the background image generation section 23 will be described in detail below.
- the “period that corresponds to the content of the scrolling operation” means a period that is defined in advance in accordance with the content of the scrolling operation (the type of the scrolling operation).
- “period that corresponds to the content of the scrolling operation” may be a period in which the scrolling operation continues (that is, a period in which the recognition section 12 recognizes the operation finger).
- a predetermined period is set for the scrolling operation by “flicking”.
- the plurality of selection targets are scrolled at a relatively high speed in the scrolling operation by “flicking”.
- the predetermined period may be a relatively short period.
- FIG. 10A to FIG. 12C show one example of the operation screen by the display apparatus 1 (more specifically, the foreground image generation section 22 , the background image generation section 23 , and the composition processing section 24 ) according to this embodiment.
- FIG. 10A , FIG. 10B , and FIG. 10C respectively show a foreground image 70 a , a background image 70 b , and an operation screen 70 in the case where the pointing device 10 (the recognition section 12 ) does not recognize the operation finger.
- FIGS. 11A to 12C show the foreground image 70 a , the background image 70 b , and the operation screen 70 in the case where the pointing device 10 (the recognition section 12 ) recognizes the operation finger. More specifically, FIG. 11A , FIG. 11B , and FIG. 11C respectively show the foreground image 70 a , the background image 70 b , and the operation screen 70 at a time when the state shown in FIG. 10A , FIG. 10B , and FIG. 10C (the state where the pointing device 10 (the recognition section 12 ) does not recognize the operation finger) is shifted to a state where the pointing device 10 (the recognition section 12 ) recognizes the operation finger.
- a song list 71 includes a plurality of songs 71 A to 71 Z ( 26 songs), an example of a plurality of selection targets, that are more than the displayable songs ( 5 songs) on the operation screen 70 .
- the foreground image 70 a includes: the songs 71 A to 71 E as a part of the songs included in the song list 71 ; and a fixed cursor 76 that selects either one of the songs 71 A to 71 Z included in the song list 71 .
- the songs 71 A to 71 E are sequentially arranged from an upper end to a lower end within an area in a lateral direction that excludes a left end of the foreground image 70 a (that is, the operation screen 70 ).
- the cursor 76 is fixed at the lower end within the area in the lateral direction that excludes the left end of the foreground image 70 a (that is, the operation screen 70 ).
- the cursor 76 and the song 71 E are arranged at the same position (the cursor 76 is arranged on the song 71 E), and the song 71 E is selected.
- the background image 70 b includes a graphic object group (hereinafter referred to as an “object group”) 77 that is configured by including a large number of granular graphic objects (hereinafter simply referred to as “grains”).
- the object group 77 is arranged such that the grains as components are arranged at equally spaced intervals from right to left and up to down across a specified area on the background image 70 b (that is, the operation screen 70 ), more specifically, an entire area on the operation screen 70 in a vertical direction and the lateral direction.
- when determining that the pointing device 10 (the recognition section 12 ) does not recognize the operation finger, the background image generation section 23 generates the background image 70 b that includes the object group 77 in a predetermined display mode, that is, a display mode in which the grains in the same color are arranged at the equally spaced intervals from the right to the left and the up to the down in this example.
- the composition processing section 24 composes the foreground image 70 a in FIG. 10A and the background image 70 b in FIG. 10B to generate the operation screen 70 .
- the operation screen 70 includes: the songs 71 A to 71 E as the part of the song list 71 that is arranged in line in the vertical direction; and the cursor 76 that is arranged at the same position as the song 71 E (that is, on the song 71 E).
- the operation screen 70 also includes, in a background, the object group 77 configured by including the grains in the same color, which are arranged at the equally spaced intervals from the right to the left and the up to the down across the entire area.
- the foreground image generation section 22 generates the same foreground image 70 a as in FIG. 10A .
- when determining that the pointing device 10 (the recognition section 12 ) recognizes the operation finger on the basis of the state signal from the state reception section 21 , the background image generation section 23 generates, as shown in FIG. 11B , the background image 70 b in which the grains 77 a that constitute a part of the object group 77 are in a different display mode from the other grains of the object group 77 . More specifically, of the grains included in the object group 77 , the grains 77 a are grains in two rows that are located at the upper end of the operation screen 70 , and have a different color from the other grains of the object group 77 .
- the composition processing section 24 composes the foreground image 70 a in FIG. 11A and the background image 70 b in FIG. 11B to generate the operation screen 70 .
- the operation screen 70 includes: the songs 71 A to 71 E as the part of the song list 71 that is arranged in line in the vertical direction; and the cursor 76 that is arranged at the same position as the song 71 E (that is, on the song 71 E).
- the operation screen 70 also includes the object group 77 configured by including the grains, which are arranged at the equally spaced intervals from the right to the left and the up to the down across the entire area, in the background, and the color of the grains 77 a that constitute the part of the object group 77 is displayed in the different mode from the color of the other grains of the object group 77 .
- the grains 77 a are configured by including the grains in the two rows that are located at the upper end of the operation screen 70 .
- the part (the grains 77 a ) of the object group 77 is set in the different display mode from the other part (the other grains) of the object group 77 .
- the user can further easily comprehend the operation state on the operation screen. More specifically, the user recognizes a color change of the grains 77 a as the part of the object group 77 , and thus can easily determine whether the operation screen 70 can be operated by the pointing device 10 (more specifically, whether the operable state continues).
- the user can easily comprehend that the song list 71 is scrolled in accordance with the operation using the pointing device 10 .
- the displayable songs are updated from the state shown in FIG. 11A in accordance with movement of the operation finger on the operation section 11 . More specifically, when the song list 71 is scrolled downward, the foreground image 70 a no longer includes the songs 71 A, 71 B but includes the songs 71 Y, 71 Z instead. In a state shown in FIG. 12A , the cursor 76 and the song 71 C are arranged at the same position (the cursor 76 is arranged on the song 71 C), and the song 71 C is selected.
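The update of the five displayable songs while the 26-song list 71 is scrolled, including the wrap-around from the songs 71 Y, 71 Z back to 71 A, can be sketched as a circular window over the list. The index arithmetic and names below are illustrative assumptions:

```python
SONGS = [chr(c) for c in range(ord("A"), ord("Z") + 1)]  # the 26 songs

def visible_window(top_index, count=5):
    """The songs shown on screen: a wrap-around slice of the list,
    starting at top_index (scrolling changes top_index)."""
    return [SONGS[(top_index + i) % len(SONGS)] for i in range(count)]
```

Scrolling the list downward by two items moves `top_index` from 0 to 24, so the window changes from songs A to E to songs Y, Z, A, B, C, matching the state shown in FIG. 12A.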
- the background image generation section 23 generates the background image 70 b in which the grains 77 b that constitute a part of the object group 77 are in a different display mode from the other grains of the object group 77 .
- the grains as the part in the different color of the object group 77 are shifted from the grains 77 a to the grains 77 b in accordance with the movement of the operation finger on the operation section 11 .
- the grains 77 b are arranged on a lower side of the grains 77 a , and the grains that constitute the part in the different color of the object group 77 are shifted downward along the moving direction of the operation finger on the operation section 11 .
- the composition processing section 24 composes the foreground image 70 a in FIG. 12A and the background image 70 b in FIG. 12B to generate the operation screen 70 .
- the operation screen 70 includes: the songs 71 Y, 71 Z and the songs 71 A to 71 C as the part of the song list 71 that is arranged in line in the vertical direction; and the cursor 76 that is arranged at the same position as the song 71 C (that is, on the song 71 C).
- the operation screen 70 also includes the object group 77 configured by including the grains, which are arranged at the equally spaced intervals from the right to the left and the up to the down across the entire area, in the background, and the color of the grains 77 b that constitute the part of the object group 77 is displayed in the different mode from the color of the other grains of the object group 77 .
- the grains 77 b are configured by including the grains in the two rows of the object group 77 that are located near the center in the vertical direction of the operation screen 70 .
- the grains (the grains 77 a , 77 b ) that constitute the part of the object group 77 displayed in the different display mode (the mode of the different color from the other grains) are shifted along the moving direction (the down direction) of the operation finger (an outlined arrow in FIG. 12C ). Accordingly, the user visually recognizes whether the part in the different display mode of the object group 77 is moved (shifted) along the moving direction of the operation finger, and thus can easily determine whether the scrolling operation of the song list 71 by the pointing device 10 can appropriately be continued.
- the user recognizes a direction in which the grains constituting the part displayed in the different display mode of the object group 77 displayed as the background are shifted, and can thereby easily comprehend the scrolling direction on the operation screen 70 in accordance with the operation using the pointing device 10 .
- the object group 77 configured by including a number of the grains, which are arranged at the equally spaced intervals from the right to the left and the up to the down, is used in this example; however, a planar graphic object (a plane object) that covers an area where the grains of the object group 77 are arranged may be used, for example.
- the part in the different color of the plane object is shifted from the part that corresponds to the grains 77 a to a part that corresponds to the grains 77 b . Also, in such a modified example, the same operational effects as those in this example can be realized.
- the graphic object group that is arranged in a plane on the operation screen is used in this example.
- a virtual three-dimensional space may be set in the background image 70 b , and a graphic object or a graphic object group that is arranged in said three-dimensional space may be used.
- the same operational effects as those in this example can be realized.
Abstract
A vehicular display apparatus includes a display section, a pointing device, and a display control section. The display control section is configured to display an operation screen on the display section. The display control section is configured to generate an image that includes one of an object and an object group across a specified area on the operation screen, and execute processing of setting a part of the one of the object and the object group in a different display mode from the other part of the one of the object and the object group when the operation finger is recognized by a recognition section.
Description
- This is a Continuation of application Ser. No. 15/459,595 filed Mar. 15, 2017, which claims priority to Japanese Patent Application No. 2016-077516 filed on Apr. 7, 2016. The entire disclosure of the prior applications is hereby incorporated by reference herein in its entirety.
- The disclosure relates to a vehicular display apparatus that displays an operation screen.
- A vehicular display apparatus that displays a screen (an operation screen) enabling various operations of in-vehicle equipment by using a pointing device, which recognizes a position of a user's finger (hereinafter referred to as an “operation finger”), in an operation section has been known (for example, see Japanese Patent Application Publication No. 2004-345549 (JP 2004-345549 A)).
- In JP 2004-345549 A, a technique of recognizing the position of the operation finger in the operation section and moving a cursor that selects an icon on the operation screen on the basis of position information of the operation finger when the user contacts the operation section of the pointing device and performs a slide operation of a specified amount or larger has been disclosed.
- In addition, a technique of scrolling a plurality of selection targets (for example, a list) on the operation screen in accordance with movement of the operation finger in the operation section of the pointing device has been known.
- However, when the operation finger leaves the operation section even slightly due to a vibration or the like of a vehicle, there is a possibility that the pointing device can no longer recognize the operation finger. At this time, unless the user moves the operation finger, the user cannot determine whether the pointing device recognizes the operation finger and thus the operation screen can be operated, or the pointing device does not recognize the operation finger and thus the operation screen cannot be operated. That is, unless the user checks whether the cursor can be moved or whether the list or the like can be scrolled, the user cannot determine whether the pointing device can operate the operation screen.
- The disclosure provides a vehicular display apparatus that allows a user to easily determine whether a pointing device can operate an operation screen.
- A vehicular display apparatus according to a first aspect of the disclosure includes: a display section; a pointing device that includes an operation section and a recognition section that recognizes an operation finger operating the operation section; and a display control section that is configured to display an operation screen on the display section, the operation screen including a plurality of selection targets, a selection operation of which is able to be performed by using the pointing device. The display control section is configured to: generate at least one of an image in which a cursor for selecting one selection target from the plurality of selection targets is moved and an image in which the plurality of selection targets that are arranged along a specified axis on the operation screen are scrolled in accordance with a position of the operation finger, which is recognized by the recognition section, on the operation section; generate an image that includes one of an object and an object group across a specified area on the operation screen; and execute processing of setting a part of the one of the object and the object group in a different display mode from the other part of the one of the object and the object group when the operation finger is recognized by the recognition section.
- According to the above configuration, in the case where the pointing device (the recognition section) recognizes the operation finger, the part of the object or the object group, which is displayed on the operation screen, is in the different display mode (for example, a color, luminance, a shape, size, displacement, or the like differs) from the other part thereof. Accordingly, the user visually recognizes presence or absence of a partial change in the display mode of the object or the object group on the operation screen, and thus can easily determine whether the operation screen can be operated by the pointing device.
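As a rough sketch only (not part of the disclosure), the partial display-mode change described above might be modeled as follows; the grain count, the function names, and the highlight window width are illustrative assumptions.

```python
N_GRAINS = 64  # number of individual objects (grains) in the object group; assumed

def display_modes(finger_recognized, finger_index, window=5):
    """Return a display mode per grain: grains near the recognized finger
    position are set to a different mode ("raised") from the rest; when the
    finger is not recognized, every grain stays in the uniform mode."""
    modes = ["normal"] * N_GRAINS
    if finger_recognized:
        half = window // 2
        for i in range(max(0, finger_index - half),
                       min(N_GRAINS, finger_index + half + 1)):
            modes[i] = "raised"
    return modes
```

The uniform return to "normal" when recognition is lost is the visual cue the user relies on to judge operability.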
- In addition, in the vehicular display apparatus according to the above aspect, the display control section may be configured to generate an image in which the part in the different display mode of the one of the object and the object group is moved along a moving direction of the operation finger when the operation finger, which is recognized by the recognition section, is moved with respect to the operation section.
- According to the above configuration, in the case where the operation finger, which is recognized by the pointing device (the recognition section), is moved, the part in the different display mode of the object or the object group is moved along the moving direction of the operation finger. For example, there is a case where the operation finger possibly leaves the operation section due to a vibration of a vehicle or the like while the operation finger is moved. In such a case, the user visually recognizes whether the part in the different display mode is moved along the moving direction of the operation finger, and thus can easily determine whether a moving operation of the cursor or a scrolling operation of a list or the like by the pointing device can appropriately be continued.
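The movement described here can be sketched as shifting the index of the raised part by the finger's displacement; the gain value, names, and clamping behavior are assumptions for illustration.

```python
def move_highlight(raised_index, pad_dx, gain=0.5, n_grains=64):
    """Shift the raised part of the object group along the finger's moving
    direction: a positive pad delta moves the highlight toward the right,
    clamped to the ends of the group."""
    shifted = raised_index + round(pad_dx * gain)
    return min(max(shifted, 0), n_grains - 1)
```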
- In addition, in the vehicular display apparatus according to the above aspect, the display control section may be configured to generate the operation screen by composing a foreground image that includes the plurality of selection targets and a background image that includes one of the object and the object group.
- According to the above configuration, the object or the object group is displayed as a background of the plurality of selection targets (a plurality of icons or the list configured by including a plurality of items) or the like that are displayed as a foreground. Thus, occurrence of a situation where visibility of the plurality of selection targets, the cursor indicative of a selection status of the plurality of selection targets, or the like worsens, which is resulted from a change in the display mode or the like of the object group, can be suppressed.
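Conceptually, the composition amounts to drawing the foreground over the background with per-pixel alpha. A minimal sketch follows; the flat pixel-list representation is an assumption, and a real implementation would use a GPU compositor or an imaging library.

```python
def compose(background, foreground):
    """Alpha-blend a foreground image over a background image.
    Each image is a flat list of (r, g, b, a) tuples with a in [0, 1];
    fully transparent foreground pixels leave the background visible."""
    out = []
    for (br, bg, bb, _), (fr, fg, fb, fa) in zip(background, foreground):
        out.append((fr * fa + br * (1.0 - fa),
                    fg * fa + bg * (1.0 - fa),
                    fb * fa + bb * (1.0 - fa),
                    1.0))
    return out
```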
- The disclosure can provide the vehicular display apparatus that allows the user to easily determine whether the operation screen can be operated by the pointing device.
- Features, advantages, and technical and industrial significance of exemplary embodiments of the disclosure will be described below with reference to the accompanying drawings, in which like numerals denote like elements, and wherein:
-
FIG. 1 is a block diagram that schematically shows one example of a configuration of a display apparatus; -
FIG. 2 is a view of one example of an in-vehicle mode of the display apparatus (a pointing device and a display); -
FIG. 3A is a view of one example of a foreground image by a display apparatus according to a first embodiment; -
FIG. 3B is a view of one example of a background image by the display apparatus according to the first embodiment; -
FIG. 3C is a view of one example of an operation screen by the display apparatus according to the first embodiment; -
FIG. 4A is a view of the one example of the foreground image by the display apparatus according to the first embodiment; -
FIG. 4B is a view of the one example of the background image by the display apparatus according to the first embodiment; -
FIG. 4C is a view of the one example of the operation screen by the display apparatus according to the first embodiment; -
FIG. 5A is a view of the one example of the foreground image by the display apparatus according to the first embodiment; -
FIG. 5B is a view of the one example of the background image by the display apparatus according to the first embodiment; -
FIG. 5C is a view of the one example of the operation screen by the display apparatus according to the first embodiment; -
FIG. 6A is a view of another example of the foreground image by the display apparatus according to the first embodiment; -
FIG. 6B is a view of another example of the background image by the display apparatus according to the first embodiment; -
FIG. 6C is a view of another example of the operation screen by the display apparatus according to the first embodiment; -
FIG. 7A is a view of another example of the foreground image by the display apparatus according to the first embodiment; -
FIG. 7B is a view of another example of the background image by the display apparatus according to the first embodiment; -
FIG. 7C is a view of another example of the operation screen by the display apparatus according to the first embodiment; -
FIG. 8A is a view of another example of the foreground image by the display apparatus according to the first embodiment; -
FIG. 8B is a view of another example of the background image by the display apparatus according to the first embodiment; -
FIG. 8C is a view of another example of the operation screen by the display apparatus according to the first embodiment; -
FIG. 9A is a view of yet another example of the operation screen by the display apparatus according to the first embodiment; -
FIG. 9B is a view of yet another example of the operation screen by the display apparatus according to the first embodiment; -
FIG. 9C is a view of yet another example of the operation screen by the display apparatus according to the first embodiment; -
FIG. 10A is a view of one example of a foreground image by a display apparatus according to a second embodiment; -
FIG. 10B is a view of one example of a background image by the display apparatus according to the second embodiment; -
FIG. 10C is a view of one example of an operation screen by the display apparatus according to the second embodiment; -
FIG. 11A is a view of the one example of the foreground image by the display apparatus according to the second embodiment; -
FIG. 11B is a view of the one example of the background image by the display apparatus according to the second embodiment; -
FIG. 11C is a view of the one example of the operation screen by the display apparatus according to the second embodiment; -
FIG. 12A is a view of the one example of the foreground image by the display apparatus according to the second embodiment; -
FIG. 12B is a view of the one example of the background image by the display apparatus according to the second embodiment; and -
FIG. 12C is a view of the one example of the operation screen by the display apparatus according to the second embodiment. - A description will hereinafter be made on modes for carrying out the disclosure with reference to the drawings.
-
FIG. 1 is a block diagram that schematically shows one example of a configuration of a display apparatus 1 according to this embodiment. The display apparatus 1 includes a pointing device 10, an electronic control unit (ECU) (a display control section) 20, and a display (a display section) 30. For example, the display apparatus 1 is mounted on a vehicle and displays, on the display 30, a screen (an operation screen) that includes a plurality of selection targets (a plurality of icons or the like), a selection operation of which can be performed by using the pointing device 10. Hereinafter, a "vehicle" means the vehicle on which the display apparatus 1 is mounted unless otherwise noted. - The
pointing device 10 is an input device that specifies a position on the operation screen, and is a touch pad, for example. The pointing device 10 includes an operation section 11 and a recognition section 12. - The operation section 11 is a portion of the
pointing device 10 that is operated by a finger (an operation finger) of a user (for example, an occupant such as a driver of the vehicle). For example, the operation section 11 is a touch operation screen of the touch pad. - The
recognition section 12 recognizes the operation finger on the operation section 11. For example, the recognition section 12 is an electrostatic pad of the touch pad. For example, the electrostatic pad has such a structure that an electrode (an electrostatic sensor) extends linearly in each of an X-direction and a Y-direction on a plane with an insulator being interposed therebetween, and outputs a detection signal of these electrodes (a signal corresponding to a change amount of electric charges stored in the electrodes). The recognition section 12 successively (that is, in every specified cycle) outputs a signal related to an operation state by the user (for example, a detection signal output by the electrostatic pad, hereinafter referred to as a "state signal") to the ECU 20. For example, the recognition section 12 outputs the state signal that corresponds to each coordinate (an x-coordinate and a y-coordinate) in a predetermined x-y coordinate system of the operation section 11. - Note that the
pointing device 10 is not limited to the touch pad or the like as long as it adopts a mode of including: the operation section 11; and the recognition section 12 that recognizes the operation finger on the operation section 11. For example, the pointing device 10 may be a gesture recognition device that includes: a predetermined operation space (one example of the operation section 11); and a processing device (one example of the recognition section 12) that recognizes a position of the operation finger in the operation space on the basis of an image of the operation space captured by a camera. - The
pointing device 10 is provided at an appropriate place in a vehicle cabin and is preferably arranged at a position where the driver can easily operate the pointing device 10, more specifically, a position where the driver's hand can reach the pointing device 10 while keeping a driving posture. For example, as shown in FIG. 2 (a view of one example of an in-vehicle mode of the display apparatus 1), the pointing device 10 (the touch pad in this example) may be arranged in a console box or around the console box. - The
ECU 20 is one example of the display control section that displays the operation screen on the display 30, and is an electronic control unit that executes processing of generating the operation screen. The ECU 20 moves a cursor that selects one selection target of the plurality of selection targets on the operation screen in accordance with the position of the operation finger, which is recognized by the recognition section 12, on the operation section 11. For example, the ECU 20 is configured by including a microcomputer as a central component and realizes various types of control processing by executing various programs stored in ROM on a CPU. As functional sections that are realized by executing the one or more programs on the CPU, the ECU 20 includes a state reception section 21, a foreground image generation section 22, a background image generation section 23, and a composition processing section 24. In addition, the ECU 20 includes a storage section 29 as a predetermined storage area of non-volatile internal memory. - The
state reception section 21 executes processing of receiving the state signal that is transmitted from the pointing device 10 in every specified cycle. The state reception section 21 transmits the received state signal to the foreground image generation section 22 and the background image generation section 23. - The foreground
image generation section 22 generates a foreground image of the operation screen that includes: the plurality of selection targets (a plurality of target components such as the icons), the selection operation of which can be performed by using the pointing device 10; and a cursor for selecting the one selection target from the plurality of selection targets. The cursor indicates the selected target component by emphatically displaying the target component such as the icon. For example, the cursor displays the selected target component in a mode of increasing luminance of the selected target component to be higher than that of the other target components, a mode of displaying the color of the selected target component in a different mode from the color of the other target components, a mode of surrounding the selected target component with a frame, or the like. - The foreground
image generation section 22 determines a cursor position (that is, which of the plurality of target components is selected) on the basis of the state signal received from the state reception section 21. More specifically, the foreground image generation section 22 first computes the position on the operation screen that corresponds to the state signal. For example, the foreground image generation section 22 stores first corresponding relationship information in the storage section 29 in advance, the first corresponding relationship information making change amounts of the x-coordinate and the y-coordinate on the pointing device 10, which are based on the state signal, correlate with change amounts of the x-coordinate and the y-coordinate on the operation screen. In this way, on the basis of said first corresponding relationship information, the foreground image generation section 22 can compute the change amounts of the x-coordinate and the y-coordinate on the operation screen that correspond to an operation of the pointing device 10. Then, with initial values of the x-coordinate and the y-coordinate (representative coordinates indicative of an initial position of the cursor upon activation of the display apparatus 1) on the operation screen being references, the foreground image generation section 22 can successively update the x-coordinate and the y-coordinate on the operation screen in accordance with the state signal successively received from the pointing device 10. - Next, the foreground
image generation section 22 determines a position of the cursor in accordance with the computed position on the operation screen. For example, the foreground image generation section 22 stores second corresponding relationship information in the storage section 29 in advance, the second corresponding relationship information making the position of the cursor (for example, a centroid coordinate of the cursor) on the operation screen correlate with the position on the operation screen. In this way, on the basis of said second corresponding relationship information, the foreground image generation section 22 can determine the position (the x-coordinate and the y-coordinate) of the cursor on the operation screen. Then, the foreground image generation section 22 generates a foreground image in a mode of arranging the plurality of target components such as the icons at predetermined positions and arranging the cursor at the determined position on the operation screen (that is, on the one target component that is arranged at the same position as the determined position on the operation screen). - The background
image generation section 23 generates a background image of the operation screen that includes a graphic object or a graphic object group as a collection of a number of the graphic objects (individual objects). The graphic object or the graphic object group is arranged across a specified area (for example, an area that includes the area where the plurality of target components are arranged) on the operation screen. For example, the graphic object may be a curved-surface shape surface object that is arranged across the specified area on the operation screen and imitates a water surface. In addition, for example, the graphic object group may be a collection of a number of granular graphic objects (the individual objects) that are arranged across the specified area on the operation screen. - Similar to the foreground
image generation section 22, for example, based on the first corresponding relationship information, the background image generation section 23 computes the position on the operation screen that corresponds to the state signal received from the state reception section 21. - In addition, on the basis of the state signal received from the
state reception section 21, the background image generation section 23 determines whether the pointing device 10 (the recognition section 12) recognizes the operation finger. That is, in the case where the pointing device 10 is the touch pad, the background image generation section 23 determines whether the user's finger is in contact with the touch pad. In the case where the pointing device 10 is the gesture recognition device, the background image generation section 23 determines whether the user's finger is held in the operation space. - Then, when determining that the pointing device 10 (the recognition section 12) does not recognize the operation finger, the background
image generation section 23 generates the background image that includes the graphic object or the graphic object group in a predetermined display mode. For example, the background image generation section 23 may generate the background image that includes the graphic object or the graphic object group in the predetermined display mode in which a difference that visually attracts the user's attention is not included in the entire graphic object or the entire graphic object group. - On the other hand, when determining that the pointing device 10 (the recognition section 12) recognizes the operation finger, the background
image generation section 23 changes a display mode of a part of the graphic object to differ from a display mode of the other parts of the graphic object, or changes a display mode of a part of the graphic object group to differ from a display mode of the other parts of the graphic object group. More specifically, the background image generation section 23 defines third corresponding relationship information in advance and stores the third corresponding relationship information in the storage section 29, the third corresponding relationship information making the part of the graphic object or the part of the graphic object group in the background image correlate with the position on the operation screen. Based on said third corresponding relationship information, the background image generation section 23 determines the part of the graphic object or the part of the graphic object group in the background image that corresponds to the computed position on the operation screen. Then, the background image generation section 23 generates the background image for which the determined part of the graphic object has the different display mode from the other parts of the graphic object or for which the determined part of the graphic object group has the different display mode from the other parts of the graphic object group. The “different display mode” means a display mode with a difference that is easily recognizable by the user who looks at the operation screen.
The “different display mode” possibly includes a display mode with a different color, a display mode with different luminance, a display mode with different size or a different shape (more specifically, a display mode in which a shape or size of the individual object corresponding to the part of the graphic object group differs from a shape or size of each of the other individual objects), a display mode with a different amount of displacement (including an amount of displacement in a virtual three-dimensional space in the background image), and the like. The background image that is generated by the background image generation section 23 will be described in detail below. - The
composition processing section 24 executes processing of composing the foreground image, which is generated by the foreground image generation section 22, and the background image, which is generated by the background image generation section 23, to generate the operation screen. Then, the composition processing section 24 transmits a command signal that includes the generated operation screen to the display 30. In this way, the operation screen can be displayed on the display 30. - The
storage section 29 stores the first corresponding relationship information in advance, the first corresponding relationship information defining a corresponding relationship between a change amount of an x-y coordinate (the x-coordinate and the y-coordinate) on the pointing device 10 and a change amount of an x-y coordinate (the x-coordinate and the y-coordinate) on the operation screen. The storage section 29 also stores the second corresponding relationship information in advance, the second corresponding relationship information defining a corresponding relationship between the position (the x-coordinate and the y-coordinate) on the operation screen and the position of the cursor on the operation screen. The storage section 29 further stores the third corresponding relationship information in advance, the third corresponding relationship information making the part of the graphic object or the part of the graphic object group in the background image correlate with the position on the operation screen. On the operation screen, an x-axis direction is one example of the first direction, and a y-axis direction is one example of the second direction. - For example, the
display 30 is arranged at a position remote from the pointing device 10 and displays the operation screen that can be operated by the pointing device 10 in accordance with the command signal from the ECU 20 (the composition processing section 24). The display 30 is arranged at an appropriate position in the vehicle cabin, that is, at a position where the display 30 can easily and visually be recognized by the user (the driver). For example, as shown in FIG. 2, the display 30 may be arranged in a central section of an instrument panel. Alternatively, like a head-up display (HUD), the display 30 may be display means that directly displays the operation screen in the field of vision of the user. - Next, with reference to
FIG. 3A to FIG. 5C, a description will be made on one example of the operation screen that is generated by the display apparatus 1 according to this embodiment (more specifically, by the foreground image generation section 22, the background image generation section 23, and the composition processing section 24) and that is displayed on the display 30. -
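The first and second corresponding relationship information described above can be sketched as follows: pad coordinate changes are scaled into screen coordinate changes, and the cursor snaps to the icon whose centroid is nearest the computed position. The gain, screen size, and icon centroids are illustrative assumptions, not values from the disclosure.

```python
GAIN = 4.0                      # screen pixels per pad unit (assumed)
SCREEN_W, SCREEN_H = 1280, 480  # assumed screen resolution

ICON_CENTROIDS = {              # assumed centroids of icons 41 to 45
    41: (160, 300), 42: (400, 300), 43: (640, 300),
    44: (880, 300), 45: (1120, 300),
}

def update_position(pos, dx_pad, dy_pad):
    """First relationship: accumulate scaled pad deltas, clamped to the screen."""
    x = min(max(pos[0] + dx_pad * GAIN, 0), SCREEN_W - 1)
    y = min(max(pos[1] + dy_pad * GAIN, 0), SCREEN_H - 1)
    return (x, y)

def cursor_icon(pos):
    """Second relationship: select the icon nearest the screen position."""
    return min(ICON_CENTROIDS, key=lambda i:
               (ICON_CENTROIDS[i][0] - pos[0]) ** 2 +
               (ICON_CENTROIDS[i][1] - pos[1]) ** 2)
```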
FIG. 3A to FIG. 5C are views, each of which shows the one example of the operation screen by the display apparatus 1 (more specifically, the foreground image generation section 22, the background image generation section 23, and the composition processing section 24) according to this embodiment. FIG. 3A, FIG. 3B, and FIG. 3C respectively show a foreground image 40 a, a background image 40 b, and an operation screen 40 in a case where the pointing device 10 (the recognition section 12) does not recognize the operation finger. In addition, FIGS. 4A, 5A, FIGS. 4B, 5B, and FIGS. 4C, 5C respectively show the foreground image 40 a, the background image 40 b, and the operation screen 40 in a case where the pointing device 10 (the recognition section 12) recognizes the operation finger. More specifically, FIG. 4A, FIG. 4B, and FIG. 4C respectively show the foreground image 40 a, the background image 40 b, and the operation screen 40 at a time when a state shown in FIG. 3A, FIG. 3B, and FIG. 3C (a state where the pointing device 10 (the recognition section 12) does not recognize the operation finger) is shifted to a state where the pointing device 10 (the recognition section 12) recognizes the operation finger. FIG. 5A, FIG. 5B, and FIG. 5C respectively show the foreground image 40 a, the background image 40 b, and the operation screen 40 after the operation finger that is recognized by the recognition section 12 starts being moved from the state shown in FIG. 4A, FIG. 4B, and FIG. 4C. - First, with reference to
FIGS. 3A to 3C, a description will be made on the operation screen 40 in the case where the pointing device 10 (the recognition section 12) does not recognize the operation finger. - As shown in
FIG. 3A, the foreground image 40 a includes: icons 41 to 45 as the selection targets; and a cursor 46 that selects any one of the icons 41 to 45. The icon 41 is a switch component that shifts to an operation screen for an air conditioner mounted on the vehicle. The icon 42 is a switch component that shifts to an operation screen for making a phone call. The icon 43 is a switch component that shifts to an operation screen for a navigation device mounted on the vehicle. The icon 44 is a switch component that shifts to an operation screen for an audio device mounted on the vehicle. The icon 45 is a switch component that shifts to an operation screen for making various types of settings of the display apparatus 1, for example. Positions of the icons 41 to 45 on the operation screen 40 are defined in advance, and the icons 41 to 45 are sequentially arranged from a left end to a right end at a slightly lower position than the center in a vertical direction of the foreground image 40 a (that is, the operation screen 40). In order to emphasize a state where any one of the icons 41 to 45 is selected, a mode of surrounding the selected one of the icons 41 to 45 with a frame is adopted for the cursor 46 in this example. - On the basis of the state signal received from the
state reception section 21, the foreground image generation section 22 determines a position of the cursor 46 on the foreground image 40 a (that is, the operation screen 40). Then, the foreground image generation section 22 arranges the icons 41 to 45 at the predetermined positions and generates the foreground image 40 a of the operation screen 40 on which the cursor 46 is arranged at the determined position (more specifically, the same position as one of the icons 41 to 45). Because the pointing device 10 (the recognition section 12) does not recognize the operation finger in the state shown in FIGS. 3A to 3C, the position of the cursor 46 corresponds to a position at termination of the last moving operation of the cursor 46 or the predetermined initial position of the cursor 46. In this example, as shown in FIG. 3A, the cursor 46 is arranged at the same position as the icon 41 (that is, on the icon 41) and is in a state of selecting the icon 41. - As shown in
FIG. 3B, the background image 40 b includes a graphic object group (hereinafter referred to as an “object group”) 47 that is configured by including a large number of the granular graphic objects (hereinafter simply referred to as “grains”). The object group 47 is arranged in a mode of arranging the grains as components in a lateral line across a specified area on the operation screen 40, more specifically, a moving area of the cursor 46 in a lateral direction of the operation screen 40 at a lower end of the background image 40 b (that is, the operation screen 40). When determining that the pointing device 10 (the recognition section 12) does not recognize the operation finger, the background image generation section 23 generates the background image 40 b that includes the object group 47 in the predetermined display mode, that is, in this example, the display mode in which the grains are arranged in the lateral line (the object group 47 is arranged along the lateral direction). - Then, the
composition processing section 24 composes the foreground image 40 a in FIG. 3A and the background image 40 b in FIG. 3B to generate the operation screen 40. As shown in FIG. 3C, the operation screen 40 includes: the icons 41 to 45 that are arranged in line from the left end to the right end at the slightly lower position than the center in the vertical direction; and the cursor 46 that is arranged at the same position as the icon 41 (that is, on the icon 41). The operation screen 40 also includes the object group 47 that is configured by including the grains arranged in line from the left end to the right end at a position further below the icons 41 to 45 and the cursor 46. - Next, with reference to
FIGS. 4A to 4C, a description will be made on the operation screen 40 at a time when the state where the pointing device 10 (the recognition section 12) does not recognize the operation finger is shifted to a state where the pointing device 10 (the recognition section 12) recognizes the operation finger. - As shown in
FIG. 4A, when the state of the operation screen 40 shown in FIG. 3C is shifted to the state where the pointing device 10 (the recognition section 12) recognizes the operation finger, the position of the cursor 46 is not changed. Thus, the foreground image generation section 22 generates the same foreground image 40 a as in FIG. 3A. - When determining that the pointing device 10 (the recognition section 12) recognizes the operation finger on the basis of the state signal from the
state reception section 21, as shown in FIG. 4B, the background image generation section 23 generates the background image 40 b in which grains 47 a as a part of the object group 47, which is correlated with the position on the operation screen 40 corresponding to the state signal, are in a different display mode from the grains of the other part of the object group 47. More specifically, the grains 47 a are significantly displaced in the vertical direction with respect to the other grains in the object group 47. - Then, the
composition processing section 24 composes the foreground image 40 a in FIG. 4A and the background image 40 b in FIG. 4B to generate the operation screen 40. As shown in FIG. 4C, the operation screen 40 includes: the icons 41 to 45 that are arranged in line from the left end to the right end at the slightly lower position than the center in the vertical direction; and the cursor 46 that is arranged at the same position as the icon 41 (that is, on the icon 41). The operation screen 40 also includes the object group 47 that is arranged from the left end to the right end in the mode in which the grains 47 a are significantly displaced in the vertical direction with respect to the other grains in the object group 47 at the position further below the icons 41 to 45 and the cursor 46. In this example, the grains 47 a are configured by including five grains centered on a lateral position on the operation screen 40 that corresponds to the state signal from the state reception section 21 (the pointing device 10), and are displayed in a mode of partially overlapping the icon 41, which is selected by the cursor 46, in a background. In addition, an amount of displacement in the vertical direction differs for each of the grains included in the grains 47 a. More specifically, of the grains 47 a, the grain closest to the lateral position on the operation screen 40, which corresponds to the state signal, has the largest amount of displacement in the vertical direction, and the amount of displacement of a grain in the vertical direction is gradually reduced as the grain separates from that lateral position. - As described above, when the pointing device 10 (the recognition section 12) recognizes the operation finger, the part (the
grains 47 a) of the object group 47 is set in the different display mode from the other part of the object group 47. More specifically, the grains 47 a of the object group 47 correlated with the position on the operation screen 40, which corresponds to the state signal from the pointing device 10 (in other words, the position of the cursor 46), are set in the different display mode from the other grains in the object group 47. In this way, the user can easily comprehend the operation state on the operation screen 40. For example, there is a case where the operation finger of the user possibly leaves the operation section 11 due to a vibration or the like of the vehicle. At this time, the user recognizes that the grains 47 a are displaced (raised) in the vertical direction, and thus can easily determine whether the operation screen 40 can be operated by the pointing device 10 (more specifically, whether an operable state continues). In addition, the user visually recognizes that the grains 47 a are displaced (raised) in the vertical direction at substantially the same lateral position as the cursor 46 on the operation screen 40, and thus can easily comprehend (an indication of) the position of the cursor 46 at a start of the operation by using the pointing device 10. In particular, when the user is the driver of the vehicle, the user cannot gaze at the display 30. To handle this problem, the grains 47 a as the part of the object group 47 are raised in the vertical direction. In this way, without gazing at the display 30, the user can easily comprehend whether the operation screen 40 can be operated by the pointing device 10, the position of the cursor 46, and the like. In addition, the grains 47 a are displayed as the background of the icons 41 to 45 and the cursor 46. Thus, even when the grains 47 a are raised in the vertical direction, occurrence of a situation where visibility of the icons 41 to 45 and the cursor 46 worsens can be suppressed. - Next, with reference to
- Next, with reference to FIGS. 5A to 5C, a description will be made on the operation screen 40 after the operation finger, which is recognized by the recognition section 12, starts being moved.
- Note that when the state shown in FIGS. 4A to 4C is shifted to the state shown in FIGS. 5A to 5C, the user performs an operation of moving the cursor 46 in the right direction by moving the operation finger in the right direction on the operation section 11.
- As shown in
FIG. 5A, the position of the cursor 46 is updated from the state shown in FIG. 4A along with the movement of the operation finger on the operation section 11. More specifically, based on the state signal from the state reception section 21, the foreground image generation section 22 determines (changes) the position of the cursor 46 to the same position as the icon 42 (that is, on the icon 42) on the immediate right side of the icon 41, and generates the foreground image 40a shown in FIG. 5A.
- Similar to the case of
FIG. 4B, when determining that the pointing device 10 (the recognition section 12) recognizes the operation finger on the basis of the state signal from the state reception section 21, as shown in FIG. 5B, the background image generation section 23 generates the background image 40b in which grains 47b, as a part of the object group 47 correlated with the position on the operation screen 40 corresponding to the state signal, are in a different display mode from the other grains of the object group 47. At this time, the grains constituting the part of the object group 47 that is raised in the vertical direction are shifted from the grains 47a to the grains 47b in accordance with the movement of the operation finger on the operation section 11. More specifically, the grains 47b are arranged on the right side of the grains 47a, and the grains constituting the raised part of the object group 47 are shifted in the right direction along the moving direction of the operation finger on the operation section 11. That is, the third corresponding relationship information stored in the above-described storage section 29 is set such that the grains constituting the raised part of the object group 47 are shifted in the same direction by following the movement of the position on the operation screen 40 that corresponds to the state signal from the state reception section 21.
- Then, the
composition processing section 24 composes the foreground image 40a in FIG. 5A and the background image 40b in FIG. 5B to generate the operation screen 40. As shown in FIG. 5C, the operation screen 40 includes: the icons 41 to 45 that are arranged in line from the left end to the right end (in the lateral direction) at a position slightly below the center in the vertical direction; and the cursor 46 that is arranged at the same position as the icon 42 (that is, on the icon 42). The operation screen 40 also includes the object group 47, arranged from the left end to the right end at the position further below the icons 41 to 45 and the cursor 46, in the mode in which the grains 47b are significantly displaced in the vertical direction with respect to the other grains. In this example, the grains 47b consist of the five grains centered on the lateral position on the operation screen 40 that corresponds to the state signal from the state reception section 21 (the pointing device 10), and are displayed in a mode of partially overlapping the icon 42, which is selected by the cursor 46, in the background.
- As described above, when the operation finger, which is recognized by the pointing device 10 (the recognition section 12), is moved on the operation section 11, the part of the object group 47 in the different display mode is moved along the moving direction of the operation finger. More specifically, the grains (the grains 47a, 47b) that constitute the part of the object group 47 displayed in the different display mode (the mode of being raised in the vertical direction) are shifted in the same direction (an outlined arrow in FIG. 5C) by following the movement of the position on the operation screen 40 that corresponds to the state signal from the state reception section 21 (the pointing device 10). For example, the operation finger may leave the operation section 11 due to a vibration or the like of the vehicle while the operation finger is moved on the operation section 11. In such a case, the user visually recognizes whether the part of the object group 47 in the different display mode is moved (shifted) along the moving direction of the operation finger, and thus can easily determine whether the moving operation of the cursor 46 by the pointing device 10 can appropriately be continued. In addition, by recognizing the direction in which the grains constituting that part of the object group 47 are shifted, the user can easily comprehend the moving direction of the cursor 46 on the operation screen 40.
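The follow behavior of the raised part (the role played by the third corresponding relationship information) can be sketched as a per-frame lookup from the state-signal position to the indices of the raised grains. The function name, the grid model, and the fixed five-grain window are assumptions for illustration, not the patent's implementation:

```python
def raised_window(num_grains, grain_spacing, signal_x, width=5):
    """Indices of the grains currently raised.

    Re-evaluating this for every new state signal makes the raised part of
    the object group follow the finger: as the position on the operation
    screen moves right, the window of raised grains shifts right with it.
    """
    center = round(signal_x / grain_spacing)   # grain nearest the signal position
    half = width // 2
    lo = max(0, center - half)
    hi = min(num_grains, center + half + 1)
    return list(range(lo, hi))
```

Moving the state-signal position from x = 300 to x = 340 (with grains spaced 10 units apart) shifts the window from grains 28 to 32 to grains 32 to 36, that is, in the finger's moving direction.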
- Note that the object group 47 configured by arranging a number of the grains in the lateral direction is used in this example; however, the grains of the object group 47 may be joined to create one graphic object. In such a case, when the state where the pointing device 10 (the recognition section 12) does not recognize the operation finger is shifted to the state where the pointing device 10 (the recognition section 12) recognizes the operation finger, a part of the graphic object that corresponds to the grains 47a is raised in the vertical direction. Then, in accordance with the movement of the operation finger, which is recognized by the pointing device 10, the part of the graphic object raised in the vertical direction is shifted from the part that corresponds to the grains 47a to the part that corresponds to the grains 47b (that is, moved along the moving direction of the operation finger). Also in such a modified example, the same operational effects as those in this example can be realized.
- In addition, the mode of significantly displacing (raising in the vertical direction) the part (the
grains 47a, 47b) of the object group 47 is merely one example of the "different display mode." With reference to FIG. 6A to FIG. 8C, a description will hereinafter be made on an operation screen 50 that includes another example of the "different display mode."
-
FIG. 6A to FIG. 8C are views that show another example of the operation screen by the display apparatus 1 (more specifically, the foreground image generation section 22, the background image generation section 23, and the composition processing section 24) according to this embodiment. FIG. 6A, FIG. 6B, and FIG. 6C respectively show a foreground image 50a, a background image 50b, and the operation screen 50 in the case where the pointing device 10 (the recognition section 12) does not recognize the operation finger. In addition, FIGS. 7A, 8A, FIGS. 7B, 8B, and FIGS. 7C, 8C respectively show the foreground image 50a, the background image 50b, and the operation screen 50 in the case where the pointing device 10 (the recognition section 12) recognizes the operation finger. More specifically, FIG. 7A, FIG. 7B, and FIG. 7C respectively show the foreground image 50a, the background image 50b, and the operation screen 50 at a time when the state shown in FIG. 6A, FIG. 6B, and FIG. 6C (the state where the pointing device 10 (the recognition section 12) does not recognize the operation finger) is shifted to the state where the pointing device 10 (the recognition section 12) recognizes the operation finger. FIG. 8A, FIG. 8B, and FIG. 8C respectively show the foreground image 50a, the background image 50b, and the operation screen 50 after the operation finger that is recognized by the recognition section 12 starts being moved from the state shown in FIG. 7A, FIG. 7B, and FIG. 7C.
- First, with reference to
FIGS. 6A to 6C, a description will be made on the operation screen 50 in a case where a moving operation of a cursor 56 is not performed. As shown in FIG. 6A, the foreground image 50a includes: songs 51 to 55 as the selection targets; and the cursor 56 that selects one of the songs 51 to 55. When the cursor 56 selects one of the songs 51 to 55 by using the pointing device 10, that song can be played from a specified audio source (for example, a CD or the like) in audio equipment that is mounted on the vehicle. Positions of the songs 51 to 55 on the operation screen 50 are defined in advance, and the songs are sequentially arranged from the upper end to the lower end within an area in the lateral direction that excludes the left end of the foreground image 50a (that is, the operation screen 50). In order to emphasize the state where one of the songs 51 to 55 is selected, a mode of surrounding the selected song with a frame is adopted for the cursor 56 in this example.
- On the basis of the state signal received from the
state reception section 21, the foreground image generation section 22 determines the position of the cursor 56 on the foreground image 50a (that is, the operation screen 50). Then, the foreground image generation section 22 arranges the songs 51 to 55 at the predetermined positions and generates the foreground image 50a of the operation screen 50 on which the cursor 56 is arranged at the determined position (more specifically, at the same position as one of the songs 51 to 55). Because the pointing device 10 (the recognition section 12) does not recognize the operation finger in the state shown in FIGS. 6A to 6C, the position of the cursor 56 corresponds to the position at termination of the last moving operation of the cursor 56 or a predetermined initial position of the cursor 56. In this example, as shown in FIG. 6A, the cursor 56 is arranged at the same position as the song 52 (that is, on the song 52) and is in a state of selecting the song 52.
- As shown in
FIG. 6B, the background image 50b includes a graphic object group (hereinafter referred to as an "object group") 57 that is configured by including a large number of the granular graphic objects (hereinafter simply referred to as "grains"). The object group 57 is arranged such that the grains as its components are placed at equally spaced intervals from right to left and top to bottom across a specified area on the background image 50b (that is, the operation screen 50), more specifically, the entire area of the operation screen 50 in the vertical direction and the lateral direction. When determining that the pointing device 10 (the recognition section 12) does not recognize the operation finger, the background image generation section 23 generates the background image 50b that includes the object group 57 in the predetermined display mode, that is, in this example, the display mode in which the grains in the same color are arranged at the equally spaced intervals from right to left and top to bottom.
- Then, the
composition processing section 24 composes the foreground image 50a in FIG. 6A and the background image 50b in FIG. 6B to generate the operation screen 50. As shown in FIG. 6C, the operation screen 50 includes: the songs 51 to 55 that are arranged in line in the vertical direction; and the cursor 56 that is arranged at the same position as the song 52 (that is, on the song 52). The operation screen 50 also includes, in the background, the object group 57 configured by including the grains in the same color, which are arranged at the equally spaced intervals from right to left and top to bottom across the entire area.
- Next, a description will be made on the operation screen 50 at a time when the state where the pointing device 10 (the recognition section 12) does not recognize the operation finger is shifted to the state where the pointing device 10 (the recognition section 12) recognizes the operation finger, with reference to FIGS. 7A to 7C.
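The composition performed by the composition processing section 24 can be illustrated with a minimal over-composite: foreground pixels (the selection targets and the cursor) win wherever they exist, and the background grains show through everywhere else. The flat pixel-list representation with `None` marking transparency is a simplification for illustration, not the patent's implementation:

```python
def compose(foreground, background):
    """Compose a foreground image over a background image.

    Both images are flat lists of pixels; None marks a transparent
    foreground pixel, through which the background (the grain object
    group) remains visible.
    """
    assert len(foreground) == len(background)
    return [bg if fg is None else fg for fg, bg in zip(foreground, background)]
```

For example, `compose(["cursor", None, "song"], ["grain"] * 3)` keeps a grain only where the foreground is transparent, so the grains never cover the songs or the cursor.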
- As shown in
FIG. 7A, when the state of the operation screen 50 shown in FIG. 6C is shifted to the state where the pointing device 10 (the recognition section 12) recognizes the operation finger, the position of the cursor 56 is not changed. Thus, the foreground image generation section 22 generates the same foreground image 50a as in FIG. 6A.
- When determining that the pointing device 10 (the recognition section 12) recognizes the operation finger on the basis of the state signal from the
state reception section 21, as shown in FIG. 7B, the background image generation section 23 generates the background image 50b in which grains 57a, as a part of the object group 57 correlated with the position on the operation screen 50 corresponding to the state signal, are in a different display mode from the grains of the other part of the object group 57. More specifically, of the grains included in the object group 57, the grains 57a are the grains in two rows near the vertical position on the operation screen 50 that corresponds to the state signal, and are displayed in a different display mode (a different color) from the other grains in the object group 57.
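The selection of which rows change color can be sketched as follows; the two-row window near the state-signal vertical position comes from the example above, while the function name and the color values are illustrative assumptions:

```python
def row_colors(row_ys, signal_y, base="gray", highlight="white"):
    """Color per grain row of the object group.

    The two rows nearest the vertical position that corresponds to the
    state signal are highlighted; the remaining rows keep the base color.
    """
    order = sorted(range(len(row_ys)), key=lambda r: abs(row_ys[r] - signal_y))
    near = set(order[:2])  # the two rows closest to the signal position
    return [highlight if r in near else base for r in range(len(row_ys))]
```

Re-evaluating this as the state-signal position moves down reproduces the shift of the differently colored rows described for FIGS. 8A to 8C.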
- Then, the composition processing section 24 composes the foreground image 50a in FIG. 7A and the background image 50b in FIG. 7B to generate the operation screen 50. As shown in FIG. 7C, the operation screen 50 includes: the songs 51 to 55 that are arranged in line in the vertical direction; and the cursor 56 that is arranged at the same position as the song 52 (that is, on the song 52). The operation screen 50 also includes, in the background, the object group 57 configured by including the grains that are arranged at the equally spaced intervals from right to left and top to bottom across the entire area. The color of the grains 57a, which constitute the part of the object group 57, is displayed in a different mode from the color of the grains constituting the other part of the object group 57. In this example, the grains 57a consist of the grains in the two rows centered on the vertical position on the operation screen 50 that corresponds to the state signal from the state reception section 21 (the pointing device 10), and are displayed in a mode of partially overlapping the song 52, which is selected by the cursor 56, in the background.
- As described above, in the case where the pointing device 10 (the recognition section 12) recognizes the operation finger, the part (the
grains 57a) of the object group 57 is set in a different display mode from the other part of the object group 57. More specifically, the grains 57a of the object group 57 correlated with the position on the operation screen 50 that corresponds to the state signal from the pointing device 10 are set in a different display mode from the other grains in the object group 57. In this way, the same operational effects as those in the example shown in FIG. 3A to FIG. 5C can be realized. More specifically, the user recognizes the color change of the grains 57a as the part of the object group 57, and thus can easily determine whether the operation screen 50 can be operated by the pointing device 10 (more specifically, whether the operable state continues). In addition, by recognizing the color change of the grains 57a at substantially the same vertical position as the cursor 56 on the operation screen 50, the user can easily comprehend (an indication of) the position of the cursor 56 at the start of the operation using the pointing device 10. In addition, the grains 57a are displayed as the background of the songs 51 to 55 and the cursor 56. Thus, even when the color of the grains 57a is changed, a situation in which visibility of the songs 51 to 55 and the cursor 56 worsens can be suppressed.
- Next, a description will be made on the
operation screen 50 after the operation finger, which is recognized by the recognition section 12, starts being moved, with reference to FIGS. 8A to 8C.
- Note that when the state shown in FIGS. 7A to 7C is shifted to the state shown in FIGS. 8A to 8C, the user performs an operation of moving the cursor 56 downward by moving the operation finger downward on the operation section 11.
- As shown in
FIG. 8A, the position of the cursor 56 is updated from the state shown in FIG. 7A along with the movement of the operation finger on the operation section 11. More specifically, based on the state signal from the state reception section 21, the foreground image generation section 22 determines (changes) the position of the cursor 56 to the same position as the song 53 (that is, on the song 53) on the immediate lower side of the song 52, and generates the foreground image 50a shown in FIG. 8A.
- Similar to the case of
FIG. 7B, when determining that the pointing device 10 (the recognition section 12) recognizes the operation finger on the basis of the state signal from the state reception section 21, as shown in FIG. 8B, the background image generation section 23 generates the background image 50b in which grains 57b, as a part of the object group 57 correlated with the position on the operation screen 50 corresponding to the state signal, are in a different display mode from the other grains of the object group 57. At this time, the grains that constitute the part of the object group 57 in the different color are shifted from the grains 57a to the grains 57b in accordance with the movement of the operation finger on the operation section 11. More specifically, the grains 57b are arranged on the lower side of the grains 57a, and the grains that constitute the part of the object group 57 in the different color are shifted downward along the moving direction of the operation finger on the operation section 11. That is, the third corresponding relationship information stored in the above-described storage section 29 is set such that the grains constituting the part of the object group 57 in the different color are shifted in the same direction by following the movement of the position on the operation screen 50 that corresponds to the state signal from the state reception section 21.
- Then, the
composition processing section 24 composes the foreground image 50a in FIG. 8A and the background image 50b in FIG. 8B to generate the operation screen 50. As shown in FIG. 8C, the operation screen 50 includes: the songs 51 to 55 that are arranged in line in the vertical direction; and the cursor 56 that is arranged at the same position as the song 53 (that is, on the song 53). The operation screen 50 also includes, in the background, the object group 57 configured by including the grains that are arranged at the equally spaced intervals from right to left and top to bottom across the entire area, and the color of the grains 57b as the part of the object group 57 is displayed in a different mode from the color of the other grains. In this example, the grains 57b consist of the grains in the two rows centered on the vertical position on the operation screen 50 that corresponds to the state signal from the state reception section 21 (the pointing device 10), and are displayed in a mode of partially overlapping the song 53, which is selected by the cursor 56, in the background.
- As described above, when the operation finger, which is recognized by the pointing device 10 (the recognition section 12), is moved on the
operation section 11, the part of the object group 57 in the different display mode is moved along the moving direction of the operation finger. More specifically, the grains (the grains 57a, 57b) that constitute the part of the object group 57 displayed in the different display mode (the mode of the different color from the other grains) are shifted in the same direction (an outlined arrow in FIG. 8C) by following the movement of the position on the operation screen 50 that corresponds to the state signal from the state reception section 21 (the pointing device 10). Thus, the same operational effects as those in the example shown in FIG. 3A to FIG. 5C can be realized. More specifically, the user visually recognizes whether the part of the object group 57 in the different display mode is moved (shifted) along the moving direction of the operation finger, and thus can easily determine whether the moving operation of the cursor 56 by the pointing device 10 can appropriately be continued. In addition, the user recognizes the direction in which the part of the object group 57 in the different display mode is shifted, and can thereby easily comprehend the moving direction of the cursor 56 on the operation screen 50.
- Note that the
object group 57 configured by arranging a number of the grains at the equally spaced intervals from right to left and top to bottom is used in this example; however, a planar graphic object (a plane object) that covers the area where the grains of the object group 57 are arranged may be used, for example. In such a case, when the state where the pointing device 10 (the recognition section 12) does not recognize the operation finger is shifted to the state where the pointing device 10 (the recognition section 12) recognizes the operation finger, the color of a part of the plane object that corresponds to the grains 57a is changed. Then, in accordance with the movement of the operation finger, which is recognized by the pointing device 10, the part of the plane object in the different color is shifted from the part that corresponds to the grains 57a to the part that corresponds to the grains 57b. Also in such a modified example, the same operational effects as those in this example can be realized.
- In addition, the graphic object group that is arranged in two dimensions is used in this example. However, a virtual three-dimensional space may be set in the
background image 50b, and a graphic object group that is arranged in said three-dimensional space may be used. A description will hereinafter be made on the operation screen that includes the graphic object group arranged in the virtual three-dimensional space with reference to FIGS. 9A to 9C.
-
FIGS. 9A to 9C show yet another example of the operation screen by the display apparatus 1 (more specifically, the foreground image generation section 22, the background image generation section 23, and the composition processing section 24) according to this embodiment. FIG. 9A shows an operation screen 60 in the case where the pointing device 10 (the recognition section 12) does not recognize the operation finger. FIGS. 9B, 9C show the operation screen 60 in the case where the pointing device 10 (the recognition section 12) recognizes the operation finger. More specifically, FIG. 9B shows the operation screen 60 at a time when the state shown in FIG. 9A (the state where the pointing device 10 (the recognition section 12) does not recognize the operation finger) is shifted to the state where the pointing device 10 (the recognition section 12) recognizes the operation finger. FIG. 9C shows the operation screen 60 after the operation finger, which is recognized by the recognition section 12, starts being moved from the state shown in FIG. 9B.
- Note that, when the state shown in
FIG. 9B is shifted to the state shown in FIG. 9C, the user performs an operation of moving a cursor 68 in the right direction by moving the operation finger in the right direction on the operation section 11.
- As shown in
FIG. 9A, the operation screen 60 includes, as components of a foreground image: icons 61 to 67 as the selection targets; and the cursor 68 that selects one of the icons 61 to 67. The icons 61 to 67 are sequentially arranged from left to right at the lower end of the operation screen 60. Because the pointing device 10 (the recognition section 12) does not recognize the operation finger in the state shown in FIG. 9A, the position of the cursor 68 corresponds to the position at termination of the last moving operation of the cursor 68 or a predetermined initial position of the cursor 68. In this example, the cursor 68 is arranged at the same position as the icon 64 (that is, on the icon 64) and is in a state of selecting the icon 64.
- As shown in
FIG. 9A, the operation screen 60 includes, as a component of a background image, a graphic object group (hereinafter simply referred to as an "object group") 69 that is arranged in a virtual three-dimensional space on the operation screen 60. The object group 69 is configured by including a large number of the granular graphic objects (hereinafter simply referred to as the "grains") arranged in the virtual three-dimensional space in accordance with a specified rule, and is arranged (along the lateral direction) from the left end to the right end of the operation screen 60 (an area including the moving area of the cursor 68, which moves in the lateral direction).
- When the state shown in
FIG. 9A is shifted to the state where the pointing device 10 (the recognition section 12) recognizes the operation finger, as shown in FIG. 9B, grains 69a constituting a part of the object group 69, which is correlated with the position on the operation screen 60 corresponding to the state signal, are displayed in a different display mode from the other grains of the object group 69. More specifically, the grains 69a as the part of the object group 69 are displaced (raised) in a specified direction (the vertical direction) in the virtual three-dimensional space with respect to the grains of the other part of the object group 69, and are displayed in a mode of partially overlapping the icon 64, which is selected by the cursor 68, in the background. In this way, similar to the examples shown in FIG. 3A to FIG. 5C and FIG. 6A to FIG. 8C, the user can easily comprehend the operation state on the operation screen 60. That is, the user recognizes the raise in the grains 69a, which constitute the part of the object group 69, and thus can easily comprehend whether the operation screen 60 can be operated by the pointing device 10, the position of the cursor 68, and the like.
- When the operation finger, which is recognized by the
recognition section 12, starts being moved from the state shown in FIG. 9B, as shown in FIG. 9C, the cursor 68 is moved in the right direction along with the movement of the operation finger on the operation section 11 and is arranged at the same position as the icon 65 (that is, on the icon 65). In addition, because the position on the operation screen 60 that corresponds to the state signal is moved in the right direction in accordance with the movement of the operation finger on the operation section 11, the grains that constitute the part of the object group 69 in the different display mode are shifted from the grains 69a to grains 69b. That is, the grains that constitute the raised part of the object group 69 are shifted in the right direction along the moving direction of the operation finger on the operation section 11. In this way, the same operational effects as those in the examples shown in FIG. 3A to FIG. 5C and FIG. 6A to FIG. 8C can be realized. More specifically, the user visually recognizes whether the part of the object group 69 in the different display mode is moved (shifted) along the moving direction of the operation finger, and thus can easily determine whether the moving operation of the cursor 68 by the pointing device 10 can appropriately be continued. In addition, the user recognizes the direction in which the grains constituting the part in the different display mode are shifted, and can thereby easily comprehend the moving direction of the cursor 68 on the operation screen 60.
- In this example, the
object group 69 is arranged in the virtual three-dimensional space, and a three-dimensional design effect of the operation screen 60 can thereby be produced.
- Note that the
object group 69, which is configured by arranging a large number of the grains in the virtual three-dimensional space, is used in this example; however, one graphic object (a surface) that is formed by joining the grains of the object group 69 may be used, for example. In such a case, when the state where the pointing device 10 (the recognition section 12) does not recognize the operation finger is shifted to the state where the pointing device 10 (the recognition section 12) recognizes the operation finger, a part of the surface that corresponds to the grains 69a is raised in the virtual three-dimensional space. Then, in accordance with the movement of the operation finger, which is recognized by the pointing device 10, the part of the surface raised in the virtual three-dimensional space is shifted from the part that corresponds to the grains 69a to the part that corresponds to the grains 69b. Also in such a modified example, the same operational effects as those in this example can be realized.
- Next, a description will be made on a second embodiment.
- A
display apparatus 1 according to this embodiment uses a pointing device 10 to display, on a display 30, an operation screen that includes a plurality of selection targets (a plurality of items constituting a scrollable list or the like) on which a selection operation can be performed. More specifically, the display apparatus 1 according to this embodiment differs from the display apparatus 1 according to the first embodiment in that the operation screen displayed on the display 30 includes the plurality of selection targets (the plurality of items constituting the list or the like) that are arranged along a specified axis on the operation screen and on which a scrolling operation can be performed in the direction of said specified axis by using the pointing device 10. Hereinafter, the description will be centered on the portions that differ from the first embodiment.
- A configuration of the
display apparatus 1 according to this embodiment is shown in FIG. 1, as in the first embodiment.
- An
ECU 20 is another example of the display control section that displays the operation screen on the display 30, and is an electronic control unit that executes processing of generating the operation screen. In accordance with the position of the operation finger on the operation section 11, which is recognized by the recognition section 12, the ECU 20 scrolls the plurality of selection targets (the plurality of items constituting the list or the like) that are arranged along the specified axis on the operation screen. Similar to the first embodiment, the ECU 20 includes a state reception section 21, a foreground image generation section 22, a background image generation section 23, a composition processing section 24, and a storage section 29.
- In accordance with the content of the scrolling operation that corresponds to the state signal from the state reception section 21 (the pointing device 10), the foreground
image generation section 22 determines the arrangement of the plurality of selection targets (the plurality of items constituting the list or the like) on the operation screen. Then, the foreground image generation section 22 generates a foreground image of the operation screen that includes at least a part of the plurality of selection targets. This is because the number of selection targets that can be selected by the scrolling operation using the pointing device 10 (the number of items constituting the list) is usually larger than the number of selection targets that can be displayed on the operation screen at once.
- More specifically, based on the state signal received from the
state reception section 21, the foreground image generation section 22 determines a scrolling amount, a scrolling direction, and the like of the list or the like (of the plurality of selection targets constituting it). For example, the foreground image generation section 22 confirms the content of the scrolling operation (the type of the scrolling operation or the like) on the basis of the state signal. For example, in the case where the pointing device 10 is a touch pad, a scrolling operation by "dragging" (tracing) and a scrolling operation by "flicking" are available as the types of the scrolling operation. The scrolling operation by "dragging" is an operation of moving the finger at a relatively low speed while the finger remains in contact with the operation surface of the touch pad. Meanwhile, the scrolling operation by "flicking" is an operation of moving the finger at a relatively high speed in a mode of snapping the operation surface of the touch pad with the finger in the direction of the scrolling operation. For example, based on a determination of whether the change amounts of the x-coordinate and the y-coordinate on the touch pad, which are based on the state signal received from the state reception section 21, are each equal to or larger than a specified threshold, the foreground image generation section 22 can determine whether the scrolling operation is by "dragging" or by "flicking". When determining that the scrolling operation is by "dragging", the foreground image generation section 22 determines the scrolling amount and the scrolling direction on the basis of the change amounts of the x-coordinate and the y-coordinate on the touch pad based on the state signal.
For example, the foreground image generation section 22 stores fourth corresponding relationship information in the storage section 29 in advance, the fourth corresponding relationship information correlating the scrolling amount and the scrolling direction with the change amounts of the x-coordinate and the y-coordinate on the touch pad. In this way, based on the fourth corresponding relationship information, the foreground image generation section 22 can determine the scrolling amount and the scrolling direction. Meanwhile, when determining that the scrolling operation is by "flicking", the foreground image generation section 22 determines a specified value, which is defined in advance for the scrolling operation by "flicking", as the scrolling amount, and determines the scrolling direction on the basis of the change amounts of the x-coordinate and the y-coordinate on the touch pad based on the state signal. - Then, the foreground
image generation section 22 determines the arrangement of the selection targets in accordance with the determined scrolling amount and the determined scrolling direction, and generates the foreground image of the operation screen that includes at least the part of the selection targets of the plurality of selection targets (the plurality of items constituting the list). - Note that, also in the case where the
pointing device 10 is a gesture recognition device, types corresponding to the scrolling operation by “dragging” and the scrolling operation by “flicking” on the touch pad can be provided in accordance with a speed of a gesture. - Similar to the first embodiment, the background
image generation section 23 generates a background image of the operation screen that includes a graphic object arranged across a specified area on the operation screen or a graphic object group as a collection of a large number of the graphic objects (individual objects). - Similar to the first embodiment, based on the state signal received from the
state reception section 21, the background image generation section 23 determines whether the pointing device 10 (the recognition section 12) recognizes the operation finger. - Similar to the first embodiment, when determining that the pointing device 10 (the recognition section 12) does not recognize the operation finger, the background
image generation section 23 generates the background image that includes the graphic object or the graphic object group in a predetermined display mode. Preferably, the background image generation section 23 may generate the background image that includes the graphic object or the graphic object group in a predetermined display mode in which a difference that visually attracts the user's attention is not included in the entire graphic object or the entire graphic object group. - On the other hand, similar to the first embodiment, when determining that the pointing device 10 (the recognition section 12) recognizes the operation finger, the background
image generation section 23 changes a display mode of a part of the graphic object to differ from a display mode of the other parts of the graphic object, or changes a display mode of a part of the graphic object group to differ from a display mode of the other parts of the graphic object group. Then, in accordance with the content of the scrolling operation (the type or the like of the scrolling operation), the background image generation section 23 shifts (moves) the part in the different display mode of the graphic object or the graphic object group in the same direction as the scrolling operation. - For example, similar to the foreground
image generation section 22, the background image generation section 23 confirms the content of the scrolling operation (the type or the like of the scrolling operation) on the basis of the state signal received from the state reception section 21. Then, when determining that the scrolling operation is by "dragging", the background image generation section 23 determines the scrolling amount and the scrolling direction on the basis of the fourth corresponding relationship information. On the other hand, when determining that the scrolling operation is by "flicking", similar to the foreground image generation section 22, the background image generation section 23 determines the specified value that is defined in advance for the scrolling operation by "flicking" as the scrolling amount, and determines the scrolling direction on the basis of the change amounts of the x-coordinate and the y-coordinate on the touch pad based on the state signal. Then, in accordance with the determined scrolling amount and the determined scrolling direction, the background image generation section 23 generates the background image in a mode in which the part in the different display mode of the graphic object or the graphic object group is shifted (moved) in the same direction as the scrolling operation in a period that corresponds to the content of the scrolling operation. The background image generated by the background image generation section 23 will be described in detail below. - Note that the "period that corresponds to the content of the scrolling operation" means a period that is defined in advance in accordance with the content of the scrolling operation (the type of the scrolling operation). For example, in the case of the scrolling operation by "dragging", the "period that corresponds to the content of the scrolling operation" may be a period in which the scrolling operation continues (that is, a period in which the
recognition section 12 recognizes the operation finger). Meanwhile, in the case of the scrolling operation by "flicking", because the finger contact period (that is, the period in which the recognition section 12 recognizes the operation finger) is short, a predetermined period is set for the scrolling operation by "flicking". For example, the plurality of operation targets (the plurality of items constituting the list) are scrolled at a relatively high speed in the scrolling operation by "flicking". Thus, the predetermined period may be a relatively short period. - Next, a description will be made on one example of the operation screen by the
display apparatus 1, more specifically, the foreground image generation section 22, the background image generation section 23, and the composition processing section 24 according to this embodiment with reference to FIG. 10A to FIG. 12C. -
FIG. 10A to FIG. 12C show the one example of the operation screen by the display apparatus 1 (more specifically, the foreground image generation section 22, the background image generation section 23, and the composition processing section 24) according to this embodiment. FIG. 10A, FIG. 10B, and FIG. 10C respectively show a foreground image 70a, a background image 70b, and an operation screen 70 in the case where the pointing device 10 (the recognition section 12) does not recognize the operation finger. In addition, FIGS. 11A, 12A, FIGS. 11B, 12B, and FIGS. 11C, 12C respectively show the foreground image 70a, the background image 70b, and the operation screen 70 in the case where the pointing device 10 (the recognition section 12) recognizes the operation finger. More specifically, FIG. 11A, FIG. 11B, and FIG. 11C respectively show the foreground image 70a, the background image 70b, and the operation screen 70 at a time when the state shown in FIG. 10A, FIG. 10B, and FIG. 10C (the state where the pointing device 10 (the recognition section 12) does not recognize the operation finger) is shifted to the state where the pointing device 10 (the recognition section 12) recognizes the operation finger. FIG. 12A, FIG. 12B, and FIG. 12C respectively show the foreground image 70a, the background image 70b, and the operation screen 70 after the operation finger, which is recognized by the recognition section 12, starts being moved from the state shown in FIG. 11A, FIG. 11B, and FIG. 11C. - Note that a song list 71 includes a plurality of
songs 71A to 71Z (26 songs), an example of a plurality of selection targets, that are more than the displayable songs (5 songs) on the operation screen 70. - First, a description will be made on the
operation screen 70 in the case where the pointing device 10 (the recognition section 12) does not recognize the operation finger with reference to FIGS. 10A to 10C. - As shown in
FIG. 10A, the foreground image 70a includes: the songs 71A to 71E as a part of the songs included in the song list 71; and a fixed cursor 76 that selects any one of the songs 71A to 71Z included in the song list 71. When the scrolling operation of the song list 71 is performed to position any one of the songs 71A to 71Z at the cursor 76, that song can be played from a specified audio source (for example, a CD or the like) in audio equipment that is mounted on the vehicle. The songs 71A to 71E are sequentially arranged from an upper end to a lower end within an area in a lateral direction that excludes a left end of the foreground image 70a (that is, the operation screen 70). In addition, the cursor 76 is fixed at the lower end within the area in the lateral direction that excludes the left end of the foreground image 70a (that is, the operation screen 70). In the state shown in FIG. 10A, the cursor 76 and the song 71E are arranged at the same position (the cursor 76 is arranged on the song 71E), and the song 71E is selected. - In addition, as shown in
FIG. 10B, the background image 70b includes a graphic object group (hereinafter referred to as an "object group") 77 that is configured by including a large number of granular graphic objects (hereinafter simply referred to as "grains"). The object group 77 is arranged such that the grains as its components are placed at equally spaced intervals from right to left and from top to bottom across a specified area on the background image 70b (that is, the operation screen 70), more specifically, an entire area of the operation screen 70 in a vertical direction and the lateral direction. When determining that the pointing device 10 (the recognition section 12) does not recognize the operation finger, the background image generation section 23 generates the background image 70b that includes the object group 77 in a predetermined display mode, that is, in this example, a display mode in which the grains in the same color are arranged at the equally spaced intervals from right to left and from top to bottom. - Then, the
composition processing section 24 composes the foreground image 70a in FIG. 10A and the background image 70b in FIG. 10B to generate the operation screen 70. As shown in FIG. 10C, the operation screen 70 includes: the songs 71A to 71E as the part of the song list 71 that is arranged in line in the vertical direction; and the cursor 76 that is arranged at the same position as the song 71E (that is, on the song 71E). The operation screen 70 also includes, in the background, the object group 77 configured by including the grains in the same color, which are arranged at the equally spaced intervals from right to left and from top to bottom across the entire area. - Next, a description will be made on the
operation screen 70 at a time when the state where the pointing device 10 (the recognition section 12) does not recognize the operation finger is shifted to the state where the pointing device 10 (the recognition section 12) recognizes the operation finger with reference to FIGS. 11A to 11C. - Note that the direction of the scrolling operation in this example, that is, the moving direction of the operation finger on the
operation section 11 is downward. - As shown in
FIG. 11A, when the state of the operation screen 70 shown in FIG. 10C is shifted to the state where the pointing device 10 (the recognition section 12) recognizes the operation finger, scrolling of the song list 71 is not yet started. Thus, the foreground image generation section 22 generates the same foreground image 70a as in FIG. 10A. - In addition, when determining that the pointing device 10 (the recognition section 12) recognizes the operation finger on the basis of the state signal from the
state reception section 21, as shown in FIG. 11B, the background image generation section 23 generates the background image 70b in which grains 77a that constitute a part of the object group 77 are in a different display mode from the other grains of the object group 77. More specifically, of the grains included in the object group 77, the grains 77a are the grains in two rows that are located at the upper end of the operation screen 70, and have a different color from the other grains of the object group 77. - Then, the
composition processing section 24 composes the foreground image 70a in FIG. 11A and the background image 70b in FIG. 11B to generate the operation screen 70. As shown in FIG. 11C, the operation screen 70 includes: the songs 71A to 71E as the part of the song list 71 that is arranged in line in the vertical direction; and the cursor 76 that is arranged at the same position as the song 71E (that is, on the song 71E). The operation screen 70 also includes, in the background, the object group 77 configured by including the grains, which are arranged at the equally spaced intervals from right to left and from top to bottom across the entire area, and the color of the grains 77a that constitute the part of the object group 77 is displayed in the different mode from the color of the other grains of the object group 77. In this example, the grains 77a are configured by including the grains in the two rows that are located at the upper end of the operation screen 70. - As described above, in the case where the pointing device 10 (the recognition section 12) recognizes the operation finger, the part (the
grains 77a) of the object group 77 is set in the different display mode from the other part (the other grains) of the object group 77. In this way, the user can more easily comprehend the operation state on the operation screen. More specifically, the user recognizes a color change of the grains 77a as the part of the object group 77, and thus can easily determine whether the operation screen 70 can be operated by the pointing device 10 (more specifically, whether the operable state continues). In addition, by recognizing the color change of the grains 77a as the part of the object group 77, the user can easily comprehend that the song list 71 is scrolled in accordance with the operation using the pointing device 10. - Next, a description will be made on the
operation screen 70 after the operation finger, which is recognized by the recognition section 12, starts being moved with reference to FIGS. 12A to 12C. - As shown in
FIG. 12A, the displayable songs are updated from the state shown in FIG. 11A in accordance with movement of the operation finger on the operation section 11. More specifically, when the song list 71 is scrolled downward, the foreground image 70a no longer includes some of the previously displayed songs and newly includes other songs of the song list 71. In the state shown in FIG. 12A, the cursor 76 and the song 71C are arranged at the same position (the cursor 76 is arranged on the song 71C), and the song 71C is selected. - In addition, as shown in
FIG. 12B, the background image generation section 23 generates the background image 70b in which grains 77b that constitute a part of the object group 77 are in a different display mode from the other grains of the object group 77. At this time, the grains as the part in the different color of the object group 77 are shifted from the grains 77a to the grains 77b in accordance with the movement of the operation finger on the operation section 11. More specifically, the grains 77b are arranged on a lower side of the grains 77a, and the grains that constitute the part in the different color of the object group 77 are shifted downward along the moving direction of the operation finger on the operation section 11. - Then, the
composition processing section 24 composes the foreground image 70a in FIG. 12A and the background image 70b in FIG. 12B to generate the operation screen 70. As shown in FIG. 12C, the operation screen 70 includes: the songs 71A to 71C as the part of the song list 71 that is arranged in line in the vertical direction; and the cursor 76 that is arranged at the same position as the song 71C (that is, on the song 71C). The operation screen 70 also includes, in the background, the object group 77 configured by including the grains, which are arranged at the equally spaced intervals from right to left and from top to bottom across the entire area, and the color of the grains 77b that constitute the part of the object group 77 is displayed in the different mode from the color of the other grains of the object group 77. In this example, the grains 77b are configured by including the grains in the two rows of the object group 77 that are located near the center in the vertical direction of the operation screen 70. - As described above, when the operation finger, which is recognized by the pointing device 10 (the recognition section 12), is moved on the
operation section 11, the grains (the grains 77a, 77b) that constitute the part of the object group 77 displayed in the different display mode (the mode of the different color from the other grains) are shifted along the moving direction (the down direction) of the operation finger (an outlined arrow in FIG. 12C). Accordingly, the user visually recognizes whether the part in the different display mode of the object group 77 is moved (shifted) along the moving direction of the operation finger, and thus can easily determine whether the scrolling operation of the song list 71 by the pointing device 10 can appropriately be continued. In addition, the user recognizes the direction in which the grains constituting the part displayed in the different display mode of the object group 77 displayed as the background are shifted, and can thereby easily comprehend the scrolling direction on the operation screen 70 in accordance with the operation using the pointing device 10. - Note that the
object group 77 configured by including a number of the grains, which are arranged at the equally spaced intervals from right to left and from top to bottom, is used in this example; however, a planar graphic object (a plane object) that covers the area where the grains of the object group 77 are arranged may be used, for example. In such a case, when the state where the pointing device 10 (the recognition section 12) does not recognize the operation finger is shifted to the state where the pointing device 10 (the recognition section 12) recognizes the operation finger, a color of a part of the plane object that corresponds to the grains 77a is changed. Then, in accordance with the movement of the operation finger, which is recognized by the pointing device 10, the part in the different color of the plane object is shifted from the part that corresponds to the grains 77a to a part that corresponds to the grains 77b. Also in such a modified example, the same operational effects as those in this example can be realized. - In addition, the graphic object group that is arranged in a plane on the operation screen is used in this example. However, a virtual three-dimensional space may be set in the
background image 70b, and a graphic object or a graphic object group that is arranged in said three-dimensional space may be used. Also in such a modified example, the same operational effects as those in this example can be realized. - The mode for carrying out the disclosure has been described in detail so far. However, the disclosure is not limited to such a particular embodiment, and various modifications and changes can be made thereto within the scope of the gist of the disclosure described in the claims.
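The grain-band behavior described for the object group 77 can be sketched as follows. This is an illustrative sketch, not code from the patent; the class, its field names, and the grid dimensions are hypothetical, and it models only the highlighted two-row band that appears when the operation finger is recognized and shifts along the scrolling direction.

```python
from dataclasses import dataclass

@dataclass
class GrainGrid:
    rows: int
    cols: int
    highlight_row: int = -1   # -1: operation finger not recognized, no highlight

    def colors(self):
        """Return a rows x cols grid of color labels; the highlighted band
        spans two rows starting at highlight_row (like grains 77a / 77b)."""
        band = {self.highlight_row, self.highlight_row + 1}
        return [["accent" if self.highlight_row >= 0 and r in band else "base"
                 for _ in range(self.cols)] for r in range(self.rows)]

    def shift(self, amount: int):
        """Move the highlighted band by `amount` rows (positive = downward),
        clamped so the two-row band stays on screen."""
        if self.highlight_row >= 0:
            self.highlight_row = max(0, min(self.rows - 2,
                                            self.highlight_row + amount))

grid = GrainGrid(rows=10, cols=16)
grid.highlight_row = 0        # finger recognized: top two rows change color
grid.shift(4)                 # downward scroll shifts the colored band down
```

Redrawing the grid after each `shift` call reproduces the effect of the differently colored band following the moving direction of the operation finger.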
Claims (6)
1. A vehicular display apparatus comprising:
a display;
a pointing device configured to recognize a position of an operation finger on the pointing device; and
a controller configured to
display an operation screen on the display, the operation screen including a plurality of selection targets, a selection operation of which is able to be performed by using the pointing device,
generate an image that includes a plurality of horizontally arranged graphics objects across a specified area on the operation screen,
determine a part of the plurality of graphics objects based on and corresponding to a position of the operation finger when the operation finger is recognized by the pointing device, and
execute processing of setting the part of the plurality of graphics objects in a different display mode from the other part of the plurality of graphics objects when the operation finger is recognized by the pointing device.
2. The vehicular display apparatus according to claim 1 , wherein
the controller is configured to generate an image in which the part in the different display mode of the plurality of graphics objects is moved along a moving direction of the operation finger when the operation finger, which is recognized by the pointing device, is moved with respect to the pointing device.
3. The vehicular display apparatus according to claim 1 , wherein
the controller is configured to generate the operation screen by composing a foreground image that includes the plurality of selection targets and a background image that includes the plurality of graphics objects.
4. The vehicular display apparatus according to claim 1 , wherein
the plurality of selection targets is arranged in line in a first direction on the operation screen,
the plurality of graphics objects is arranged along the first direction, and
the controller is configured to generate an image in which the part of the plurality of graphics objects is displaced in a second direction that crosses the first direction with respect to the other part of the plurality of graphics objects when the operation finger is recognized by the pointing device.
5. The vehicular display apparatus according to claim 4 , wherein
an amount of displacement of the plurality of graphics objects in the second direction is reduced as the plurality of graphics objects separate from a position on the operation screen in the first direction, the position corresponding to a state signal from the pointing device.
6. The vehicular display apparatus according to claim 1 , wherein
the plurality of graphics objects is arranged in a virtual three-dimensional space, and
the controller is configured to generate an image in which the part of the plurality of graphics objects is raised in a specified direction in the virtual three-dimensional space with respect to the other part of the plurality of graphics objects when the operation finger is recognized by the pointing device.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US16/394,403 US20190250776A1 (en) | 2016-04-07 | 2019-04-25 | Vehicular display apparatus |
Applications Claiming Priority (4)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
JP2016077516A JP6390657B2 (en) | 2016-04-07 | 2016-04-07 | Vehicle display device |
JP2016-077516 | 2016-04-07 | ||
US15/459,595 US10318118B2 (en) | 2016-04-07 | 2017-03-15 | Vehicular display apparatus |
US16/394,403 US20190250776A1 (en) | 2016-04-07 | 2019-04-25 | Vehicular display apparatus |
Related Parent Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US15/459,595 Continuation US10318118B2 (en) | 2016-04-07 | 2017-03-15 | Vehicular display apparatus |
Publications (1)
Publication Number | Publication Date |
---|---|
US20190250776A1 true US20190250776A1 (en) | 2019-08-15 |
Family
ID=59929726
Family Applications (2)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US15/459,595 Expired - Fee Related US10318118B2 (en) | 2016-04-07 | 2017-03-15 | Vehicular display apparatus |
US16/394,403 Abandoned US20190250776A1 (en) | 2016-04-07 | 2019-04-25 | Vehicular display apparatus |
Family Applications Before (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US15/459,595 Expired - Fee Related US10318118B2 (en) | 2016-04-07 | 2017-03-15 | Vehicular display apparatus |
Country Status (4)
Country | Link |
---|---|
US (2) | US10318118B2 (en) |
JP (1) | JP6390657B2 (en) |
CN (1) | CN107450719B (en) |
DE (1) | DE102017106578A1 (en) |
Families Citing this family (4)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP6390657B2 (en) * | 2016-04-07 | 2018-09-19 | トヨタ自動車株式会社 | Vehicle display device |
US10907727B2 (en) * | 2016-11-09 | 2021-02-02 | Arrival Limited | Gear selection system and method |
US20220410829A1 (en) * | 2021-01-06 | 2022-12-29 | Ssv Works, Inc. | Smart switch for vehicle systems |
DE102021208728A1 (en) | 2021-08-10 | 2023-02-16 | Volkswagen Aktiengesellschaft | Reduced operating device |
Family Cites Families (29)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JPH07160416A (en) | 1993-12-06 | 1995-06-23 | Canon Inc | Information processor |
US7434177B1 (en) * | 1999-12-20 | 2008-10-07 | Apple Inc. | User interface for providing consolidation and access |
JP4228781B2 (en) | 2003-05-23 | 2009-02-25 | 株式会社デンソー | In-vehicle device operation system |
JP4351599B2 (en) | 2004-09-03 | 2009-10-28 | パナソニック株式会社 | Input device |
JP4788455B2 (en) * | 2006-04-12 | 2011-10-05 | 株式会社デンソー | In-vehicle operation system |
US9024864B2 (en) * | 2007-06-12 | 2015-05-05 | Intel Corporation | User interface with software lensing for very long lists of content |
US8375336B2 (en) * | 2008-05-23 | 2013-02-12 | Microsoft Corporation | Panning content utilizing a drag operation |
DE102008032377A1 (en) * | 2008-07-09 | 2010-01-14 | Volkswagen Ag | Method for operating a control system for a vehicle and operating system for a vehicle |
US8315672B2 (en) * | 2008-12-01 | 2012-11-20 | Research In Motion Limited | Portable electronic device and method of controlling same |
KR20100069842A (en) * | 2008-12-17 | 2010-06-25 | 삼성전자주식회사 | Electronic apparatus implementing user interface and method thereof |
CN101770283B (en) * | 2009-01-05 | 2012-10-10 | 联想(北京)有限公司 | Method and computer for generating feedback effect for touch operation |
JP5364925B2 (en) * | 2009-02-27 | 2013-12-11 | 現代自動車株式会社 | Input device for in-vehicle equipment |
JP5461030B2 (en) * | 2009-03-02 | 2014-04-02 | アルパイン株式会社 | Input device |
DE102009019561A1 (en) * | 2009-04-30 | 2010-11-04 | Volkswagen Ag | Method for displaying information in a motor vehicle and display device |
KR20110047422A (en) * | 2009-10-30 | 2011-05-09 | 삼성전자주식회사 | Apparatus and method for providing list in a portable terminal |
DE112011101422T5 (en) * | 2010-04-21 | 2013-02-07 | Research In Motion Limited | A method of interacting with a scrollable area on a portable electronic device |
US8468465B2 (en) * | 2010-08-09 | 2013-06-18 | Apple Inc. | Two-dimensional slider control |
US8760424B2 (en) * | 2011-03-17 | 2014-06-24 | Intellitact Llc | Touch enhanced interface |
JP2013033343A (en) * | 2011-08-01 | 2013-02-14 | Toyota Motor Corp | Operation device for vehicle |
JP5790578B2 (en) * | 2012-04-10 | 2015-10-07 | 株式会社デンソー | Display system, display device, and operation device |
JP6095283B2 (en) * | 2012-06-07 | 2017-03-15 | キヤノン株式会社 | Information processing apparatus and control method thereof |
TWI483165B (en) * | 2012-09-21 | 2015-05-01 | Au Optronics Corp | Capacitive touch sensor structure and applications thereof |
EP2954110A4 (en) * | 2013-02-07 | 2016-10-05 | Electrolux Home Prod Inc | User control interface for an appliance, and associated method |
JP5858059B2 (en) * | 2013-04-02 | 2016-02-10 | 株式会社デンソー | Input device |
US9535574B2 (en) * | 2013-06-28 | 2017-01-03 | Jive Software, Inc. | Infinite scrolling a very large dataset |
FR3030070B1 (en) * | 2014-12-15 | 2018-02-02 | Dav | DEVICE AND CONTROL METHOD FOR MOTOR VEHICLE |
US20160321230A1 (en) * | 2015-04-29 | 2016-11-03 | Facebook, Inc. | Generating a data table |
US9874952B2 (en) * | 2015-06-11 | 2018-01-23 | Honda Motor Co., Ltd. | Vehicle user interface (UI) management |
JP6390657B2 (en) * | 2016-04-07 | 2018-09-19 | トヨタ自動車株式会社 | Vehicle display device |
-
2016
- 2016-04-07 JP JP2016077516A patent/JP6390657B2/en active Active
-
2017
- 2017-03-15 US US15/459,595 patent/US10318118B2/en not_active Expired - Fee Related
- 2017-03-28 DE DE102017106578.4A patent/DE102017106578A1/en active Pending
- 2017-04-05 CN CN201710217804.0A patent/CN107450719B/en active Active
-
2019
- 2019-04-25 US US16/394,403 patent/US20190250776A1/en not_active Abandoned
Also Published As
Publication number | Publication date |
---|---|
DE102017106578A1 (en) | 2017-10-12 |
JP6390657B2 (en) | 2018-09-19 |
US10318118B2 (en) | 2019-06-11 |
CN107450719B (en) | 2020-06-05 |
US20170293370A1 (en) | 2017-10-12 |
JP2017187987A (en) | 2017-10-12 |
CN107450719A (en) | 2017-12-08 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
US20190250776A1 (en) | Vehicular display apparatus | |
US9442619B2 (en) | Method and device for providing a user interface, in particular in a vehicle | |
US9176634B2 (en) | Operation device | |
US9511669B2 (en) | Vehicular input device and vehicular cockpit module | |
KR101575650B1 (en) | Terminal, vehicle having the same and method for controlling the same | |
CN107918504B (en) | Vehicle-mounted operating device | |
JP2015170282A (en) | Operation device for vehicle | |
JP6277786B2 (en) | Vehicle control device | |
US20180307405A1 (en) | Contextual vehicle user interface | |
CN110869882A (en) | Method for operating a display device for a motor vehicle and motor vehicle | |
JP2018195134A (en) | On-vehicle information processing system | |
US11221735B2 (en) | Vehicular control unit | |
JP5985829B2 (en) | Vehicle control device | |
JP6147357B2 (en) | Display control apparatus and display control method | |
JP2014172413A (en) | Operation support system, operation support method, and computer program | |
JP2018010472A (en) | In-vehicle electronic equipment operation device and in-vehicle electronic equipment operation method | |
US20180232115A1 (en) | In-vehicle input device and in-vehicle input device control method | |
WO2017188098A1 (en) | Vehicle-mounted information processing system | |
JP2018132824A (en) | Operation device | |
EP3352067B1 (en) | Vehicular input device and method of controlling vehicular input device | |
US20230249552A1 (en) | Control apparatus | |
JP2017187922A (en) | In-vehicle information processing system | |
JP7001368B2 (en) | Operation device | |
JP6739864B2 (en) | In-vehicle information processing system | |
JP5950851B2 (en) | Information display control device, information display device, and information display control method |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
STPP | Information on status: patent application and granting procedure in general |
Free format text: NON FINAL ACTION MAILED |
STPP | Information on status: patent application and granting procedure in general |
Free format text: RESPONSE TO NON-FINAL OFFICE ACTION ENTERED AND FORWARDED TO EXAMINER |
STPP | Information on status: patent application and granting procedure in general |
Free format text: FINAL REJECTION MAILED |
STCB | Information on status: application discontinuation |
Free format text: ABANDONED -- FAILURE TO RESPOND TO AN OFFICE ACTION |