WO2020179785A1 - Vehicle display device - Google Patents
Vehicle display device
- Publication number
- WO2020179785A1 (PCT/JP2020/008915)
- Authority
- WO
- WIPO (PCT)
Classifications
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B60—VEHICLES IN GENERAL
- B60K—ARRANGEMENT OR MOUNTING OF PROPULSION UNITS OR OF TRANSMISSIONS IN VEHICLES; ARRANGEMENT OR MOUNTING OF PLURAL DIVERSE PRIME-MOVERS IN VEHICLES; AUXILIARY DRIVES FOR VEHICLES; INSTRUMENTATION OR DASHBOARDS FOR VEHICLES; ARRANGEMENTS IN CONNECTION WITH COOLING, AIR INTAKE, GAS EXHAUST OR FUEL SUPPLY OF PROPULSION UNITS IN VEHICLES
- B60K35/00—Instruments specially adapted for vehicles; Arrangement of instruments in or on vehicles
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B60—VEHICLES IN GENERAL
- B60R—VEHICLES, VEHICLE FITTINGS, OR VEHICLE PARTS, NOT OTHERWISE PROVIDED FOR
- B60R16/00—Electric or fluid circuits specially adapted for vehicles and not otherwise provided for; Arrangement of elements of electric or fluid circuits specially adapted for vehicles and not otherwise provided for
- B60R16/02—Electric or fluid circuits specially adapted for vehicles and not otherwise provided for; Arrangement of elements of electric or fluid circuits specially adapted for vehicles and not otherwise provided for electric constitutive elements
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/03—Arrangements for converting the position or the displacement of a member into a coded form
- G06F3/033—Pointing devices displaced or positioned by the user, e.g. mice, trackballs, pens or joysticks; Accessories therefor
- G06F3/0354—Pointing devices displaced or positioned by the user, e.g. mice, trackballs, pens or joysticks; Accessories therefor with detection of 2D relative movements between the device, or an operating part thereof, and a plane or surface, e.g. 2D mice, trackballs, pens or pucks
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/03—Arrangements for converting the position or the displacement of a member into a coded form
- G06F3/041—Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means
Definitions
- the present invention relates to a vehicle display device.
- Patent Document 1 discloses a device that displays information about audio and car navigation.
- the vehicle display device disclosed in Patent Document 1 is configured to allow various input operations by a steering switch that receives a touch operation.
- the operation method and operation part may be limited due to the restriction that the steering switch is provided on a part of the steering wheel. It would be convenient if the vehicle display device had a configuration that accepted more intuitive operations.
- the present invention has been made in view of the above circumstances, and an object of the present invention is to provide a vehicle display device that enables intuitive operation.
- the vehicle display device A display section for displaying information, A light-transmitting portion that allows the display portion to be seen through, and a panel portion located on the front side of the display portion,
- the panel part A cover having an operation unit with which a user's part performing an input operation contacts, It has a sensor sheet provided along the surface of the cover on the display portion side and detecting that the portion has come into contact with the operation portion.
- the operation portion is a three-dimensionally formed portion formed on at least one of a front surface portion and an end portion of the cover,
- the sensor sheet has a corresponding portion having a shape corresponding to the three-dimensional shape of the operation portion.
- FIG. 1 is a perspective view of a vehicle display device according to one embodiment of the present invention. FIG. 2 is a perspective view of the vehicle display device, mainly for explaining an operation by a user. FIG. 3 is a perspective view of the vehicle display device with the panel unit removed.
- FIG. 4(a) is a rear perspective view of the panel portion, and FIG. 4(b) is a schematic cross-sectional view taken along the line A-A shown in FIG. 4(a).
- FIGS. 5(a) to 5(c) are diagrams showing an example of image transitions according to input operations.
- the vehicle display device 100 is provided, for example, on an instrument panel of an automobile.
- the vehicle display device 100 includes an operation unit (a first operation unit M1 and a second operation unit M2 described later) that a user as a driver of the vehicle can touch with a finger 1 (see FIG. 2).
- the user performs an input operation on a predetermined device of the vehicle by touching the operation unit with one finger.
- in the following description, the user side with respect to the vehicle display device 100 is referred to as the front ("F" in the figures), and the opposite side as the rear ("B" in the figures).
- the vehicle display device 100 includes a display unit 10, a panel unit 20, a case 30, a light emitting unit 40, and a control board 50, as shown in FIGS. 1 to 4.
- the display unit 10 displays information about the vehicle (hereinafter referred to as vehicle information) as an image, and is composed of, for example, an LCD (Liquid Crystal Display), an OLED (Organic Light Emitting Diodes), or the like.
- vehicle information includes not only the information of the vehicle itself but also the information of the outside of the vehicle related to the operation of the vehicle.
- the display unit 10 can function as a meter that displays measured amounts such as vehicle speed and engine speed.
- the panel unit 20 is a panel-shaped member located on the front side of the display unit 10, and includes a translucent unit 20a located at the center thereof.
- the translucent portion 20a is provided in a portion facing the display unit 10 and allows the user to visually recognize the display unit 10 through it.
- the panel portion 20 has a cover 21 and a sensor sheet 22, and for example, both are integrated.
- the cover 21 is a plate-shaped member formed of a translucent resin such as polymethyl methacrylate resin (PMMA).
- PMMA polymethyl methacrylate resin
- the cover 21 is formed in a curved shape whose central portion is recessed toward the display portion 10.
- the cover 21 is fitted to the case 30 to form the exterior of the vehicle display device 100 together with the case 30.
- a light-shielding layer (not shown) is formed on the back surface of the cover 21 in a region other than the translucent portion 20a of the panel portion 20.
- the light-shielding layer can be composed of, for example, a light-shielding printing layer formed by screen printing or pad printing.
- the cover 21 includes operation portion forming portions 210 located on the left side and the right side of the translucent portion 20a as viewed in FIG. 1.
- the operation portion forming portions 210 are formed symmetrically with the translucent portion 20a between them; therefore, only one operation portion forming portion 210 will be described below, and the description of the other will be omitted.
- the operation portion forming portion 210 has a front surface portion 211 having a surface S facing forward and a side wall portion 212 extending rearward from an end portion of the front surface portion 211, as shown in a cross section in FIG. 4B.
- the side wall portion 212 is formed with a first operation portion M1 having a three-dimensional shape.
- the first operation unit M1 is a unit that receives a slide operation OP1 by the user's finger 1.
- the slide operation OP1 is an operation in which the user moves the finger 1 along the side wall portion 212 while contacting the outer surface (outer peripheral side surface) of the first operation portion M1.
- the first operation unit M1 has a continuous uneven shape.
- the uneven shape is formed so that the recesses gradually deepen from top to bottom as viewed from the user. Because the first operation portion M1 has such a concavo-convex shape, the user can tell by the tactile sensation of the finger 1 which part of the first operation portion M1 is currently being touched.
- a second operation unit M2 having a three-dimensional shape is formed on the front surface portion 211.
- the second operation unit M2 is a unit that receives a tap operation OP2 with the user's finger 1.
- the tap operation OP2 is an operation in which the user taps the outer surface (the surface facing forward) of the second operation unit M2 with the finger 1.
- the second operation unit M2 has a shape protruding forward from the surface S of the front surface portion 211.
- one of the operation unit forming units 210 is provided with three second operation units M2 along the vertical direction as viewed from the user.
- the three second operation units M2 will be referred to, in order from top to bottom, as the "first tap portion Ma", the "second tap portion Mb", and the "third tap portion Mc".
- at least the first operation unit M1 and the second operation unit M2 are provided in the vehicle so as to be exposed from the instrument panel and can be touched by the user.
- the sensor sheet 22 is, for example, a capacitance type sensor sheet, and supplies a detection signal indicating that the user's finger 1 has come into contact with the first operation unit M1 or the second operation unit M2 to a control unit mounted on the control board 50, which will be described later. As shown in FIG. 4(a), the sensor sheet 22 is provided along the back surface of the cover 21 (the surface of the cover 21 on the display unit 10 side). In FIG. 4(a), a broken line marks the boundary between the cover 21 and the sensor sheet 22 for visibility. The sensor sheet 22 is formed to be translucent.
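The region-based detection described above can be pictured as a simple coordinate lookup. Below is a minimal Python sketch of mapping a raw touch coordinate on the sensor sheet 22 to an operation unit; the unit names follow the reference signs in the text, but the coordinate bounds are illustrative assumptions, not values from the patent.

```python
# Hypothetical mapping from sensor-sheet coordinates to operation units.
# Region bounds are invented for illustration (arbitrary sensor units).
REGIONS = {
    "M1": (0, 10, 0, 80),    # first operation unit (side wall, slide operation)
    "Ma": (15, 30, 0, 25),   # first tap portion of the second operation unit
    "Mb": (15, 30, 30, 55),  # second tap portion
    "Mc": (15, 30, 60, 80),  # third tap portion
}

def locate_touch(x, y):
    """Return the name of the operation unit touched at (x, y), or None."""
    for name, (x0, x1, y0, y1) in REGIONS.items():
        if x0 <= x <= x1 and y0 <= y <= y1:
            return name
    return None
```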
- the translucent portion 20a of the panel portion 20 is composed of a portion of the cover 21 on which the light-shielding layer is not formed and a portion of the sensor sheet 22 facing the portion.
- the light-shielding layer formed in addition to the translucent portion 20a is not limited to the mode formed on the back surface of the cover 21.
- the light-shielding layer other than the light-transmitting portion 20 a may be formed on at least one of the front surface of the cover 21, the back surface of the cover 21, the front surface of the sensor sheet 22, and the back surface of the sensor sheet 22.
- the light-shielding layer is not limited to the light-shielding print layer, and may be a light-shielding film material or plate material.
- the sensor sheet 22 is integrally molded with the cover 21 by, for example, injection molding. Therefore, the panel portion 20 composed of the cover 21 and the sensor sheet 22 is formed in a curved shape that is concave toward the display portion 10.
- a known method such as in-mold molding or film insert molding can be adopted.
- the sensor sheet 22 formed in this way has a first corresponding portion C1 having a shape corresponding to the three-dimensional shape of the first operation unit M1, and a second corresponding portion C2 having a shape corresponding to the three-dimensional shape of the second operation unit M2.
- the case 30 has a box shape that opens toward the front, and is formed of, for example, a resin and has a light-shielding property.
- the case 30 houses the display unit 10 and the control board 50.
- the light emitting unit 40 is configured to emit light in correspondence with the position of the finger 1 detected by the sensor sheet 22 and notify the user of the position touched by the finger 1.
- the light emitting unit 40 has a plurality of light sources 41 that emit light toward the side wall portion 212 of the panel unit 20.
- the light source 41 is composed of, for example, an LED (Light Emitting Diode), and is mounted on a light source substrate 60 provided at the end of the case 30, as shown in FIG.
- the plurality of light sources 41 are provided at positions facing the side wall portion 212 and are arranged along the vertical direction. In the illustrated example, eight light sources 41 are arranged along the vertical direction on one light source substrate 60.
- the side wall portion 212 emits light by receiving the light emitted from the lit light source 41 among the plurality of light sources 41 constituting the light emitting portion 40.
- a light transmitting layer may be provided on the side wall portion 212.
- the light transmission layer is formed by, for example, smoke printing, and transmits the light from the light emitting unit 40 while suppressing the inside from being seen from the side wall portion 212 of the panel unit 20.
- the light transmission layer may be composed of a light diffusion film or the like.
- such a light transmission layer may be formed, in a portion corresponding to the side wall portion 212 of the panel unit 20, on at least one of the front surface of the cover 21, the back surface of the cover 21, the front surface of the sensor sheet 22, and the back surface of the sensor sheet 22.
- the control board 50 is a printed circuit board located between the display unit 10 and the case 30.
- the control board 50 is electrically connected to each of the display unit 10 and the sensor sheet 22 via wiring (not shown). Further, as shown in FIG. 3, the control board 50 is electrically connected to each of the plurality of light sources 41 via the wiring 51 and the light source board 60.
- a control unit (not shown) that controls the overall operation of the vehicle display device 100, including the operations of the display unit 10 and the light emitting unit 40, is mounted on the control board 50.
- the control unit controls to turn on or off each of the plurality of light sources 41 constituting the light emitting unit 40.
- the control unit performs control to turn on the light source 41 at a position corresponding to the position of the finger 1 detected by the sensor sheet 22 among the plurality of light sources 41 and turn off the other light sources 41.
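A rough Python sketch of this one-on, rest-off control is given below; the eight-source count follows the illustrated example, while the wall length and the position-to-LED mapping are assumptions.

```python
NUM_SOURCES = 8      # eight light sources 41 per light source substrate 60
WALL_LENGTH = 80.0   # assumed length of the side wall portion, sensor units

def led_states(finger_y):
    """Light the one LED nearest the detected finger position; turn off the rest."""
    index = min(NUM_SOURCES - 1, int(finger_y / WALL_LENGTH * NUM_SOURCES))
    return [i == index for i in range(NUM_SOURCES)]
```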
- the control unit is composed of a microcomputer including a storage unit having a ROM (Read Only Memory) and a RAM (Random Access Memory), and a CPU (Central Processing Unit) that executes an operation program stored in the ROM.
- the control unit communicates with an ECU (Electronic Control Unit) that controls each unit of the vehicle and various sensors, and causes the display unit 10 to display an image indicating the received vehicle information.
- the control unit can display the normal image L1 representing the vehicle speed on the display unit 10.
- the control unit communicates by a known method with various systems usable in the vehicle, such as a car navigation system and an audio system, and controls these systems according to input operations performed with the user's finger 1 on the operation units (the first operation unit M1 and the second operation unit M2).
- the control unit can display on the display unit 10 a menu image L2 (see FIG. 5(b)) representing various contents that can be displayed on the display unit 10, or an image representing the content selected in the menu image L2 (for example, the audio image L3 shown in FIG. 5(c)).
- the control unit acquires from the sensor sheet 22 a detection signal indicating the position of the user's finger 1 touching the first operation unit M1 or the second operation unit M2, and identifies the content of various gesture operations (input operations) by a known method. For example, the sensor sheet 22 detects, at the first corresponding unit C1, that the user's finger 1 has come into contact with the first operation unit M1, and based on this detection the control unit specifies the content of the slide operation OP1 performed on the first operation unit M1. Similarly, the sensor sheet 22 detects, at the second corresponding unit C2, that the user's finger 1 has come into contact with the second operation unit M2, and based on this detection the control unit identifies the content of the tap operation OP2 performed on the second operation unit M2. The control unit then moves the cursor 2 (cursor image) according to the specified slide operation OP1, and displays the content indicated by the list item selected by the cursor 2 according to the specified tap operation OP2.
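The "known method" of gesture identification is not detailed in the patent. The following Python sketch shows one plausible way to distinguish the slide operation OP1 from the tap operation OP2 given a time-ordered list of (unit, position) touch samples; the movement threshold and sample counts are invented for illustration.

```python
def classify_gesture(samples):
    """samples: time-ordered list of (unit, y) touch samples from the sensor sheet."""
    if not samples:
        return None
    units = {unit for unit, _ in samples}
    if units == {"M1"}:                  # contact stayed on the first operation unit
        ys = [y for _, y in samples]
        if max(ys) - min(ys) > 5:        # finger moved along the side wall portion
            return "OP1"                 # slide operation
        return None
    if len(units) == 1 and len(samples) <= 3:
        return "OP2"                     # brief contact on one tap portion
    return None
```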
- when the user performs a predetermined input operation while the normal image L1 is displayed, the control unit switches the display image of the display unit 10 from the normal image L1 to the menu image L2 in response to the operation.
- the menu image L2 represents items indicating various contents that can be operated by the vehicle display device 100.
- FIG. 5(b) shows an example in which the menu image L2 contains the items "Audio", indicating that operations related to audio are possible, "Navigation", indicating that operations related to car navigation are possible, and "Information", indicating that setting operations and display operations related to various vehicle information are possible.
- a cursor 2 indicating which item is in the selected state is also displayed.
- when the user performs the slide operation OP1 by bringing the finger 1 into contact with the first operation unit M1, the control unit moves the cursor 2 accordingly. In this way, the user can select any of the items displayed in the menu image L2.
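This cursor movement can be sketched as a mapping from the finger's position on the side wall to a menu item index. The item names come from FIG. 5(b); the wall length is an assumed value.

```python
MENU_ITEMS = ["Audio", "Navigation", "Information"]  # items of the menu image L2

def cursor_index(finger_y, wall_length=80.0, items=MENU_ITEMS):
    """Map a finger position on the side wall portion to the selected item index."""
    idx = int(finger_y / wall_length * len(items))
    return min(len(items) - 1, max(0, idx))
```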
- the control unit causes the light emitting unit 40 to emit light in correspondence with the position of the finger 1 detected by the sensor sheet 22. For example, when it is determined that the finger 1 is touching an upper position of the first operation unit M1 during the slide operation OP1, the control unit performs control such that, of the plurality of light sources 41 constituting the light emitting unit 40, the one light source 41 at the position corresponding to the specified upper position emits light and the other light sources 41 are turned off. As a result, the light emitting unit 40 notifies the user of the position the finger 1 is touching.
- when the control unit determines that the finger 1 is touching an upper position of the first operation unit M1 during the slide operation OP1, it may instead cause a plurality of (for example, two or three) of the light sources 41 constituting the light emitting unit 40 (for example, eight) located at positions corresponding to that upper position to emit light. Further, as long as the user performing the slide operation OP1 can grasp which position is currently being touched, the position and shape of the portion of the panel unit 20 that receives light from the light sources 41 and emits light, and the number of light sources 41 that emit light toward the panel unit 20, are arbitrary.
- the control unit may control the light emission of the light source 41 in association with the position of the finger 1 detected by the sensor sheet 22.
- the control unit may also cause the light source 41 located at the position corresponding to the position of the finger 1 detected by the sensor sheet 22 to emit light, thereby notifying the user which of the first tap portion Ma, the second tap portion Mb, and the third tap portion Mc of the second operation unit M2 the finger 1 is touching.
- as shown in FIG. 5(b), when the user performs one tap operation OP2 on the second tap portion Mb as a determination operation while an item is selected by the cursor 2, the control unit displays, in response to the operation, an image showing various settings related to the selected item.
- the transition from FIG. 5(b) to FIG. 5(c) shows an example in which "Audio" is determined in the menu image L2 and the control unit switches the display image of the display unit 10 from the menu image L2 to the audio image L3.
- FIG. 5(c) shows an example in which the audio image L3 contains, as operations related to audio, the items "Album", which enables selection and playback of music from an album, "Artist", which enables selection and playback of music from an artist, and "Playlist", which enables selection and playback of music from a playlist.
- the cursor 2 indicating which item is in the selected state is also displayed.
- the selection operation and determination operation for each item in the audio image L3 are the same as those described for the menu image L2. The operation method is also the same when the operable content becomes more specific and the hierarchy of the display image becomes deeper.
- the control unit that has received such a selection operation or determination operation transmits an operation signal indicating the operation content to the audio system to be operated, and the audio system that receives the operation signal executes an operation according to the operation content (switching songs, playing, and the like). The same applies when the operation target is another system such as a car navigation system.
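The forwarding of an operation signal to the target system can be sketched as below. The class name and signal strings are hypothetical, since the patent does not define a signal format.

```python
class AudioSystem:
    """Stand-in for the operation-target audio system."""
    def __init__(self):
        self.log = []

    def handle(self, signal):
        self.log.append(signal)  # e.g. "switch song", "play", ...

def send_operation(systems, target, content):
    """Transmit an operation signal indicating the operation content to the target."""
    systems[target].handle(content)
    return content
```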
- when the user performs the tap operation OP2 on the third tap portion Mc of the second operation unit M2, the control unit returns the display image of the display unit 10 to the normal image L1 in response to the operation. That is, the tap operation OP2 on the third tap portion Mc enables a return operation to the normal image L1.
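Putting the described transitions together, the display images form a small state machine (L1 normal image, L2 menu image, L3 content image such as the audio image). The transition table below is an assumption consistent with the embodiment, not an exhaustive specification.

```python
# Display-image transitions: (current image, event) -> next image.
TRANSITIONS = {
    ("L1", "menu"):   "L2",  # call up the menu image
    ("L2", "decide"): "L3",  # tap operation OP2 on the second tap portion Mb
    ("L2", "back"):   "L1",  # tap operation OP2 on the third tap portion Mc
    ("L3", "back"):   "L1",
}

def next_image(current, event):
    """Return the next display image; unknown events leave the image unchanged."""
    return TRANSITIONS.get((current, event), current)
```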
- the types of input operations that each of the first operation unit M1 and the second operation unit M2 can accept, the image transitions according to the input operations, the target devices that can be operated, and the like can be changed arbitrarily depending on the design.
- the control unit may be able to identify that the tap operation OP2 has been performed in the first operation unit M1.
- one, two, or four or more second operation units M2 may be provided for one operation unit forming unit 210.
- the shape and number of the operation units and the types and contents of the input operations that can be accepted can be different between one and the other of the pair of operation unit forming units 210.
- the target device that can be operated is not limited to the above examples, and may be an air conditioner in the vehicle, an arbitrary ECU in the vehicle (for example, one that controls switching of operation modes), and the like.
- the present invention is not limited to the above embodiments. Modifications (including deletion of components) can be appropriately added without changing the gist of the present invention.
- the vehicle on which the vehicle display device 100 is mounted may be a motorcycle or a three-wheeled vehicle, instead of a four-wheeled automobile.
- the vehicle display device 100 is preferably provided in the vicinity of a steering device such as a steering wheel or a handlebar.
- the part with which the user touches the operation units (the first operation unit M1 and the second operation unit M2) during an input operation may be an arbitrary part of the hand other than the finger 1, such as the palm or the back of the hand.
- the operation that the control unit can identify based on the detection result of the sensor sheet 22 is arbitrary as long as it is a known gesture operation.
- the display unit 10 is not limited to a display device that displays an image, and may be an analog pointer-type instrument or the like.
- the plurality of light sources 41 forming the light emitting unit 40 may include ones that emit lights of different colors. That is, there may be a plurality of types of emission colors of the light emitting unit 40.
- an example in which the cover 21 and the sensor sheet 22 are integrally molded has been shown, but a configuration in which the sensor sheet 22 is attached to the inner surface (back surface) of the cover 21 may also be adopted. Further, a plurality of sensor sheets 22 may be provided on the inner surface of the cover 21.
- an example in which the side wall portion 212 is formed in a straight line along the vertical direction as viewed from the user has been illustrated, but its shape is arbitrary.
- the side wall portion 212 may be formed in an arc shape (curved shape).
- the first operating portion M1 can also be formed at a position near the side wall portion 212 of the front surface portion 211. That is, the first operation unit M1 may be formed at the end of the cover 21.
- the end portion is preferably at least one of the left end portion and the right end portion of the cover 21 as viewed from the user, but the first operation portion M1 may also be formed along the upper end portion or the lower end portion of the cover 21.
- an example in which the panel portion 20 is formed in a curved shape that is concave toward the display portion 10 has been described.
- the shape of the panel portion 20 is arbitrary as long as it is plate-shaped.
- the panel section 20 may be formed in a flat plate shape.
- the vehicle display device 100 described above includes a display unit 10 that displays information (for example, vehicle speed information, car navigation information, content information, and other information about the vehicle), and a panel unit 20 located on the front side of the display unit 10 and having a translucent portion 20a that allows the display unit 10 to be seen through.
- the panel portion 20 has a cover 21 and a sensor sheet 22.
- the cover 21 has an operation unit (first operation unit M1 and second operation unit M2) with which a user's part (for example, a finger 1) that performs an input operation comes into contact.
- the sensor sheet 22 is provided along the surface (back surface) of the cover 21 on the display unit 10 side, and detects that the user's part has come into contact with the operation unit.
- the operation unit is a three-dimensional portion formed on at least one of a front surface portion (for example, the front surface portion 211) and an end portion (for example, the side wall portion 212) of the cover 21.
- the sensor sheet 22 has a corresponding portion (first corresponding portion C1, second corresponding portion C2) having a shape corresponding to the three-dimensional shape of the operation portion. According to this configuration, the input operation can be performed by touching the operation unit provided on the cover 21, so that the input operation can be intuitively performed.
- the operation units include a first operation unit M1 formed at the end portion of the cover 21 and a second operation unit M2 formed on the front surface portion, and the first operation unit M1 accepts the slide operation OP1 by the user's part. According to this configuration, the slide operation OP1 is possible at the end of the cover 21, which is easy to touch with a part such as the user's finger 1, so an intuitive input operation is possible.
- the second operation unit M2 is a portion that is raised (protruded toward the front) from the front surface of the cover 21 and receives the tap operation OP2 by the user's part. According to this configuration, the tap operation OP2 can be performed on the front surface of the cover 21, which is easy to touch with a part such as the user's finger 1, so that an intuitive input operation can be performed.
- the first operation unit M1 is a portion provided at the end of the cover 21 and having a continuous uneven shape. According to this configuration, the user can be made to recognize which position of the first operation unit M1 is being touched by the tactile sense of the finger 1.
- the vehicle display device 100 includes the light emitting unit 40 that emits light corresponding to the position of the user's site detected by the sensor sheet 22, and the light emitting unit 40 emits light toward the panel unit 20. Then, the position touched by the part is notified. According to this configuration, the light emitting part of the light emitting unit 40 can notify the part currently being operated by the user.
- the cover 21 and the sensor sheet 22 are integrally molded.
- the cover 21 and the sensor sheet 22 can be integrally molded by a method such as in-mold molding or film insert molding.
- since the sensor sheet 22 having a shape corresponding to the three-dimensional operation units can be provided on the inner side of the cover 21, a reduction in the detection accuracy of the sensor sheet 22 and peeling of the sensor sheet 22 from the cover 21 can be prevented.
- the panel portion 20 is formed in a curved shape that is concave toward the display portion 10. According to this configuration, the end portions of the panel portion 20 are closer to the user than its central portion, so the user can easily operate the first operation portion M1 and the like formed on the end portion of the cover 21. From this point of view, it is preferable that the panel portion 20 is formed in a curved shape in which the right end portion and the left end portion of the panel portion 20 are closer to the user than its central portion.
- 100 ... Vehicle display device; 10 ... Display unit; 20 ... Panel unit; 20a ... Translucent portion; 21 ... Cover; 210 ... Operation portion forming portion; 211 ... Front surface portion; S ... Front-facing surface; 212 ... Side wall portion; M1 ... First operation unit; OP1 ... Slide operation; M2 ... Second operation unit; OP2 ... Tap operation; 22 ... Sensor sheet; C1 ... First corresponding portion; C2 ... Second corresponding portion; 30 ... Case; 40 ... Light emitting unit; 41 ... Light source; 50 ... Control board; 60 ... Light source substrate; 1 ... Finger; 2 ... Cursor; L1 ... Normal image; L2 ... Menu image; L3 ... Audio image
Landscapes
- Engineering & Computer Science (AREA)
- General Engineering & Computer Science (AREA)
- Theoretical Computer Science (AREA)
- Human Computer Interaction (AREA)
- Physics & Mathematics (AREA)
- General Physics & Mathematics (AREA)
- Mechanical Engineering (AREA)
- Chemical & Material Sciences (AREA)
- Combustion & Propulsion (AREA)
- Transportation (AREA)
- User Interface Of Digital Computer (AREA)
Abstract
The invention relates to a vehicle display device allowing intuitive operation. This vehicle display device is provided with a panel portion (20) arranged on a front surface side of a display portion that displays information. The panel portion (20) includes a cover (21) and a sensor sheet (22). The cover (21) includes an operation portion that a body part of a user performing an input operation comes into contact with. The sensor sheet (22) is arranged along a rear surface of the cover (21) in order to detect that the user's body part has come into contact with the operation portion. The operation portion is a portion having a three-dimensional shape formed in a front surface portion (211) and/or a side wall portion (212) of the cover (21). A first operation portion (M1) formed in the side wall portion (212) and a second operation portion (M2) formed in the front surface portion (211) are examples of the operation portion. In addition, the sensor sheet (22) includes a first corresponding portion (C1) having a shape corresponding to the first operation portion (M1), and a second corresponding portion (C2) having a shape corresponding to the second operation portion (M2).
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
JP2019040328 | 2019-03-06 | ||
JP2019-040328 | 2019-03-06 |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2020179785A1 (fr) | 2020-09-10 |
Family
ID=72338352
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/JP2020/008915 WO2020179785A1 (fr) | 2019-03-06 | 2020-03-03 | Dispositif d'affichage de véhicule |
Country Status (1)
Country | Link |
---|---|
WO (1) | WO2020179785A1 (fr) |
Citations (5)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2012018448A (ja) * | 2010-07-06 | 2012-01-26 | Suzuki Motor Corp | 電子機器 |
WO2014171319A1 (fr) * | 2013-04-19 | 2014-10-23 | 本田技研工業株式会社 | Dispositif d'affichage monté sur un véhicule |
JP2014229014A (ja) * | 2013-05-21 | 2014-12-08 | 日本精機株式会社 | タッチパネル入力操作装置 |
JP2016110775A (ja) * | 2014-12-04 | 2016-06-20 | 株式会社東海理化電機製作所 | 車両用スイッチ装置 |
JP2017047767A (ja) * | 2015-09-01 | 2017-03-09 | 株式会社デンソー | 車両用表示装置 |
- 2020-03-03 WO PCT/JP2020/008915 patent/WO2020179785A1/fr active Application Filing
Similar Documents
Publication | Publication Date | Title |
---|---|---|
JP4218660B2 (ja) | 車両用スイッチ装置 | |
WO2014021063A1 (fr) | Dispositif d'actionnement | |
US20170351422A1 (en) | Transportation means, user interface and method for overlapping the display of display contents over two display devices | |
US11798758B2 (en) | Steering switch device and steering switch system | |
JP2012247890A (ja) | タッチパネル入力操作装置 | |
US10483969B2 (en) | Input device | |
JP2012190185A (ja) | 制御装置 | |
JP2014229014A (ja) | タッチパネル入力操作装置 | |
JP2014142777A (ja) | タッチパネル入力操作装置 | |
US10596906B2 (en) | Finger strip and use of said finger strip | |
CN107415694A (zh) | 使用接近感测的车辆换挡器界面装置 | |
JP2008058790A (ja) | 表示装置と入力表示装置および車載用入力表示装置 | |
CN110018749B (zh) | 表面包裹的用户界面触摸控件 | |
JP2020204868A (ja) | 表示装置 | |
WO2020179785A1 (fr) | Dispositif d'affichage de véhicule | |
JP2005228563A (ja) | 車両用操作パネル | |
JP2012208762A (ja) | タッチパネル入力操作装置 | |
WO2020137044A1 (fr) | Dispositif d'entrée d'opération | |
JP2012018465A (ja) | 操作入力装置及びその制御方法 | |
JP2017208185A (ja) | 入力装置 | |
JP2013033309A (ja) | タッチパネル入力操作装置 | |
JP2020114718A (ja) | 入力装置 | |
JP5640816B2 (ja) | 入力装置 | |
JP7529380B2 (ja) | 車両用操作制御装置 | |
JP2013232081A (ja) | タッチパネル入力操作装置 |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
121 | Ep: the epo has been informed by wipo that ep was designated in this application |
Ref document number: 20767372 Country of ref document: EP Kind code of ref document: A1 |
NENP | Non-entry into the national phase |
Ref country code: DE |
122 | Ep: pct application non-entry in european phase |
Ref document number: 20767372 Country of ref document: EP Kind code of ref document: A1 |
NENP | Non-entry into the national phase |
Ref country code: JP |