CN117460664A - Vehicle - Google Patents


Info

Publication number
CN117460664A
CN117460664A (application CN202280041571.2A)
Authority
CN
China
Prior art keywords
image
vehicle
function
light emitting
light
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202280041571.2A
Other languages
Chinese (zh)
Inventor
岩丸虎喜
小山内拓也
武智崚
西冈修
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Honda Motor Co Ltd
Original Assignee
Honda Motor Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Honda Motor Co Ltd
Publication of CN117460664A

Classifications

    • B: PERFORMING OPERATIONS; TRANSPORTING
    • B62: LAND VEHICLES FOR TRAVELLING OTHERWISE THAN ON RAILS
    • B62J: CYCLE SADDLES OR SEATS; AUXILIARY DEVICES OR ACCESSORIES SPECIALLY ADAPTED TO CYCLES AND NOT OTHERWISE PROVIDED FOR, e.g. ARTICLE CARRIERS OR CYCLE PROTECTORS
    • B62J45/00: Electrical equipment arrangements specially adapted for use as accessories on cycles, not otherwise provided for
    • B62J50/00: Arrangements specially adapted for use on cycles not provided for in main groups B62J1/00-B62J45/00
    • B62J50/20: Information-providing devices
    • B62J50/21: Information-providing devices intended to provide information to rider or passenger
    • B62J50/22: Information-providing devices intended to provide information to rider or passenger, electronic, e.g. displays
    • B62J6/00: Arrangement of optical signalling or lighting devices on cycles; mounting or supporting thereof; circuits therefor
    • B62J6/16: Arrangement of switches
    • B62J6/22: Warning or information lights
    • B62J6/24: Warning or information lights warning or informing the rider, e.g. low fuel warning lights
    • B62K: CYCLES; CYCLE FRAMES; CYCLE STEERING DEVICES; RIDER-OPERATED TERMINAL CONTROLS SPECIALLY ADAPTED FOR CYCLES; CYCLE AXLE SUSPENSIONS; CYCLE SIDE-CARS, FORECARS, OR THE LIKE
    • B62K23/00: Rider-operated controls specially adapted for cycles, i.e. means for initiating control operations, e.g. levers, grips
    • B62K23/02: Rider-operated controls specially adapted for cycles, hand actuated

Abstract

The vehicle includes: a handlebar; an input unit attached to the handlebar; a plurality of light emitting units provided in correspondence with a plurality of operations performed by a driver of the vehicle using the input unit; and a control unit that controls the light emission state of the plurality of light emitting units. The control unit causes the light emitting unit corresponding to a valid operation among at least some of the plurality of operations to emit light, and turns off the light emitting unit corresponding to an invalid operation among the plurality of operations.

Description

Vehicle
Technical Field
The present invention relates to a vehicle.
Background
Patent document 1 proposes a handlebar switch device mounted to a handlebar of a vehicle. The handlebar switch device has a touch panel for various input operations by a driver of the vehicle.
Prior art literature
Patent literature
Patent document 1: Japanese Patent Application Laid-Open No. 2015-71364
Disclosure of Invention
Problems to be solved by the invention
With the handlebar switch device of patent document 1, it is difficult for the driver to intuitively grasp which operations are currently possible. One aspect of the present invention provides a technique that enables the driver to intuitively grasp the operations that can be performed on an input device of a vehicle.
Means for solving the problems
In some embodiments, a vehicle is provided that includes: a handlebar; an input unit attached to the handlebar; a plurality of light emitting units provided in correspondence with a plurality of operations performed by a driver of the vehicle using the input unit; and a control unit that controls the light emission state of the plurality of light emitting units. The control unit causes the light emitting unit corresponding to a valid operation among at least some of the plurality of operations to emit light, and turns off the light emitting unit corresponding to an invalid operation among the plurality of operations.
Effects of the invention
By the above means, the driver can intuitively grasp the operations that can be performed on the input device of the vehicle.
Other features and advantages of the present invention will become apparent from the following description taken in conjunction with the accompanying drawings. In the drawings, the same or similar structures are denoted by the same reference numerals.
Drawings
The accompanying drawings, which are incorporated in and constitute a part of this specification, illustrate embodiments of the invention and together with the description, serve to explain the principles of the invention.
Fig. 1 is a diagram illustrating an external appearance of a front side of a vehicle according to some embodiments.
Fig. 2A is a diagram illustrating an external appearance of a left switch group of a vehicle according to some embodiments.
Fig. 2B is a diagram illustrating an external appearance of a left switch group of a vehicle according to some embodiments.
Fig. 3 is a functional block diagram of a cooperation system according to some embodiments.
Fig. 4 is a diagram illustrating a display in a vehicle according to some embodiments.
Fig. 5 is a diagram illustrating a menu display screen of a collaboration function according to some embodiments.
Fig. 6 is a diagram illustrating a display screen of a communication function according to some embodiments.
Fig. 7 is a diagram illustrating a display screen of a route guidance function according to some embodiments.
Fig. 8 is a diagram illustrating a display screen of a music playing function according to some embodiments.
Fig. 9 is a diagram illustrating a display screen of a messaging function according to some embodiments.
Fig. 10 is a diagram illustrating a display in a vehicle according to some embodiments.
Detailed Description
Hereinafter, embodiments will be described in detail with reference to the accompanying drawings. The following embodiments do not limit the invention as defined by the claims, and not all combinations of features described in the embodiments are necessarily essential to the invention. Two or more of the features described in the embodiments may be combined arbitrarily. The same or similar components are denoted by the same reference numerals, and redundant description thereof is omitted.
The external appearance of a vehicle 100 according to some embodiments will be described with reference to fig. 1. The vehicle 100 is a straddle-type two-wheeled motor vehicle. The present invention is also applicable to other two-wheeled vehicles and to other vehicles such as three-wheeled and four-wheeled vehicles. Fig. 1 shows the front portion of the vehicle 100 as seen from the rear (i.e., from the driver's side).
The vehicle 100 has a display device 101 at the center in the vehicle width direction. The display device 101 displays information for the driver of the vehicle 100 (hereinafter simply referred to as the driver). The display device 101 may be a dot-matrix type display device such as a liquid crystal display or an organic EL (electroluminescence) display, or may be a set of indicators that provide notifications by turning predetermined marks on and off.
The vehicle 100 includes a left handlebar switch 102 on the inner side, in the vehicle width direction, of a left handlebar 103. The left handlebar switch 102 is mounted to a bar 104. The handlebar of the vehicle 100 includes the left handlebar 103, the bar 104, and the like. Therefore, the left handlebar switch 102 can also be said to be mounted to the handlebar of the vehicle 100.
The appearance of the left handlebar switch 102 will be described with reference to figs. 2A and 2B. The left handlebar switch 102 includes a plurality of switches. At least some of these switches can be operated with the left thumb while the driver holds the left handlebar 103 with the left hand. The switches used for the cooperation function (described later) between the vehicle 100 and a mobile device are described below. The other switches of the left handlebar switch 102 (i.e., those used for functions of the vehicle 100 other than the cooperation function) may be configured conventionally, and their description is therefore omitted.
The left handlebar switch 102 includes an upper switch 200U, a lower switch 200D, and a left-right switch 200H. These switches can be operated with the left thumb while the driver holds the left handlebar 103 with the left hand. The left-right switch 200H is disposed substantially at the center of the left handlebar switch 102. It is a tilt switch that can be tilted in the left-right direction: the driver performs a left operation by tilting it to the left, and a right operation by tilting it to the right. The left operation is an operation for inputting a left-direction instruction; the same applies to the other directions. The upper switch 200U is a push-type switch disposed above the left-right switch 200H; the driver performs an upward operation by pressing it. The lower switch 200D is a push-type switch disposed below the left-right switch 200H; the driver performs a downward operation by pressing it. The upper switch 200U, the lower switch 200D, and the left-right switch 200H are collectively referred to as a direction switch 200. The direction switch 200 functions as a direction input unit for instructing a plurality of directions (up, down, left, and right).
The left handlebar switch 102 further includes a plurality of light emitting portions 201U, 201L, 201R, 201D, collectively referred to as the light emitting portion 201. The following description of the light emitting portion 201 applies to each of the light emitting portions 201U, 201L, 201R, 201D. The light emission state of each light emitting portion 201 can be controlled individually. In the following description, "light emission" includes both steady lighting and blinking. The light emitting portion 201 may also be able to switch between a plurality of emission colors. The light emitting portion 201 may be constituted by, for example, an LED (light emitting diode).
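As a concrete illustration of these individually controllable light emission states, the behavior of one light emitting portion 201 can be sketched as follows. This is a minimal model under stated assumptions, not the patent's implementation; the class name, fields, and color values are hypothetical.

```python
from dataclasses import dataclass

# Hypothetical model of one light emitting portion 201. The patent only
# states that each portion can be individually lit, turned off, blinked,
# and switched between multiple colors; everything else here is assumed.
@dataclass
class LightEmittingUnit:
    on: bool = False
    blinking: bool = False
    color: str = "white"

    def emit(self, color: str = "white", blink: bool = False) -> None:
        # "Light emission" covers both steady lighting and blinking.
        self.on = True
        self.blinking = blink
        self.color = color

    def turn_off(self) -> None:
        self.on = False
        self.blinking = False

led = LightEmittingUnit()
led.emit(color="amber", blink=True)  # one possible commanded state
```

In a real controller each of the four portions would hold one such state, driven by the light emission control unit described later.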
The plurality of light emitting portions 201 are provided corresponding to a plurality of operations performed using the direction switch 200. Specifically, the light emitting portion 201U corresponds to the upward operation, the light emitting portion 201L to the left operation, the light emitting portion 201R to the right operation, and the light emitting portion 201D to the downward operation. When the upward operation includes a plurality of operations (for example, a short upward press and a long upward press), the light emitting portion 201U may correspond to only one of them, to some of them, or to all of them. The same applies to the operations in the other directions.
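The control described in the abstract can be sketched as a simple per-direction on/off decision: light the portion for each currently valid operation and turn off the portion for each invalid one. The operation names and the dictionary-based wiring are assumptions for illustration.

```python
# Minimal sketch, assuming each direction maps to exactly one light
# emitting portion (201U/201D/201L/201R) and that "valid" is known.
DIRECTIONS = ("up", "down", "left", "right")

def update_light_emission(valid_operations: set) -> dict:
    """Return the desired on/off state of each direction's light portion."""
    return {d: (d in valid_operations) for d in DIRECTIONS}

# Example: on a screen where up/down switch functions and right executes,
# but no left operation is currently valid, only 201L stays dark.
states = update_light_emission({"up", "down", "right"})
```

The returned mapping would then be handed to something like the light emission control unit 307 introduced later to drive the physical LEDs.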
The correspondence between a light emitting portion 201 and an operation may be expressed, for example, by forming the light emitting portion 201U integrally with the upper switch 200U used for the upward operation. Alternatively, it may be expressed by disposing the light emitting portion 201L adjacent to the left-right switch 200H used for the left operation (in this example, adjacent on the left, i.e., in the operation direction). Any means by which the driver can recognize the correspondence between a light emitting portion 201 and an operation may be used.
Fig. 2B shows a direction input unit 210 as a modification of the direction switch 200. The direction input unit 210 has four independent buttons 211U, 211L, 211R, 211D. The button 211U is a button for accepting an upward operation, the button 211L is a button for accepting a left operation, the button 211R is a button for accepting a right operation, and the button 211D is a button for accepting a downward operation. The four buttons 211U, 211L, 211R, 211D are collectively referred to as buttons 211. The following description of the button 211 also applies to any one of the four buttons 211U, 211L, 211R, 211D. The button 211 may also be capable of emitting light. That is, the button 211 is a member that integrates a structure for receiving an operation of the driver and the light emitting portion. Instead of the button 211 emitting light itself, a light emitting portion may be provided around the button 211.
With reference to fig. 3, a description will be given of a functional configuration of a collaboration system that provides collaboration functions according to some embodiments of the present invention. The collaboration system has the vehicle 100, mobile device 310, and headset 320 described above.
The vehicle 100 includes a control unit 301, a display unit 304, a direction input unit 305, a communication unit 306, and a light emission control unit 307. The control unit 301 controls the entire vehicle 100. The control unit 301 is composed of, for example, a processor 302 and a memory 303. In this case, the operation performed by the control unit 301 is realized by the processor 302 executing a program stored in the memory 303. Part or all of the operations of the control unit 301 may be realized by a dedicated circuit such as an ASIC (application specific integrated circuit) or an FPGA (field programmable gate array).
The display portion 304 displays information for the driver. The display portion 304 is implemented by, for example, the display device 101 and the light emitting portion 201. The direction input portion 305 acquires directional instructions (up, down, left, and right) from the driver. The direction input portion 305 is implemented by, for example, the upper switch 200U, the lower switch 200D, and the left-right switch 200H. The communication portion 306 provides a function for the vehicle 100 to communicate with the outside. The communication portion 306 may support short-range wireless communication such as Bluetooth (registered trademark), and may further support cellular communication, road-to-vehicle communication, and the like.
The direction input unit 305 may receive two-stage input for each direction using the direction switch 200. For example, the direction input section 305 may accept a short press (for example, an input having a duration of less than 1 second) and a long press (for example, an input having a duration of 1 second or more). In addition, the direction input unit 305 may accept a single click input (for example, an input having an interval of 1 second or more until a subsequent input) and a double click input (for example, two consecutive inputs having an interval of less than 1 second).
The light emission control unit 307 controls the light emission state of the light emitting portion 201. For example, the light emission control unit 307 determines whether each of the plurality of light emitting portions 201 should be on or off, and supplies each light emitting portion 201 with a control signal based on that determination. Specific examples of the light emission states set by the light emission control unit 307 are described later. The light emission control unit 307 may determine the light emission state of the light emitting portion 201 autonomously, or according to an instruction from the mobile device 310.
The mobile device 310 includes a control unit 311, a display unit 314, an input unit 315, and a communication unit 316. The mobile device 310 may be a mobile phone device such as a smart phone, for example. The user of the mobile device 310 may be the same as the driver. Hereinafter, a case where the driver uses the mobile device 310 will be described. The mobile device 310 may be held by the driver or may be housed in the vehicle 100. The control unit 311 controls the entire mobile device 310. The control unit 311 is composed of, for example, a processor 312 and a memory 313. In this case, the operation performed by the control unit 311 is realized by the processor 312 executing a program stored in the memory 313. The programs may include an operating system and application programs. Part or all of the operations of the control unit 311 may be realized by a dedicated circuit such as an ASIC or FPGA.
The display portion 314 displays information for the driver. The display portion 314 is implemented by, for example, a display device such as a liquid crystal display or an organic EL display. The input section 315 acquires input from the driver. The input unit 315 is implemented by, for example, an input device such as a touch panel or buttons. The communication section 316 provides a function for the mobile device 310 to communicate with the outside. The communication unit 316 may support short-range wireless communication such as Bluetooth, and may further support cellular communication, Wi-Fi (registered trademark) communication, and the like.
The headset 320 includes a microphone 321, a speaker 322, and a communication unit 323. The headset 320 is worn on the head of the driver. The microphone 321 acquires a voice input from the driver. Speaker 322 outputs a voice for the driver. The communication unit 323 may support short-range wireless communication such as bluetooth.
In some embodiments of the present invention, vehicle 100 and mobile device 310 cooperate with each other. Specifically, a communication link based on short-range wireless communication is established between the vehicle 100 and the mobile device 310, for example. The vehicle 100 and the mobile device 310 exchange data via the communication link. In addition, a communication link, such as one based on short-range wireless communication, is established between the mobile device 310 and the headset 320. The mobile device 310 and headset 320 exchange data over a communication link.
At least some of the functions provided by the mobile device 310 can cooperate with the vehicle 100. A function that can cooperate with the vehicle 100 is hereinafter referred to as a cooperation function. The driver gives direction inputs via the direction input unit 305 of the vehicle 100, and each direction input is sent from the vehicle 100 to the mobile device 310. The mobile device 310 operates the cooperation function according to the direction input and generates an image representing the operation state. The mobile device 310 transmits the generated image to the vehicle 100, and the vehicle 100 displays the received image on the display unit 304.
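The round trip described above (a direction input forwarded to the mobile device, which returns an image for the vehicle to display) can be sketched as follows. All class and method names are assumptions; in practice the exchange would run over a short-range wireless link such as Bluetooth rather than a direct method call.

```python
# Hypothetical sketch of the vehicle/mobile-device cooperation loop.
class MobileDeviceSim:
    """Stands in for the mobile device 310 running a cooperation function."""
    def __init__(self) -> None:
        self.selected = "call"  # assumed initial cooperation function

    def handle_direction(self, direction: str, press: str) -> str:
        # A real device would update the function's state based on the
        # input; here we just return a placeholder image identifier.
        return "top_page_image:" + self.selected

class VehicleSim:
    """Stands in for the vehicle 100 with display unit 304."""
    def __init__(self, device: MobileDeviceSim) -> None:
        self.device = device
        self.displayed = None

    def on_direction_input(self, direction: str, press: str) -> None:
        # Forward the input over the (simulated) link and display the reply.
        image = self.device.handle_direction(direction, press)
        self.displayed = image  # shown in region 401 of the screen

vehicle = VehicleSim(MobileDeviceSim())
vehicle.on_direction_input("left", "long")  # the input that starts cooperation
```

The key design point the sketch captures is that the mobile device owns the function state and image generation, while the vehicle only forwards inputs and displays what it receives.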
In addition, the mobile device 310 sends the voice output for the driver to the headset 320. The headset 320 outputs the received voice output from the speaker 322. Headset 320 transmits voice input from the driver, acquired by microphone 321, to mobile device 310. The mobile device 310 performs an action of a cooperative function corresponding to a voice input from the driver.
An embodiment in which the display device 101 is of a dot-matrix type (hereinafter referred to as the first embodiment) will be described. Fig. 4 illustrates a display example of the display unit 304 during the cooperative operation of the vehicle 100 and the mobile device 310. The screen 400 is an example of a screen displayed on the display device 101. The travel speed is displayed at the center of the screen 400, and the fuel level is displayed below it. The speedometer is displayed on the left side of the screen 400.
The region 401 on the right side of the screen 400 is used to display an image provided from the mobile device 310 while the vehicle 100 and the mobile device 310 are cooperating. Such an image is hereinafter referred to as a cooperative image. During periods when no cooperative image is provided from the mobile device 310 (i.e., while the vehicle 100 is not cooperating with the mobile device 310), the vehicle 100 may display other images in the region 401, such as images unrelated to the cooperation function. For example, the vehicle 100 may display option images in the region 401, such as driving mode parameters (e.g., information set for the vehicle 100 such as the operating state of an ABS (antilock brake system)) and level adjustment. Images of lower importance among the images unrelated to the cooperation function (for example, images related to the running of the vehicle 100) may also be displayed in the region 401. By having such low-importance images and the cooperative image share the region 401, the space of the screen 400 can be used effectively.
The area 402 located on the lower side of the screen 400 is an area for displaying an image provided from the mobile device 310 in a state in which the vehicle 100 is not in cooperation with the mobile device 310. Hereinafter, such an image is referred to as an interruption image. During periods when no interrupt image is provided from mobile device 310, vehicle 100 may also display other images, such as images unrelated to collaborative functionality, in area 402.
A specific operation example of the cooperation function will be described with reference to figs. 5 to 9. The mobile device 310 provides a plurality of cooperation functions, which may include two or more of a weather information providing function, a call function, a messaging function, a music playing function, and a route guidance function. Hereinafter, the case where the mobile device 310 provides all five cooperation functions is described. The weather information providing function provides weather information for a specific place; the information may include at least one of the weather condition and the air temperature. The call function makes voice calls with other people. The messaging function exchanges text messages with other people. The music playing function plays music. The route guidance function guides the driver along a route to a specified destination.
The mobile device 310 accepts two types of input from the vehicle 100 for each of the four directions (up, down, left, and right). In the following example, the two types are a short press and a long press. A short press in the up direction is simply referred to as an up short press, and likewise for the other directions and for long presses. In figs. 5 to 9, a short press is represented by a filled triangle, and a long press is represented by a filled triangle with an additional mark. In figs. 5 to 9, all images other than the image 500 are cooperative images, that is, images that the mobile device 310 transmits to the vehicle 100 in the course of performing a cooperation function and that are displayed on the display device 101 of the vehicle 100. Therefore, in the following description of figs. 5 to 9, the display of each cooperative image by the vehicle 100 after its transmission is not mentioned again.
While the vehicle 100 and the mobile device 310 are not cooperating (i.e., while the mobile device 310 is not performing any cooperation function), an image 500 unrelated to the cooperation functions is displayed in the region 401 of the screen 400. When a left long press is acquired from the driver via the direction input unit 305 of the vehicle 100 in this state, the mobile device 310 starts cooperation with the vehicle 100. In the following description, every direction input is likewise acquired from the driver via the direction input unit 305 of the vehicle 100, so only the input direction (up, down, left, right) and the input type (short press or long press) are stated.
The plurality of collaboration functions have top page images, respectively. The top page image is an image that the mobile device 310 transmits to the vehicle 100 for one of the cooperation functions immediately after the cooperation function is selected. The image 501 is a top page image of the weather information providing function. In the image 501, an image representing the weather of the current location of the vehicle 100 and the air temperature of the current location are displayed. When a destination is set for the vehicle 100, an image indicating weather of the destination and an air temperature of the destination may be displayed on the image 501. Image 502 is a top page image of the call function. The image 502 contains an image representing a call function. The image 503 is a top page image of the route guidance function. The image 503 contains an image representing a route guidance function. Image 504 is a top page image of the music playing function. The image 504 contains an image representing a music play function. Image 505 is a top page image of a messaging function. Image 505 contains an image representing a messaging function.
When cooperation starts, the mobile device 310 selects one of the plurality of cooperation functions and transmits the top page image of the selected function. The function selected here may be a cooperation function set in advance as the initial one; such a function is referred to as the initial cooperation function. In the following description, the call function is set as the initial cooperation function; thus, the mobile device 310 transmits the image 502 immediately after the start of cooperation. The driver can set another cooperation function as the initial cooperation function. The mobile device 310 may also announce the start of cooperation by voice.
When the mobile device 310 acquires a right long press while the top page image of the initial cooperation function (the image 502 in this example) is displayed, the cooperation with the vehicle 100 ends. In this case, the mobile device 310 transmits a notification of the end of cooperation to the vehicle 100; based on the notification, the vehicle 100 ends the cooperative operation and displays the image 500, which is unrelated to the cooperation functions, in the region 401. The mobile device 310 may also announce the end of cooperation by voice.
As described above, the cooperative image is displayed in the region 401 on the right side of the screen 400. Therefore, when the cooperative image appears in the region 401 at the start of cooperation, the driver perceives it as entering the region 401 from outside the screen 400 across its right edge. Since the input direction for starting the cooperative operation (left) coincides with this virtual movement direction of the cooperative image, the driver can start the cooperation function intuitively. To make this effect more noticeable, the vehicle 100 may display an animation in which the cooperative image slides into the region 401 from the right. When the cooperative image in the region 401 is erased at the end of cooperation, the driver perceives it as leaving the screen 400 across its right edge. Since the input direction for ending the cooperative operation (right) coincides with this virtual movement direction, the driver can also end the cooperation function intuitively. To make this effect more noticeable, the vehicle 100 may display an animation in which the cooperative image slides out of the region 401 to the right. The vehicle 100 may likewise associate the input direction with the movement direction of other cooperative images: that is, after transmitting an upward, downward, leftward, or rightward instruction to the mobile device 310, the vehicle 100 may display the image received in response with a movement in the instructed direction.
Next, switching between the plurality of cooperation functions will be described. When the mobile device 310 acquires an up short press or a down short press while the top page image of one of the cooperation functions (any of the images 501 to 505) is displayed, it switches the currently selected cooperation function to another one and transmits the top page image of the function after the switch. In this way, an up short press or a down short press while a top page image is displayed is processed as a switching operation for changing the currently selected cooperation function. For example, when the mobile device 310 acquires a down short press while the top page image of the currently selected function is displayed, it advances the selection in the order of the weather information providing function, the call function, the route guidance function, the music playing function, and the messaging function, and transmits the corresponding top page image (one of the images 501 to 505). When it acquires an up short press in the same state, it steps through the functions in the reverse order and transmits the corresponding top page image. The mobile device 310 may also announce the newly selected cooperation function by voice at the time of switching.
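The switching order can be sketched as a circular walk through the five functions in the order listed above. The function identifiers and the wrap-around at the ends of the list are assumptions; the text does not state what happens past the first or last function.

```python
# Hypothetical sketch of top-page function switching. Order follows the
# text; the press labels and wrap-around behavior are assumed.
FUNCTIONS = ["weather", "call", "route_guidance", "music", "messaging"]

def switch_function(current: str, press: str) -> str:
    """Advance the selection on a down short press, step back on an up one."""
    i = FUNCTIONS.index(current)
    step = 1 if press == "down_short" else -1  # "up_short" goes in reverse
    return FUNCTIONS[(i + step) % len(FUNCTIONS)]  # wrap at list ends
```

For example, starting from the call function, one down short press would select the route guidance function under these assumptions.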
Next, execution of a cooperation function will be described. When the mobile device 310 acquires a right short press while the top page image (any one of the images 502 to 505) of an executable cooperation function is displayed, it executes the currently selected cooperation function and transmits an image corresponding to that function. In this way, a right short press while a top page image is displayed is processed as an execution operation that executes the currently selected cooperation function. In the present embodiment, the weather information providing function only displays weather information on its top page image and cannot be executed. Therefore, even if a right short press is acquired while the image 501 is displayed, the mobile device 310 may perform no processing.
Regardless of which of the cooperative images shown in figs. 5 to 9 is being displayed, when the mobile device 310 acquires a left long press, it transitions to transmission of the top page image of the initial cooperation function (the image 502 in this example). Likewise, regardless of which cooperative image is being displayed, the mobile device 310 sets the volume of the voice output to the driver to zero (i.e., mutes it) when a down long press is acquired. Conversely, when the mobile device 310 acquires a down long press in the muted state, it releases the mute and restores the volume to its value before muting. Regardless of which cooperative image is being displayed, the mobile device 310 may end the cooperative operation when a right long press is acquired. Alternatively, the mobile device 310 may accept the instruction to end the cooperative operation only while the top page image of the initial cooperation function (the image 502 in this example) is displayed, or only while the top page image of any one of the cooperation functions is displayed.
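The three long presses accepted regardless of the displayed image can be sketched as follows. Only the behaviors (return to the initial top page, mute toggle that restores the previous volume, end of cooperation) come from the text; the class, attribute names, and the image identifier string are illustrative assumptions.

```python
class CooperationSession:
    """Hypothetical holder for the session state affected by long presses."""

    def __init__(self, volume: int = 5):
        self.volume = volume
        self._saved_volume = None   # volume to restore when unmuting
        self.ended = False
        self.image = None           # image currently transmitted

    def on_long_press(self, direction: str) -> None:
        if direction == "left":
            # Return to the top page image of the initial cooperation function.
            self.image = "image_502"
        elif direction == "down":
            if self._saved_volume is None:
                # Mute: remember the current volume, then set it to zero.
                self._saved_volume = self.volume
                self.volume = 0
            else:
                # Unmute: restore the volume from before muting.
                self.volume = self._saved_volume
                self._saved_volume = None
        elif direction == "right":
            self.ended = True       # end the cooperative operation
```

A second down long press restores exactly the volume that was set before muting, as described above.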
The light emission states of the light emitting units 201 will be described with reference to fig. 5. The light emission states 511 to 515 represent the light emission states of the light emitting units 201 when the images 501 to 505, respectively, are displayed in the region 401 of the screen 400 of the vehicle. For example, when the weather information providing function is currently selected, the control unit 301 displays the image 501 in the region 401 of the screen 400 and causes the light emitting units 201 to assume the light emission state 511.
While the images 501 to 505 are displayed in the region 401 of the screen 400 of the vehicle, the light emission control unit 307 controls the light emission state of the light emitting unit 201U in accordance with an up short press performed by the driver of the vehicle 100 using the direction switch 200. Similarly, the light emission control unit 307 controls the light emission states of the light emitting units 201L, 201R, and 201D in accordance with a left short press, a right short press, and a down short press, respectively.
The light emission control unit 307 controls the light emission states of the plurality of light emitting units 201 so as to light the light emitting units 201 corresponding to effective operations among the plurality of operations and to turn off the light emitting units 201 corresponding to ineffective operations. As an example, consider the light emission state 511 in the case where the currently selected cooperation function is the weather information providing function. In this case, among the up, down, left, and right short presses corresponding to the four light emitting units 201, the only effective operation is the down short press for switching the cooperation function. Therefore, the light emission control unit 307 causes the light emitting unit 201D to emit light (for example, to light up) and turns off the other light emitting units 201U, 201L, and 201R. In this specification, an effective operation is an operation the driver may perform to obtain a desired result, and an ineffective operation is an operation from which the driver cannot obtain a desired result.
As another example, consider the light emission state 512 in the case where the currently selected cooperation function is the call function. In this case, among the up, down, left, and right short presses corresponding to the four light emitting units 201, the effective operations are switching the cooperation function by an up short press or a down short press, and executing the cooperation function (the call function) by a right short press. Therefore, the light emission control unit 307 causes the light emitting units 201U, 201R, and 201D to emit light (for example, to light up) and turns off the remaining light emitting unit 201L. By controlling the light emission states of the plurality of light emitting units 201 in this manner, the driver can intuitively grasp which operations are effective.
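The two examples above (light emission states 511 and 512) can be condensed into a small sketch. The per-function sets of effective operations come from the text; the table-based formulation and all names are illustrative assumptions, not the patent's implementation.

```python
# Effective short presses per currently selected cooperation function,
# following the two examples in the text (states 511 and 512).
EFFECTIVE_OPS = {
    "weather": {"down"},              # switching only (light emission state 511)
    "call": {"up", "down", "right"},  # switching and execution (state 512)
}

def led_states(function: str) -> dict:
    """Return lit (True) / off (False) for the four light emitting units,
    lighting exactly the units whose short press is an effective operation."""
    active = EFFECTIVE_OPS.get(function, set())
    return {d: (d in active) for d in ("up", "down", "left", "right")}
```

With the weather information providing function selected, only the unit for the down short press is lit; with the call function selected, only the unit for the left short press is off.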
In some embodiments, a corresponding color is assigned to each of the plurality of cooperation functions. For example, red may be assigned to the weather information providing function, blue to the call function, yellow to the message transmitting/receiving function, green to the music playing function, and purple to the route guidance function. The colors assigned to the cooperation functions may all differ from one another, or some may be repeated. A light emitting unit 201 corresponding to an effective operation may be caused to emit light in the color corresponding to the currently selected function among the plurality of functions (the cooperation functions in this embodiment) provided to the driver. In this specification, differences in emission color are indicated by differences in shading.
For example, the light emission control unit 307 may cause the light emitting unit 201 corresponding to an operation that switches the currently selected function to another function to emit light in the color assigned to that other function. As an example, consider the light emission state 512 in the case where the currently selected cooperation function is the call function. In this case, the up short press is the operation that switches the currently selected call function to the weather information providing function. Accordingly, the light emission control unit 307 causes the light emitting unit 201U corresponding to the up short press to emit light in the red assigned to the weather information providing function. The down short press is the operation that switches the currently selected call function to the route guidance function. Accordingly, the light emission control unit 307 causes the light emitting unit 201D corresponding to the down short press to emit light in the purple assigned to the route guidance function.
Instead of or in addition to this, the light emission control unit 307 may cause the light emitting unit 201 corresponding to an operation that executes the currently selected function to emit light in the color assigned to the currently selected function. As an example, consider again the light emission state 512 in the case where the currently selected cooperation function is the call function. In this case, the right short press is the operation that executes the currently selected call function. Accordingly, the light emission control unit 307 causes the light emitting unit 201R corresponding to the right short press to emit light in the blue assigned to the call function. By controlling the light emission states of the plurality of light emitting units 201 in this way, the driver can intuitively grasp the purpose of each operation.
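The color selection described in the two paragraphs above can be sketched as follows. The color table is the example assignment from the text (red for weather, blue for call, yellow for messaging, green for music, purple for route guidance); the helper names and the assumption of a cyclic switching order are illustrative only.

```python
# Example color assignment and switching order taken from the text.
COLOR = {
    "weather": "red",
    "call": "blue",
    "messaging": "yellow",
    "music": "green",
    "route_guidance": "purple",
}
ORDER = ["weather", "call", "route_guidance", "music", "messaging"]

def led_color(current: str, direction: str) -> str:
    """Color for a lit light emitting unit: a switching operation (up/down)
    uses the color of the destination function, while an execution
    operation (right) uses the color of the currently selected function."""
    i = ORDER.index(current)
    if direction == "up":
        return COLOR[ORDER[(i - 1) % len(ORDER)]]
    if direction == "down":
        return COLOR[ORDER[(i + 1) % len(ORDER)]]
    return COLOR[current]  # right short press: execute the current function
```

With the call function selected, the up unit would glow red (weather), the down unit purple (route guidance), and the right unit blue (call), matching the light emission state 512 described above.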
A specific operation example of the call function will be described with reference to fig. 6. When the mobile device 310 acquires a right short press while the image 502 is displayed, it transitions to transmission of the image 602. The image 602 is an image requesting the driver to designate a call target. The mobile device 310 may also request the designation of the call target by voice at the same time as transmitting the image 602. When the mobile device 310 acquires a down short press, a right short press, or a left short press while the image 602 is displayed, it transitions to transmission of the image 603, the image 604, or the image 502, respectively.
The image 603 is an image containing the name of a call target set in advance. When transitioning to transmission of the image 603, the mobile device 310 reads out the name set in advance as the call target from the memory 313 and includes it in the image 603. The mobile device 310 may also announce the read name of the call target to the driver by voice at the same time as transmitting the image 603. When the mobile device 310 acquires an up short press, a right short press, or a left short press while the image 603 is displayed, it transitions to transmission of the image 602, the image 606, or the image 502, respectively.
The image 604 is an image indicating a waiting state for voice search. The image 604 may also contain an image indicating the input direction for ending the voice search. The mobile device 310 may also tell the driver by voice that it is waiting for voice search at the same time as transmitting the image 604. After transmitting the image 604, the mobile device 310 waits for voice input via the headset 320. Once the mobile device 310 has acquired the voice input, it searches the contact list in the memory 313 for a call target based on the voice input. The mobile device 310 transitions to transmission of the image 605 when the call target can be determined, and transmits no new image when the call target cannot be determined (i.e., the vehicle 100 continues displaying the image 604). If the call target cannot be determined, the mobile device 310 may notify the driver of this by voice. When the mobile device 310 acquires a left short press while the image 604 is displayed, it transitions to transmission of the image 602.
The image 605 is an image containing the name of the call target determined by the voice search. The mobile device 310 may also announce the determined name of the call target to the driver by voice at the same time as transmitting the image 605. When the mobile device 310 acquires a right short press or a left short press while the image 605 is displayed, it transitions to transmission of the image 606 or the image 602, respectively.
The image 606 is an image indicating the execution status of the call function. The image 606 may contain the call target and the call status (dialing, or in a call, in which case the talk time). The image 606 may also include an image indicating the input direction for ending the call. The mobile device 310 starts placing a call to the call target at the same time as transmitting the image 606, and updates the image 606 according to the call status. Voice for the conversation is input and output between the mobile device 310 and the driver via the headset 320. When the mobile device 310 acquires a left short press while the image 606 is displayed, it ends the call and transitions to transmission of the image 502. When the mobile device 310 acquires an up short press or a down short press while the image 606 is displayed, it increases or decreases the volume output from the headset 320. When the mobile device 310 acquires a left long press while the image 606 is displayed, it transitions to transmission of the image 601 while continuing the call.
The image 601 is the top page image of the call function while the call function is being executed. By contrast, the image 502 is the top page image of the call function when the call function is not being executed. A call function being executed may be dialing or in a call with the call target. The image 601 may contain the call target and the call status (dialing, or in a call, in which case the talk time). Since the image 601 is a top page image, when the mobile device 310 acquires an up short press or a down short press while the image 601 is displayed, the call function is switched to another cooperation function. The mobile device 310 may maintain the call state when the call function is switched to another cooperation function while the call function is being executed. When the mobile device 310 acquires a right short press while the image 601 is displayed, it transitions to transmission of the image 606 while continuing the call. When the call ends (the call target ends the call) while the image 601 is displayed, the mobile device 310 transitions to transmission of the image 502. The mobile device 310 may not accept an instruction from the driver to end the call while the image 601 is displayed.
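The call-function flow described above can be condensed into a transition table. This is a hypothetical sketch that covers only the transitions stated in the text for the short and long presses (the image identifiers follow fig. 6); the table form and function name are assumptions.

```python
# (displayed image, input) -> image transmitted next, per the call-function flow.
TRANSITIONS = {
    ("502", "right_short"): "602",
    ("602", "down_short"): "603",
    ("602", "right_short"): "604",
    ("602", "left_short"): "502",
    ("603", "up_short"): "602",
    ("603", "right_short"): "606",
    ("603", "left_short"): "502",
    ("604", "left_short"): "602",
    ("605", "right_short"): "606",
    ("605", "left_short"): "602",
    ("606", "left_short"): "502",   # ends the call
    ("606", "left_long"): "601",    # call continues in the background
    ("601", "right_short"): "606",  # call continues
}

def next_image(image: str, press: str) -> str:
    """Return the next transmitted image; an input with no entry is treated
    as ineffective and leaves the displayed image unchanged."""
    return TRANSITIONS.get((image, press), image)
```

Function switching from the image 601 and the volume changes on the image 606 are intentionally omitted here, since they do not change the transmitted image within this flow.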
When each image shown in fig. 6 is displayed, the light emission control unit 307 may control the light emission states of the plurality of light emitting units 201 so as to light the light emitting units 201 corresponding to effective operations among the plurality of operations and to turn off the light emitting units 201 corresponding to ineffective operations. As an example, consider the light emission state 613 of the light emitting units 201 when the image 603 is displayed. In this case, among the up, down, left, and right short presses corresponding to the four light emitting units 201, the effective operations are the up short press, the left short press, and the right short press. Therefore, the light emission control unit 307 causes the light emitting units 201U, 201L, and 201R to emit light (for example, to light up) and turns off the remaining light emitting unit 201D. Since the call function remains selected even after any of these effective operations is performed, the light emission control unit 307 may cause the light emitting units 201U, 201L, and 201R to emit light in the blue assigned to the call function.
As another example, consider the light emission state 616 of the light emitting units 201 when the image 606 is displayed (that is, while the driver of the mobile device 310 is in a call). In this case, among the up, down, left, and right short presses corresponding to the four light emitting units 201, the effective operations are the up short press, the left short press, and the down short press. Among these, the left short press is the operation for ending the call, while the up short press and the down short press are operations for changing the volume. Changing the volume can be regarded as an operation of lower importance than ending the call. Therefore, the light emission control unit 307 may turn off the light emitting units 201U and 201D corresponding to the up short press and the down short press for changing the volume. In this way, the light emission control unit 307 may turn off the light emitting unit corresponding even to an effective operation. The light emitting units corresponding to volume changes may likewise be turned off for the other images described later. The light emission control unit 307 also turns off the light emitting unit 201R corresponding to the ineffective operation. As a result, while the driver of the mobile device 310 is in a call, the light emission control unit 307 lights the light emitting unit 201L corresponding to the operation for ending the call and turns off the other light emitting units 201U, 201R, and 201D. Thus, during a call, the driver can intuitively grasp the operation for ending it.
The light emission state of the light emitting units 201 when voice input is performed while the image 604 is displayed will now be described. The light emission control unit 307 may cause the plurality of light emitting units 201 to blink in a predetermined pattern from the time the voice input is received from the driver until the search result is returned. For example, the light emission control unit 307 may blink the plurality of light emitting units 201 in sequence so that the lit light emitting unit 201 appears to trace a circle. Likewise for the other processing described later, the light emission control unit 307 may cause the plurality of light emitting units 201 to blink in a predetermined pattern during the period from when an instruction from the driver is received until the response to that instruction is returned. This enables the driver to intuitively grasp that processing is in progress.
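The circle-tracing blink pattern above could be realized by lighting the four units one at a time in rotational order. A minimal sketch, assuming a clockwise order and a tick-driven animation (both assumptions, not stated in the text):

```python
# Assumed clockwise order of the four light emitting units.
CIRCLE = ["up", "right", "down", "left"]

def lit_unit(tick: int) -> str:
    """Unit lit at a given animation tick while processing is in progress,
    so that the lit unit appears to trace a circle over successive ticks."""
    return CIRCLE[tick % len(CIRCLE)]
```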
A specific operation example of the route guidance function will be described with reference to fig. 7. When each image shown in fig. 7 is displayed, the light emission control unit 307 may control the light emission states of the light emitting units 201 so as to light the light emitting units 201 corresponding to effective operations among the plurality of operations and to turn off the light emitting units 201 corresponding to ineffective operations. When the mobile device 310 acquires a right short press while the image 503 is displayed, it transitions to transmission of the image 702. The image 702 is an image requesting the driver to specify a destination. The mobile device 310 may also request the driver to specify the destination by voice at the same time as transmitting the image 702. When the mobile device 310 acquires a down short press, a right short press, or a left short press while the image 702 is displayed, it transitions to transmission of the image 703, the image 704, or the image 503, respectively.
The image 703 is an image containing a destination set in advance. When transitioning to transmission of the image 703, the mobile device 310 reads out the location set in advance as the destination from the memory 313 and includes it in the image 703. The mobile device 310 may also announce the read destination to the driver by voice at the same time as transmitting the image 703. When the mobile device 310 acquires an up short press, a right short press, or a left short press while the image 703 is displayed, it transitions to transmission of the image 702, the image 706, or the image 503, respectively.
The image 704 is an image indicating a waiting state for voice search. The image 704 may also contain an image indicating the input direction for ending the voice search. The mobile device 310 may also tell the driver by voice that it is waiting for voice search at the same time as transmitting the image 704. After transmitting the image 704, the mobile device 310 waits for voice input via the headset 320. Once the mobile device 310 has acquired the voice input, it determines the destination based on the voice input, for example from map information. The mobile device 310 transitions to transmission of the image 705 if the destination can be determined, and transmits no new image if the destination cannot be determined (i.e., the vehicle 100 continues displaying the image 704). The mobile device 310 may notify the driver by voice that the destination could not be determined. When the mobile device 310 acquires a left short press while the image 704 is displayed, it transitions to transmission of the image 702.
The image 705 is an image containing the destination determined by the voice search. The mobile device 310 may also announce the determined destination to the driver by voice at the same time as transmitting the image 705. When the mobile device 310 acquires a right short press or a left short press while the image 705 is displayed, it transitions to transmission of the image 706 or the image 702, respectively.
The image 706 is an image indicating the execution status of the route guidance function. The image 706 may include the distance to the point where the vehicle should turn and the turning direction. The image 706 may also include an image indicating the input direction for ending route guidance. The mobile device 310 starts route guidance at the same time as transmitting the image 706, and updates the image 706 according to the guidance status. Voice for route guidance is output from the mobile device 310 to the driver via the headset 320. When the mobile device 310 acquires a left short press while the image 706 is displayed, it transitions to transmission of the image 707. When the mobile device 310 acquires an up short press or a down short press while the image 706 is displayed, it increases or decreases the volume output from the headset 320.
The image 707 is an image for confirming whether to end route guidance. The image 707 may include an image requesting confirmation and an image indicating the input directions for responding to the confirmation. When the mobile device 310 acquires a right short press while the image 707 is displayed, it ends the route guidance and transitions to transmission of the image 503. When the mobile device 310 acquires a right long press while the image 707 is displayed, it transitions to transmission of the image 706 while continuing route guidance.
The image 701 is the top page image of the route guidance function while the route guidance function is being executed. By contrast, the image 503 is the top page image of the route guidance function when the route guidance function is not being executed. A route guidance function being executed may be in the process of guiding the driver along a route. The image 701 may include the distance to the point where the vehicle should turn and the turning direction. As described above, when the mobile device 310 acquires a left long press while the route guidance function is selected (that is, while any one of the image 503 and the images 701 to 707 is displayed), it transitions to transmission of the top page image of the initial cooperation function. When the route guidance function is selected again later, if route guidance was being executed at the time of the previous selection, the mobile device 310 transmits the image 701 as the top page image of the route guidance function.
Since the image 701 is a top page image, when the mobile device 310 acquires an up short press or a down short press while the image 701 is displayed, the route guidance function is switched to another cooperation function. When the mobile device 310 acquires a right short press while the image 701 is displayed, it transitions to transmission of the image 706 while continuing route guidance. When the route guidance ends (the destination is reached) while the image 701 is displayed, the mobile device 310 transitions to transmission of the image 503. The mobile device 310 may not accept an instruction to end the route guidance while the image 701 is displayed.
A specific operation example of the music playing function will be described with reference to fig. 8. When each image shown in fig. 8 is displayed, the light emission control unit 307 controls the light emission states of the plurality of light emitting units 201 so as to light the light emitting units 201 corresponding to effective operations among the plurality of operations and to turn off the light emitting units 201 corresponding to ineffective operations. When the mobile device 310 acquires a right short press while the image 504 is displayed, it transitions to transmission of the image 802. The image 802 is an image for searching for music to play. When transmitting the image 802, the mobile device 310 selects a song from the playlist in the memory 313 and includes it in the image 802. The mobile device 310 may also announce the name of the selected song to the driver by voice at the same time as transmitting the image 802. When the mobile device 310 acquires a right short press or a left short press while the image 802 is displayed, it transitions to transmission of the image 803 or the image 504, respectively. When the mobile device 310 acquires an up short press while the image 802 is displayed, it selects the previous song in the playlist and updates the image 802 accordingly. When the mobile device 310 acquires a down short press while the image 802 is displayed, it selects the next song in the playlist and updates the image 802 accordingly.
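The song selection on the image 802 can be sketched as a small helper: an up short press selects the previous song in the playlist and a down short press the next. Wraparound at the ends of the playlist and all names are assumptions for this sketch.

```python
def select_song(playlist: list, index: int, press: str) -> int:
    """Return the index of the newly selected song in the playlist for an
    up/down short press on the music search image; other inputs leave the
    selection unchanged. Wraparound is assumed."""
    if press == "up_short":
        return (index - 1) % len(playlist)   # previous song
    if press == "down_short":
        return (index + 1) % len(playlist)   # next song
    return index
```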
The image 803 is an image indicating the execution status of the music playing function. The image 803 may also contain the name of the song being played and the play status (play time). The image 803 may also contain images indicating the input direction for ending playback and the input direction for skipping the song being played. The mobile device 310 starts playing the music at the same time as transmitting the image 803, and updates the image 803 according to the play status. The played music is output from the mobile device 310 to the driver via the headset 320. When the mobile device 310 acquires a left short press while the image 803 is displayed, it ends playback and transitions to transmission of the image 802. When the mobile device 310 acquires a right short press while the image 803 is displayed, it selects the next song in the playlist and updates the image 803 accordingly. When the mobile device 310 acquires an up short press or a down short press while the image 803 is displayed, it increases or decreases the volume output from the headset 320. When the mobile device 310 acquires a left long press while the image 803 is displayed, it transitions to transmission of the top page image of the initial cooperation function (the image 502 in this example) while continuing playback.
The image 801 is the top page image of the music playing function while the music playing function is being executed. By contrast, the image 504 is the top page image of the music playing function when the music playing function is not being executed. A music playing function being executed may be in the process of playing music to the driver. The image 801 may contain the name of the song being played and the play status (play time). As described above, when the mobile device 310 acquires a left long press while the music playing function is selected (that is, while any one of the image 504 and the images 801 to 803 is displayed), it transitions to transmission of the top page image of the initial cooperation function. When the music playing function is selected again later, if music was being played at the time of the previous selection, the mobile device 310 transmits the image 801 as the top page image of the music playing function.
Since the image 801 is a top page image, when the mobile device 310 acquires an up short press or a down short press while the image 801 is displayed, the music playing function is switched to another cooperation function. When the mobile device 310 acquires a right short press while the image 801 is displayed, it transitions to transmission of the image 803 while continuing music playback. When the music playback is completed (the playlist ends) while the image 801 is displayed, the mobile device 310 transitions to transmission of the image 504. The mobile device 310 may not accept an instruction to end music playback while the image 801 is displayed.
A specific operation example of the message transmitting/receiving function will be described with reference to fig. 9. When each image shown in fig. 9 is displayed, the light emission control unit 307 may control the light emission states of the plurality of light emitting units 201 so as to light the light emitting units 201 corresponding to effective operations among the plurality of operations and to turn off the light emitting units 201 corresponding to ineffective operations. When the mobile device 310 acquires a right short press while the image 505 is displayed, it transitions to transmission of the image 901. The image 901 is an image requesting the driver to designate a transmission target. The mobile device 310 may also request the designation of the transmission target by voice at the same time as transmitting the image 901. When the mobile device 310 acquires a down short press, a right short press, or a left short press while the image 901 is displayed, it transitions to transmission of the image 902, the image 903, or the image 505, respectively.
The image 902 is an image containing the name of a transmission target set in advance. When transitioning to transmission of the image 902, the mobile device 310 reads out the name set in advance as the transmission target from the memory 313 and includes it in the image 902. The mobile device 310 may also announce the read name of the transmission target to the driver by voice at the same time as transmitting the image 902. When the mobile device 310 acquires an up short press, a right short press, or a left short press while the image 902 is displayed, it transitions to transmission of the image 901, the image 905, or the image 505, respectively.
The image 903 is an image indicating a waiting state for voice search. The image 903 may also contain an image indicating the input direction for ending the voice search. The mobile device 310 may also tell the driver by voice that it is waiting for voice search at the same time as transmitting the image 903. After transmitting the image 903, the mobile device 310 waits for voice input via the headset 320. Once the mobile device 310 has acquired the voice input, it searches the contact list in the memory 313 for a transmission target based on the voice input. The mobile device 310 transitions to transmission of the image 904 if the transmission target can be determined, and transmits no new image if the transmission target cannot be determined (i.e., the vehicle 100 continues displaying the image 903). If the transmission target cannot be determined, the mobile device 310 may notify the driver of this by voice. When the mobile device 310 acquires a left short press while the image 903 is displayed, it transitions to transmission of the image 901.
The image 904 is an image containing the name of the transmission target determined by the voice search. The mobile device 310 may also announce the determined name of the transmission target to the driver by voice at the same time as transmitting the image 904. When the mobile device 310 acquires a right short press or a left short press while the image 904 is displayed, it transitions to transmission of the image 905 or the image 901, respectively.
The image 905 is an image indicating a waiting state for voice input. The image 905 may include images indicating the input direction for ending voice input and the input direction for starting transmission. The mobile device 310 may also tell the driver by voice that it is waiting for voice input at the same time as transmitting the image 905. After transmitting the image 905, the mobile device 310 waits for voice input via the headset 320. The mobile device 310 acquires the input voice as the message to be transmitted. When the mobile device 310 acquires a left short press, a right short press, or a down short press while the image 905 is displayed, it transitions to transmission of the image 904, the image 907, or the image 906, respectively.
The image 906 is an image containing a fixed sentence of a message set in advance. When shifting to transmission of the image 906, the mobile device 310 reads out the fixed sentence from the memory 313 and includes it in the image 906. The image 906 may also contain an image representing the input direction for canceling transmission and the input direction for starting transmission. While transmitting the image 906, the mobile device 310 may also notify the driver of the read fixed sentence by voice. When the mobile device 310 acquires an up short press, a right short press, or a left short press while the image 906 is displayed, it shifts to transmission of the image 905, the image 907, or the image 904, respectively.
The image 907 is an image representing the execution status of the message transmitting/receiving function. The image 907 may also contain an image indicating that message transmission is in progress. The mobile device 310 starts transmitting the message to the transmission destination simultaneously with transmission of the image 907, and shifts to transmission of the image 505 after the transmission is completed.
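The image transitions for the message function described above form a small state machine driven by direction-switch input. The following Python sketch is an illustrative reconstruction of that transition table; the image numbers and input names come from the description, but the table, the function, and all identifiers are assumptions, not part of the disclosed implementation.

```python
# Illustrative reconstruction of the message-function image transitions
# described above. Keys are (current image, input); values are the next
# image the mobile device 310 would transmit. Entries are assumptions
# read off the prose, not a disclosed data structure.
MESSAGE_TRANSITIONS = {
    (903, "left_short"): 901,   # end voice search, back to top page
    (904, "right_short"): 905,  # confirm destination, go to voice input
    (904, "left_short"): 901,   # back to top page
    (905, "left_short"): 904,   # back to destination selection
    (905, "right_short"): 907,  # start transmission
    (905, "down_short"): 906,   # switch to fixed sentence
    (906, "up_short"): 905,     # back to voice input
    (906, "right_short"): 907,  # start transmission
    (906, "left_short"): 904,   # cancel, back to destination selection
}

def next_image(current: int, press: str) -> int:
    """Return the next image to transmit; an input that is not an
    effective operation for the current image leaves it unchanged."""
    return MESSAGE_TRANSITIONS.get((current, press), current)
```

For example, a down short press while the image 905 is displayed moves to the fixed-sentence image 906, while an up short press there is not an effective operation and leaves the display unchanged.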
As shown in fig. 6 to 9, the images transmitted in the respective collaboration functions (i.e., collaboration images) are classified into three hierarchies: a menu hierarchy, an object selection hierarchy, and an activity hierarchy. The menu hierarchy contains a top page image of the currently selected collaboration function. The object selection hierarchy contains an image for selecting an execution object of the currently selected collaboration function. The activity hierarchy contains an image indicating the execution status of the currently selected collaboration function.
In each collaboration function having an object selection hierarchy, the transition from the menu hierarchy to the object selection hierarchy is made by a right short press. Likewise, in each collaboration function having an activity hierarchy, the transition from the object selection hierarchy to the activity hierarchy is made by a right short press. Because hierarchy transitions are made by a right short press, the driver can move between hierarchies intuitively, and no switch other than the direction switches (such as a dedicated decision switch) is required. Conversely, the transition from the activity hierarchy to the object selection hierarchy and the transition from the object selection hierarchy to the menu hierarchy are made by a left short press, the direction opposite to the right short press, so the driver can intuitively return to the previous hierarchy. In addition, the mobile device 310 does not ask the driver for an image-based confirmation at the transition from the menu hierarchy to the object selection hierarchy or from the object selection hierarchy to the activity hierarchy, which reduces the number of operations required before processing is executed.
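The hierarchy navigation described above can be sketched as a single function: a right short press descends one hierarchy, a left short press ascends one, and any other input stays at the current hierarchy. The level names below are taken from the description; the function itself and its naming are illustrative assumptions.

```python
# Illustrative sketch of the three-hierarchy navigation rule.
LEVELS = ["menu", "object_selection", "activity"]

def shift_level(level: str, press: str) -> str:
    """Right short press moves one hierarchy deeper, left short press
    returns one hierarchy; up/down presses (handled elsewhere) move
    within the same hierarchy, so they leave the level unchanged."""
    i = LEVELS.index(level)
    if press == "right_short" and i < len(LEVELS) - 1:
        return LEVELS[i + 1]
    if press == "left_short" and i > 0:
        return LEVELS[i - 1]
    return level
```

Note how the symmetry of the rule (right to descend, left to return) is what gives the driver the intuitive back-and-forth operation the description emphasizes.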
In each hierarchy, the up short press and the down short press are used to move between processes within the same hierarchy. As described above, the left-right switch 200H is a tilt-type switch, whereas the upper switch 200U and the lower switch 200D are independent push-type switches. Therefore, the driver's finger moves a smaller distance in the left-right direction than in the up-down direction. In the above-described embodiment, transitions between hierarchies are made by inputs in the left-right direction, so the amount of movement of the driver's finger can be reduced.
In the object selection hierarchy, the transition to voice search is also made by a right short press. The transition to voice search is thus triggered by an input in the same direction as the transition to a deeper hierarchy, providing a unified operation feeling.
A specific example of interrupt processing by the cooperation function will be described with reference to fig. 10. The interrupt processing is processing in which the mobile device 310 requests the vehicle 100 to start a cooperation function without an instruction from the driver. For example, the mobile device 310 requests the start of interrupt processing in the case of an incoming call from another person or in the case of receiving a message from another person.
When the mobile device 310 receives an incoming call, the mobile device 310 transmits an image related to the interrupt processing of the call function (i.e., an interrupt image) to the vehicle 100, and the vehicle 100 displays the interrupt image in the area 402 of the screen 400. The interrupt image may also contain the name of the caller, an input direction for indicating a response to the incoming call, and an input direction for indicating rejection of the incoming call. If a collaboration function other than the call function is currently selected at this time, an image of the currently selected collaboration function is displayed in the area 401. On the other hand, if no collaboration function is being executed at this time, a non-collaboration image is displayed in the area 401. When the mobile device 310 acquires a right short press while the interrupt image is displayed, it responds to the incoming call and shifts to transmission of the image 606. At this time, the collaboration image (specifically, the image 606) is displayed in the area 401, and the vehicle 100 may delete the interrupt image from the area 402 and display a non-collaboration image there. The same processing is performed when the mobile device 310 receives a message.
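The split between the two display areas during interrupt processing can be modeled as follows. This is a hypothetical sketch: the area identifiers 401 and 402 come from the description, but the function and its content labels are illustrative assumptions.

```python
# Illustrative model of screen 400 during interrupt processing:
# area 402 carries the interrupt image when an interrupt is active,
# while area 401 keeps showing whatever it showed before (either the
# currently selected collaboration function or a non-collaboration image).
def screen_contents(interrupt_active: bool, collab_selected: bool) -> dict:
    area_401 = ("collaboration_image" if collab_selected
                else "non_collaboration_image")
    area_402 = "interrupt_image" if interrupt_active else None
    return {"401": area_401, "402": area_402}
```

The point of the design is that an incoming call never evicts the driver's current context from the area 401; the interrupt is confined to the area 402 until it is answered or rejected.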
The light emission state 1001 of the light emitting unit 201 while the interrupt image is displayed (that is, while the mobile device 310 is receiving an incoming call and calling the driver) will be described. In this case, among the up, down, left, and right short presses corresponding to the four light emitting units 201, the only effective operations are the left short press and the right short press. Of these effective operations, the right short press is the operation for responding to the incoming call, and the left short press is the operation for rejecting it. Accordingly, the light emission control unit 307 causes the light emitting units 201L and 201R corresponding to the left and right short presses to emit light (e.g., light up), and turns off the other light emitting units 201U and 201D. The driver can thus intuitively grasp the operations for responding to or rejecting the incoming call. Further, the light emission control unit 307 may cause the light emitting units 201L and 201R to emit light in colors different from each other. For example, the light emission control unit 307 may cause the light emitting unit 201L, corresponding to rejection of the incoming call, to emit light in red, and the light emitting unit 201R, corresponding to responding to the incoming call, to emit light in green.
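The light emission state 1001 can be summarized as a mapping from the four light emitting units to their outputs during an incoming call. A minimal Python sketch, assuming the red/green color choice that the description gives only as an example:

```python
# Illustrative snapshot of light emission state 1001: during an incoming
# call only the left and right short presses are effective, so only units
# 201L and 201R are lit, colored to distinguish reject from answer.
# The color assignment follows the example in the description; the
# function name and dictionary format are assumptions.
def incoming_call_led_state() -> dict:
    return {
        "201L": "red",    # left short press: reject the incoming call
        "201R": "green",  # right short press: respond to the incoming call
        "201U": "off",    # up short press is not effective here
        "201D": "off",    # down short press is not effective here
    }
```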
In the above embodiment, the vertical direction inputs and the horizontal direction inputs may be exchanged. For example, a left direction input may be used instead of an up direction input, a right direction input instead of a down direction input, an up direction input instead of a left direction input, and a down direction input instead of a right direction input. Correspondingly, the position of the area 401 on the screen 400 may be changed; for example, when cooperation is started by a down long press and ended by an up long press, the area 401 may be arranged on the upper side of the screen. Similarly, the left direction inputs and the right direction inputs may be exchanged: a left direction input may be used instead of a right direction input, and vice versa. Correspondingly, the position of the area 401 on the screen 400 may be changed; for example, when cooperation is started by a right long press and ended by a left long press, the area 401 may be arranged on the left side of the screen.
In the above embodiment, the light emission control unit 307 may refrain from causing the light emitting unit 201 corresponding to the effective operation to emit light, depending on the running state of the vehicle 100. The light emission control unit 307 may also change the light emission state of the light emitting unit 201 corresponding to the effective operation based on the running state of the vehicle 100. For example, the light emission control unit 307 may cause the light emitting unit 201 corresponding to the effective operation to emit light while the vehicle 100 is stopped, and turn off or dim that light emitting unit while the vehicle 100 is running. This suppresses the light from the light emitting unit 201 from interfering with the driver's driving. Even when the light emitting unit 201 corresponding to the effective operation is normally turned off or dimmed in this way, it may still be lit as usual (in the same manner as when stopped) for a part of the effective operations (for example, in the light emission state 1001 during the above-described interrupt processing).
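The running-state rule just described can be sketched as a small decision function: a unit for an effective operation lights while stopped, is dimmed (or turned off) while running, and is lit as usual for exempted high-priority states such as the interrupt light emission state 1001. The function name, parameters, and return values are illustrative assumptions.

```python
# Illustrative sketch of the running-state rule for one light emitting
# unit. `effective` - the operation for this unit is currently valid;
# `running` - the vehicle is in a running (not stopped) state;
# `interrupt` - a high-priority state (e.g. light emission state 1001)
# that is exempt from dimming while running.
def led_output(effective: bool, running: bool, interrupt: bool = False) -> str:
    if not effective:
        return "off"          # invalid operations are never lit
    if interrupt:
        return "on"           # incoming call etc. lights as when stopped
    return "dim" if running else "on"
```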
In the above-described embodiment, the control of the light emission state of the light emitting unit 201 was described in the context of the cooperation function between the vehicle 100 and the mobile device 310. The control of the light emission state of the light emitting unit 201 according to the present invention may also be used for a cooperation function between the vehicle 100 and another device, for a function of the vehicle 100 alone, or for a combination thereof.
Summary of the embodiments
< item 1 >
A vehicle (100) in which,
the vehicle (100) is provided with:
a handle bar (104);
an input unit (102) attached to the handle bar;
a plurality of light emitting units (201) provided in correspondence with a plurality of operations performed by the driver of the vehicle with the input unit; and
a control unit (307) for controlling the light-emitting states of the plurality of light-emitting parts,
the control unit causes the light emitting portion corresponding to an effective operation of at least a part of the plurality of operations to emit light,
the control unit turns off the light emitting portion corresponding to an invalid operation among the plurality of operations.
According to this item, the driver can intuitively grasp the operations that can be performed on the input unit of the vehicle.
< item 2 >
The vehicle according to item 1, wherein,
the plurality of light emitting parts can switch a plurality of colors to emit light,
the control unit causes a light emitting portion corresponding to the effective operation to emit light in a color corresponding to a currently selected function of a plurality of functions provided to the driver.
According to this item, the driver can intuitively grasp the content of the operation on the input unit of the vehicle.
< item 3 >
The vehicle according to item 2, wherein,
Colors are assigned to the plurality of functions respectively,
the control unit causes a light emitting portion corresponding to an operation for switching the currently selected function to another function to emit light in a color assigned to the other function.
According to this item, the driver can intuitively grasp the content of the function after the switching.
< item 4 >
The vehicle according to item 2 or 3, wherein,
colors are assigned to the plurality of functions respectively,
the control unit causes a light emitting portion corresponding to an operation for executing the currently selected function to emit light in a color assigned to the currently selected function.
According to this item, the driver can intuitively grasp the content of the currently selected function.
< item 5 >
The vehicle according to any one of items 1 to 4, wherein,
the vehicle is capable of acting in cooperation with the telephone device of the driver,
the control unit causes a light-emitting portion corresponding to an operation for responding to the incoming call and a light-emitting portion corresponding to an operation for rejecting the incoming call to emit light, and causes the other light-emitting portions of the plurality of light-emitting portions to be turned off, while the telephone device receives an incoming call and calls the driver.
According to this item, the driver can intuitively grasp an operation for responding to an incoming call or rejecting the incoming call.
< item 6 >
The vehicle according to item 5, wherein the control unit causes a light-emitting portion corresponding to an operation for ending the call to emit light and causes the other light-emitting portions of the plurality of light-emitting portions to go out while the driver is on a call using the telephone device.
According to this item, the driver can intuitively grasp an operation for ending the call.
< item 7 >
The vehicle according to any one of items 1 to 6, wherein,
the control unit causes the plurality of light emitting portions to blink in a predetermined pattern during a period from accepting an instruction of the driver to returning a response to the instruction.
According to this item, the driver can intuitively grasp that the process is in progress.
< item 8 >
The vehicle according to any one of items 1 to 7, wherein,
the control unit causes the light emitting portion corresponding to the effective operation to emit light in a stopped state of the vehicle,
the control unit turns off or dims the light emitting portion corresponding to the effective operation in a running state of the vehicle.
According to this item, the light from the light emitting portion can be suppressed from interfering with the driver's driving.
< item 9 >
The vehicle according to any one of items 1 to 8, wherein,
the input portion includes a direction input portion (200, 211) for indicating a plurality of directions,
the plurality of operations are performed using the direction input section.
According to this item, the driver can intuitively grasp the operations that can be performed on the direction input unit.
< item 10 >
The vehicle according to any one of items 1 to 9, wherein the vehicle is a two-wheeled vehicle.
According to this item, the driver can intuitively grasp the operations that can be performed on the input unit of the two-wheeled vehicle.
The present invention is not limited to the above-described embodiments, and various modifications and changes can be made within the scope of the present invention.
The present application claims the first priority based on japanese patent application publication No. 2021-106887 filed on 6/28 of 2021, the contents of which are incorporated herein by reference in their entirety.
Description of the reference numerals
100: a vehicle;
102: a left handlebar switch;
201: a light-emitting part.

Claims (10)

1. A vehicle, wherein,
the vehicle is provided with:
a handle bar;
an input unit attached to the handle bar;
a plurality of light emitting units provided in correspondence with a plurality of operations performed by a driver of the vehicle with the input unit; and
A control unit that controls the light emission states of the plurality of light emitting portions,
the control unit causes the light emitting portion corresponding to an effective operation of at least a part of the plurality of operations to emit light,
the control unit turns off the light emitting portion corresponding to an invalid operation among the plurality of operations.
2. The vehicle according to claim 1, wherein,
the plurality of light emitting parts can switch a plurality of colors to emit light,
the control unit causes a light emitting portion corresponding to the effective operation to emit light in a color corresponding to a currently selected function of a plurality of functions provided to the driver.
3. The vehicle according to claim 2, wherein,
colors are assigned to the plurality of functions respectively,
the control unit causes a light emitting portion corresponding to an operation for switching the currently selected function to another function to emit light in a color assigned to the other function.
4. The vehicle according to claim 2 or 3, wherein,
colors are assigned to the plurality of functions respectively,
the control unit causes a light emitting portion corresponding to an operation for executing the currently selected function to emit light in a color assigned to the currently selected function.
5. The vehicle according to any one of claims 1 to 4, wherein,
The vehicle is capable of acting in cooperation with the telephone device of the driver,
the control unit causes a light-emitting portion corresponding to an operation for responding to the incoming call and a light-emitting portion corresponding to an operation for rejecting the incoming call to emit light and causes other light-emitting portions of the plurality of light-emitting portions to be turned off while the telephone device receives the incoming call and calls the driver.
6. The vehicle according to claim 5, wherein the control unit causes a light-emitting portion corresponding to an operation for ending the call to emit light and causes the other light-emitting portions of the plurality of light-emitting portions to go out while the driver is on a call using the telephone device.
7. The vehicle according to any one of claims 1 to 6, wherein the control unit causes the plurality of light emitting portions to blink in a predetermined pattern during a period from accepting an instruction of the driver to returning a response to the instruction.
8. The vehicle according to any one of claims 1 to 7, wherein,
the control unit causes the light emitting portion corresponding to the effective operation to emit light in a stopped state of the vehicle,
the control unit turns off or dims the light emitting portion corresponding to the effective operation in a running state of the vehicle.
9. The vehicle according to any one of claims 1 to 8, wherein,
the input portion includes a direction input portion for indicating a plurality of directions,
the plurality of operations are performed using the direction input section.
10. The vehicle according to any one of claims 1 to 9, wherein the vehicle is a two-wheeled vehicle.
CN202280041571.2A 2021-06-28 2022-03-25 Vehicle Pending CN117460664A (en)

Applications Claiming Priority (3)

Application Number Priority Date Filing Date Title
JP2021106887 2021-06-28
JP2021-106887 2021-06-28
PCT/JP2022/014351 WO2023276348A1 (en) 2021-06-28 2022-03-25 Vehicle

Publications (1)

Publication Number Publication Date
CN117460664A true CN117460664A (en) 2024-01-26

Family

ID=84692632

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202280041571.2A Pending CN117460664A (en) Vehicle

Country Status (2)

Country Link
CN (1) CN117460664A (en)
WO (1) WO2023276348A1 (en)

Family Cites Families (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2001278056A (en) * 2000-04-04 2001-10-10 Yaskawa Electric Corp Mobile carriage
CN102308265B (en) * 2009-02-23 2014-07-30 东洋电装株式会社 Touch controlled input device and electronic devices using same
JP6229419B2 (en) * 2013-10-03 2017-11-15 スズキ株式会社 Handle switch device
JP2017208185A (en) * 2016-05-17 2017-11-24 株式会社東海理化電機製作所 Input device
US11400997B2 (en) * 2016-05-23 2022-08-02 Indian Motorcycle International, LLC Display systems and methods for a recreational vehicle
JP6894195B2 (en) * 2016-06-14 2021-06-30 株式会社シマノ Bicycle display device
JP6864028B2 (en) * 2019-04-24 2021-04-21 本田技研工業株式会社 Handle switch

Also Published As

Publication number Publication date
WO2023276348A1 (en) 2023-01-05
JPWO2023276348A1 (en) 2023-01-05

Similar Documents

Publication Publication Date Title
JP6136627B2 (en) Vehicle information display device
US9608956B2 (en) Method and electronic device for performing exchange of messages
JP4462817B2 (en) Visualization method of character input mode by key backlight multicolor lighting
US20160021155A1 (en) Method and electronic device for performing exchange of messages
KR20050077806A (en) Method for carrying out a speech dialogue and speech dialogue system
WO2018120492A1 (en) Page processing method, mobile terminal, device and computer storage medium
JP2008077411A (en) Emulation device
JP2019156034A (en) Information transmission method for vehicle and information transmission system for motorcycle
US20210188389A1 (en) Motorcycle with a multifunctional device
JP6119456B2 (en) Vehicle information display device
CN117460664A (en) Vehicle with a vehicle body having a vehicle body support
JPWO2005125159A1 (en) Information terminal
EP2925551B1 (en) Information device for a vehicle driver and method to control same
WO2023276293A1 (en) Speech guidance device, speech guidance method, and program
US20220242242A1 (en) Vehicle, mobile device, and control method therefor
CN117546137A (en) Voice guidance device, voice guidance method, and program
CN112849118B (en) Control method of vehicle steering wheel, computer device, storage medium, and vehicle
WO2023276381A1 (en) Vehicle
JP2006243917A (en) Operation terminal, operation guidance method, and program
JP2009118384A (en) Mobile device, input support method, and input support program
CN108549514B (en) Operating system, operating method of operating system and vehicle with operating system
JPWO2023276348A5 (en) vehicle
KR20060057422A (en) Mobile communication device having function for manu setting by color transformation and method thereof
CN115123367A (en) Input device and recording medium
JP2018103935A (en) Out-vehicle device control method, out-vehicle control device, and vehicle out-device control program

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination