Detailed Description
The present invention will be described below with reference to embodiments thereof, but the following embodiments do not limit the claimed invention. Not all combinations of the features described in the embodiments are essential to the means for solving the problem addressed by the invention. In the drawings, the same or similar parts may be assigned the same reference numerals, and overlapping description is omitted.
[ overview of vehicle 100 ]
Fig. 1 schematically shows an example of a system configuration of a vehicle 100. In the present embodiment, the vehicle 100 is provided with an interior space 120 therein. In the present embodiment, the vehicle 100 includes a drive system 130, a sensor system 140, an input/output system 150, an environment adjustment system 160, and a control system 170.
The vehicle 100 carries one or more persons or objects and moves. The vehicle 100 may be moved by an operation of a driver riding on the vehicle 100, may be moved by a remote operation, or may be moved by automatic driving.
The vehicle 100 may be an automobile, a motorcycle, an electric locomotive, or the like. Examples of the automobile include an engine vehicle, an electric vehicle, a fuel cell vehicle, a hybrid vehicle, and a construction vehicle. Examples of the motorcycle include (i) a two-wheeled motorbike, (ii) a three-wheeled motorbike, and (iii) a standing-type two-wheeled or three-wheeled vehicle having a power unit.
In the present embodiment, the vehicle 100 manages the environment of the vehicle interior space 120. For example, the vehicle 100 independently manages the respective environments of a plurality of regions (each region is sometimes referred to as a subspace) inside the vehicle interior space 120. The environment may be a visual state (sometimes referred to as a visual field environment), an acoustic state (sometimes referred to as an acoustic environment), an air state (sometimes referred to as an air environment), or the like. This makes it possible to promote or suppress communication between a plurality of occupants sharing the vehicle interior space 120.
For example, when the vehicle 100 transports multiple riders, one rider may wish to promote communication with other riders sharing the in-vehicle space, or may wish to inhibit communication with other riders. In particular, one rider may or may not wish to share at least one of the visual experience, the auditory experience, and the olfactory experience with other riders.
More specifically, when a team consisting of a plurality of persons travels using the vehicle 100, facilitating communication between the plurality of riders during the course of the travel can create a sense of unity among the riders. On the other hand, on the return trip, some riders may wish to rest undisturbed, which calls for suppressing communication with other riders. In addition, when the vehicle 100 is, for example, a bus or a shared taxi, there is a demand that one's conversation with an acquaintance not be overheard by others. There is also a desire to eat without regard to the surroundings.
According to the vehicle 100 of the present embodiment, the environment of the space in which one rider is present and the environment of the space in which another rider is present are managed independently. Specifically, at least one of (i) the visual field range or visual field quality of each rider, (ii) the propagation range or volume of the sound produced by each rider, and (iii) the diffusion range, concentration, or perceived intensity of an odorant generated by each rider is adjusted. Thereby, the degree to which at least one of the visual experience, the auditory experience, and the olfactory experience is shared between one rider and another rider is adjusted.
The diffusion range of an odorant is defined as, for example, a region in which the concentration of the odorant is greater than a predetermined threshold. The odorant may be a gaseous chemical substance. The odorant may be a substance inducing an unpleasant sensation, such as a malodor or a pungent odor, or a substance inducing a pleasant fragrance.
[ overview of each unit of vehicle 100 ]
In the present embodiment, the vehicle interior space 120 is disposed inside the vehicle 100. The vehicle interior space 120 may be a space that can be commonly used by multiple riders. Details of the vehicle interior space 120 will be described later.
In the present embodiment, the drive system 130 drives the vehicle 100. For example, the drive system 130 drives the vehicle 100 based on instructions of the control system 170. In one embodiment, the drive system 130 drives the vehicle 100 based on an operation of a driver riding the vehicle 100. In another embodiment, the drive system 130 has a remote operation function or an automatic driving function.
In the present embodiment, the sensor system 140 includes various sensors. The sensor system 140 may send the output of each sensor to the control system 170.
The sensor system 140 may be provided with a sensor for detecting an external state of the vehicle 100. The sensor system 140 may be provided with a sensor for detecting the state of the vehicle interior space 120. The sensor system 140 may be provided with a sensor for detecting the state of each of one or more subspaces provided inside the vehicle interior space 120. Examples of the states detected by the various sensors include (i) at least one of the temperature, humidity, and cleanliness of the air, (ii) illuminance, (iii) sound volume, and (iv) the concentration of a specific odorant.
The sensor system 140 may be provided with sensors for collecting information for estimating the own position of the vehicle 100. The sensor may be a GPS signal receiver, an acceleration sensor, a gyro sensor, an orientation sensor, or a rotary encoder.
In the present embodiment, the input/output system 150 receives an input from a passenger of the vehicle 100. The input/output system 150 can transmit the received information to the control system 170. The input-output system 150 outputs information to the occupants of the vehicle 100. The input-output system 150 may output information based on instructions of the control system 170.
The input-output system 150 may have a photographing device for photographing the external state of the vehicle 100. The input/output system 150 may have a camera for capturing the state of the vehicle interior space 120. The input-output system 150 may transmit image data of an image captured by the capturing device to the control system 170.
The input-output system 150 may have a sound collection device for collecting sound outside the vehicle 100. The input-output system 150 may have a sound collection device for collecting sound in the vehicle interior space 120. The input-output system 150 may transmit sound data of the sound collected by the sound collection device to the control system 170. Details of the input-output system 150 will be described later.
In the present embodiment, the environment adjustment system 160 adjusts the environment of the in-vehicle space 120. The environment adjustment system 160 adjusts the environment of the vehicle interior space 120, for example, by acting on at least one of the sight, hearing, and smell of the passenger present inside the vehicle interior space 120. The environment adjustment system 160 can adjust the environment of the vehicle interior space 120 by independently acting on at least one of the sight, hearing, and smell of each of a plurality of occupants present inside the vehicle interior space 120. Details of the environment adjustment system 160 will be described later.
In the present embodiment, the control system 170 controls each unit of the vehicle 100. In one embodiment, control system 170 controls the action of drive system 130. In another embodiment, control system 170 controls the actions of environmental conditioning system 160. For example, the control system 170 determines a control mode as a control target of the environment adjustment system 160. Furthermore, the control system 170 controls the operation of the environmental adjustment system 160 based on the determined control mode.
The control system 170 may independently control the environment of each of a plurality of subspaces provided inside the vehicle interior space 120. For example, the control system 170 determines whether to activate or deactivate an independent control mode in which the environment of each of the plurality of subspaces disposed inside the vehicle interior space 120 is adjusted independently. When activation of the independent control mode is determined, the control system 170 first determines, for example, a control mode to be applied to each subspace. The control system 170 then controls the environment adjustment system 160 so that the environment of each subspace is adjusted based on the control target indicated by the control mode applied to that subspace.
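By way of illustration only, the following Python sketch outlines one possible way to organize such per-subspace control; the class and method names (Subspace, ControlSystem, apply_control_mode) are hypothetical and are not part of the configuration described above.

```python
from dataclasses import dataclass, field

@dataclass
class Subspace:
    """A region of the vehicle interior space whose environment is adjusted independently."""
    name: str
    control_mode: str = "default"  # e.g. "promotion", "suppression", or "default"

@dataclass
class ControlSystem:
    """Minimal sketch of a controller that manages per-subspace control modes."""
    subspaces: dict = field(default_factory=dict)
    independent_mode_active: bool = False

    def activate_independent_mode(self, modes: dict) -> None:
        # Control mode decision: record the target mode for each subspace.
        self.independent_mode_active = True
        for name, mode in modes.items():
            self.subspaces[name].control_mode = mode
        # Environment control: drive the adjustment system toward each target.
        for sub in self.subspaces.values():
            self.apply_control_mode(sub)

    def deactivate_independent_mode(self) -> None:
        # Ending the independent control mode also ends the promotion/suppression effects.
        self.independent_mode_active = False
        for sub in self.subspaces.values():
            sub.control_mode = "default"
            self.apply_control_mode(sub)

    def apply_control_mode(self, sub: Subspace) -> None:
        # Placeholder for commands sent to the environment adjustment system.
        print(f"adjusting {sub.name} toward a '{sub.control_mode}' environment")

# Example: two subspaces controlled independently.
cs = ControlSystem(subspaces={"220": Subspace("220"), "240": Subspace("240")})
cs.activate_independent_mode({"220": "promotion", "240": "suppression"})
```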
In one embodiment, the control system 170 adjusts the environment of each subspace such that the environment of one subspace is the same as or similar to the environment of the other subspace. Thus, various experiences may be shared between one rider present in one subspace and other riders present in other subspaces. In addition, for example, communication between one rider present in one subspace and other riders present in other subspaces may be facilitated.
In another embodiment, the control system 170 adjusts the environment of each subspace such that the environment of one subspace is different from the environment of the other subspace. Thus, various experiences may not be shared between one rider present in one subspace and other riders present in other subspaces. Further, it is possible to suppress communication between one rider present in one subspace and other riders present in other subspaces.
On the other hand, when the invalidation of the independent control mode is determined, the control system 170 terminates the independent control mode. This also terminates the effect of promoting communication or the effect of suppressing communication. Details of the control system 170 are described later.
As described above, the control system 170 determines the control mode as the control target of the environment adjustment system 160. Here, the environment adjustment system 160 may have a plurality of types of independent control modes. In this case, the control system 170 may determine one of the plurality of independent control modes as the control mode that is the control target of the environment adjustment system 160. As the plurality of independent control modes, (a) a suppression mode for adjusting the environment of the vehicle interior space 120 to suppress communication between one rider and the other rider, and (b) a promotion mode for adjusting the environment of the vehicle interior space 120 to promote communication between one rider and the other rider, are exemplified.
The plurality of independent control modes may include a plurality of promotion modes that promote communication between one rider and the other rider to different degrees. As the plurality of promotion modes, (i) a first promotion mode for assisting in establishing communication between one rider and the other rider, and (ii) a second promotion mode for assisting in establishing communication between one rider and the other rider more powerfully than the first promotion mode, and the like are exemplified. The second facilitation mode may also be a mode for forcibly establishing communication between one rider and the other rider.
For example, when the control system 170 controls the operation of the environment adjustment system 160 to control the auditory experience of one rider and another rider, in the first facilitation mode the volume of the conversation between the riders is amplified so that it is more easily transmitted, or the environmental sound that constitutes noise for the conversation is reduced so that the speaker's voice is transmitted clearly. In the second facilitation mode, at least one of the position and orientation of the seat of each of the two occupants is adjusted so that the one occupant and the other occupant face each other, or so that the distance between the one occupant and the other occupant becomes shorter.
The plurality of independent control modes may include a plurality of suppression modes that suppress communication between one rider and another rider to different degrees. Examples of the plurality of suppression modes include (i) a first suppression mode for reducing communication between one occupant and the other occupant, and (ii) a second suppression mode for reducing communication between one occupant and the other occupant more strongly than the first suppression mode. The second suppression mode may also be a mode for blocking communication between one rider and another rider.
For example, when the control system 170 controls the operation of the environment adjustment system 160 to control the auditory experience of one rider and another rider, in the first suppression mode the volume at which the conversation between the riders is transmitted is reduced, or the ambient sound that constitutes noise for the conversation is amplified so as to hinder transmission of the speaker's voice. In the second suppression mode, at least one of the position and orientation of the seat of each of the two occupants is adjusted so that the one occupant and the other occupant face away from each other, or so that the distance between the one occupant and the other occupant becomes longer.
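The relationship between the control modes and the example adjustments described above can be summarized in a small lookup structure. The following Python sketch is illustrative only; the mode labels and the numerical gain values are assumptions, not values taken from the present embodiment.

```python
from enum import Enum

class ControlMode(Enum):
    """Hypothetical labels for the independent control modes described above."""
    PROMOTION_1 = "assist communication"
    PROMOTION_2 = "strongly assist or force communication"
    SUPPRESSION_1 = "reduce communication"
    SUPPRESSION_2 = "strongly reduce or block communication"

# One possible mapping from control mode to example adjustments (values are illustrative only).
ADJUSTMENTS = {
    ControlMode.PROMOTION_1:   {"conversation_gain_db": +6, "cancel_ambient_noise": True,  "seats": "unchanged"},
    ControlMode.PROMOTION_2:   {"conversation_gain_db": +6, "cancel_ambient_noise": True,  "seats": "face each other, move closer"},
    ControlMode.SUPPRESSION_1: {"conversation_gain_db": -6, "cancel_ambient_noise": False, "seats": "unchanged"},
    ControlMode.SUPPRESSION_2: {"conversation_gain_db": -6, "cancel_ambient_noise": False, "seats": "face away, move apart"},
}

print(ADJUSTMENTS[ControlMode.PROMOTION_2])
```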
The vehicle 100 may be an example of a mobile body. The vehicle 100 may be an example of a space management system. Each of the plurality of riders may be an example of a first rider or a second rider. One rider may be an example of one of the first rider or the second rider. The other rider may be an example of the other of the first rider or the second rider. The in-vehicle space 120 may be an example of a shared space. One subspace may be an example of one of the first subspace or the second subspace. The other subspace may be an example of another subspace of the first subspace or the second subspace. The driving system 130 may be an example of a driving part. The input/output system 150 may be an example of an instruction accepting unit. The environmental conditioning system 160 may be an example of a conditioning section. The control system 170 may be an example of a space management system. The control system 170 may be an example of a control pattern decision section and an environment control section.
In the present embodiment, the mobile body is described in detail by taking the vehicle 100 as an example; however, the mobile body is not limited to a vehicle. Other examples of the mobile body include a ship and an aircraft. Examples of the vessel include a boat, a hovercraft, a personal watercraft, a submarine, and an underwater scooter. The aircraft may be an airplane, an airship, a hot air balloon, a helicopter, an unmanned aerial vehicle, or the like.
[ concrete configuration of each unit of vehicle 100 ]
In the present embodiment, each unit of the vehicle 100 may be realized by hardware, by software, or by a combination of hardware and software. For example, in the present embodiment, at least a part of the control system 170 is implemented by a computer mounted on the vehicle 100. In addition, at least a part of the control system 170 may be implemented by a single server or by a plurality of servers. At least a part of the elements of the control system 170 may be implemented on a virtual server or a cloud system. At least a part of the units of the control system 170 may be implemented by a personal computer or a mobile terminal. Examples of the mobile terminal include a mobile phone, a smartphone, a PDA, a tablet computer, a notebook or laptop computer, and a wearable computer. The units of the control system 170 may also store information using a distributed ledger technology, such as a blockchain, or a distributed network.
When at least a part of the components constituting the vehicle 100 is realized by software, the components realized by the software can be realized by starting a program that defines operations related to the components in an information processing device having a general configuration. The information processing apparatus having the above general configuration may include: (i) a data processing device having a processor such as a CPU or GPU, a ROM, a RAM, a communication interface, and the like; (ii) input devices such as a keyboard, a pointing device, a touch panel, a camera, a voice input device, a gesture input device, various sensors, and a GPS receiver; (iii) output devices such as a display device, a sound output device, and a vibration device; and (iv) storage devices (including external storage devices) such as memories, HDDs, SSDs, and the like.
In the information processing apparatus having the above general configuration, the data processing apparatus or the storage apparatus may store the program. The program is executed by a processor, and the information processing apparatus is caused to execute an operation defined by the program. The above-described program may also be stored in a non-transitory computer-readable recording medium. The program may be stored in a computer-readable medium such as a CD-ROM, a DVD-ROM, a memory, a hard disk, or a storage device connected to a network.
The program may be a program for causing a computer to function as the vehicle 100 or a part thereof. The program may include a module that defines the operation of each unit of the vehicle 100. These programs and modules act on a data processing device, an input device, an output device, a storage device, and the like, and cause a computer to function as each unit of the vehicle 100, or cause a computer to execute an information processing method in each unit of the vehicle 100. The above program may also be installed into a computer constituting at least a part of the vehicle 100 from a computer-readable medium or a storage device connected to a network. By executing the program, the computer can function as at least a part of each unit of the vehicle 100. The information processing described in the above-described program functions as a specific means by which software related to the program cooperates with various hardware resources of the vehicle 100 or a part thereof by the program being read by a computer. The vehicle 100 according to the present embodiment is configured to calculate or process information corresponding to the intended use of the computer by the above-described specific means.
The program may be a program for causing a computer to function as the control system 170. The program may be a program for causing a computer to execute the information processing method in the control system 170.
The above-described information processing method may be a space management method of managing the environment of a shared space. The shared space may be provided inside the mobile body and may be a space usable by both a first occupant and a second occupant. The space management method may include a control mode decision step of deciding a control mode that is the control target of an adjusting section for adjusting the environment of the shared space. The control mode decision step may include a step of deciding whether to validate or invalidate an independent control mode in which the environment of a first subspace, which is a part of the shared space and in which the first occupant is present, and the environment of a second subspace, which is a part of the shared space and in which the second occupant is present, are adjusted independently. The space management method may include an environment control step of controlling the operation of the adjusting section based on the control mode decided in the control mode decision step.
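A minimal procedural sketch of the space management method, written in Python for illustration, is shown below. The function names and the mapping from an event to the decided modes are hypothetical; the sketch merely shows the two steps (the control mode decision step and the environment control step) executed in sequence.

```python
def space_management_method(event: str, first_subspace: str, second_subspace: str, adjuster) -> None:
    """Minimal sketch of the space management method: a control mode decision step
    followed by an environment control step (names and mapping are illustrative)."""
    # Control mode decision step: validate or invalidate the independent control mode.
    if event == "request_independent_control":
        modes = {first_subspace: "promotion", second_subspace: "suppression"}
    else:
        modes = {first_subspace: "default", second_subspace: "default"}
    # Environment control step: operate the adjusting section based on the decided modes.
    for subspace, mode in modes.items():
        adjuster(subspace, mode)

space_management_method(
    "request_independent_control", "first subspace", "second subspace",
    adjuster=lambda subspace, mode: print(f"{subspace}: adjust toward '{mode}'"),
)
```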
The vehicle interior space 120 will be described in detail with reference to Figs. 2, 3, and 4. Fig. 2 schematically shows an example of a side view of the vehicle interior space 120. Fig. 3 schematically shows an example of a plan view of the vehicle interior space 120. Fig. 2 may be an example of the A-A' cross section in Fig. 3. Fig. 4 schematically shows another example of a plan view of the vehicle interior space 120.
As shown in Figs. 2 and 3, the seat 212, the seat 214, the seat 312, and the seat 314 are disposed inside the housing 210 of the vehicle 100. In the example shown in Fig. 2, the rider 20 is seated on the seat 212 and the rider 40 is seated on the seat 214. The seat 312 may be the driver seat of the vehicle 100. Note that the driver seat is not limited to the seat 312. Further, when the vehicle 100 moves by fully automatic driving, the vehicle 100 need not be provided with a driver seat.
The rider 20 inputs information to the control system 170 or receives information from the control system 170 via the input-output system 150. The rider 20 may also input information to the control system 170 or receive information from the control system 170 via the communication terminal 22.
The communication terminal 22 is not particularly limited in detail as long as it is an information processing device capable of transmitting and receiving information with the control system 170. The communication terminal 22 is exemplified by a personal computer, a mobile terminal, and the like. As mobile terminals, mobile phones, smart phones, PDAs, tablet computers, notebook or laptop computers, wearable computers, and the like are exemplified.
The rider 40 inputs information to the control system 170 or receives information from the control system 170 via the input-output system 150. The rider 40 may also input information to the control system 170 or receive information from the control system 170 via the communication terminal 42.
The communication terminal 42 is not particularly limited in detail as long as it is an information processing device capable of transmitting and receiving information with the control system 170. The communication terminal 42 is exemplified by a personal computer, a mobile terminal, and the like. As mobile terminals, mobile phones, smart phones, PDAs, tablet computers, notebook or laptop computers, wearable computers, and the like are exemplified.
As shown in fig. 3, a subspace 220, a subspace 240, a subspace 320, and a subspace 340 are provided inside the vehicle interior space 120. The subspace 220 may be part of the interior space 120 and is the region in which the seat 212 or the passenger 20 using the seat 212 is present. The subspace 240 may be a portion of the in-vehicle space 120 and is the region in which the seat 214 or the rider 40 using the seat 214 is present. The subspace 320 may be a portion of the in-vehicle space 120 and is an area where the seat 312 or a passenger using the seat 312 is present. The subspace 340 may be a portion of the in-vehicle space 120 and is an area where the seat 314 or a rider using the seat 314 is present.
As shown in Fig. 2, a subspace 222 and a subspace 242 are provided inside the vehicle interior space 120. The subspace 222 may be a part of the subspace 220 and is the region in which the head of the rider 20 using the seat 212 is located. The subspace 222 may be larger in size than the head of the rider 20. The subspace 242 may be a part of the subspace 240 and is the region in which the head of the rider 40 using the seat 214 is located. The subspace 242 may be larger in size than the head of the rider 40.
Further, similarly to the case of the subspaces 220 and 240, a subspace may be provided in the vicinity of the region where the head of the occupant who uses the seat 312 is located inside the subspace 320. Further, a subspace may be provided in the vicinity of the region where the head of the occupant who uses the seat 314 is located inside the subspace 340.
As described above, in the present embodiment, when the independent control mode is activated, the internal environment of the space is adjusted for each subspace. By providing the subspace inside the in-vehicle space 120 as described above, the vehicle 100 can adjust the in-vehicle environment for each rider.
As shown in Figs. 2 and 3, the vehicle interior space 120 may be a space (sometimes referred to as a shared space) that can be used in common by a plurality of riders. Inside the vehicle interior space 120, there is no space (sometimes referred to as an isolation space) that is surrounded on four sides by partitions or walls and used exclusively by some of the occupants of the vehicle 100.
Regarding the shared space, the vehicle interior space 120 may be defined as the space other than the isolation space among the spaces inside the housing 210 in which occupants of the vehicle 100 can stay. In addition, the above description does not exclude an embodiment in which an isolation space is provided inside the housing 210.
Fig. 4 shows an example of an embodiment in which an isolation space is disposed inside the housing 210. Fig. 4 shows an example of an embodiment in which the shared space and the isolation space are arranged adjacently inside the vehicle 100. As shown in Fig. 4, the vehicle interior space 120 is divided into a first space 460 and a second space 480 by the partition 412 and the door 414. Each of the first space 460 and the second space 480 is enclosed by the housing 210, the partition 412, and the door 414.
In the present embodiment, a plurality of seats that can be used in common by a plurality of occupants are disposed in the first space 460. On the other hand, the second space 480 may be used exclusively by some of the plurality of occupants for at least a predetermined period. As shown in Fig. 4, although an isolation space is provided inside the housing 210, there is no isolation space inside the first space 460. In addition, similarly to the vehicle interior space 120 in the embodiment described with reference to Figs. 2 and 3, one or more subspaces may be provided inside the first space 460. The vehicle 100 may adjust the environment inside the first space 460 for each subspace disposed inside the first space 460.
The rider 20 may be an example of one of a first rider or a second rider. The rider 40 may be an example of another rider of the first rider or the second rider. The subspace 220 may be an example of one of the first subspace or the second subspace. The subspace 240 may be an example of another subspace of the first subspace or the second subspace. Similarly, the rider using the seat 312 may be an example of a first rider or a second rider. The rider using the seat 314 may be an example of a first rider or a second rider. Subspace 320 may be an example of a first subspace or a second subspace. The first space 460 may be an example of a shared space. The second space 480 may be an example of an isolation space.
Fig. 5 schematically shows an example of the internal configuration of the input-output system 150. In the present embodiment, the input/output system 150 includes an input unit 512, an output unit 514, and a communication unit 516. In the present embodiment, the input unit 512 includes one or more exterior cameras 522, one or more exterior microphones 524, one or more interior cameras 526, and one or more interior microphones 528. The input unit 512 includes one or more switches 532, one or more touch panels 534, one or more voice input units 536, and one or more gesture input units 538. In the present embodiment, the output unit 514 includes one or more speakers 542 and one or more displays 544.
In the present embodiment, the input unit 512 receives an input from at least one of a plurality of occupants. The input may be, for example, an instruction from at least one of the occupant 20 and the occupant 40 regarding activation, deactivation, or switching of the independent control mode. The input may also be an instruction from the driver of the vehicle 100 for transmitting a message of the driver to the other occupants.
The input portion 512 may acquire information indicating the external condition of the vehicle 100. The input unit 512 may acquire information indicating the internal state of the vehicle interior space 120. The input section 512 may transmit the input information to the control system 170.
In the present embodiment, the output unit 514 outputs information to the passenger of the vehicle 100. The output section 514 may output information based on an instruction of the control system 170.
In the present embodiment, the communication unit 516 transmits and receives information to and from an external information processing apparatus via a communication network. For example, the communication portion 516 transmits and receives information with a communication terminal (e.g., the communication terminal 22 or the communication terminal 42) of a passenger of the vehicle 100. The communication unit 516 may receive information input to the communication terminal by at least one of the plurality of occupants. The communication part 516 may transmit the received information to the control system 170.
The communication network may be a wired communication transmission line, a wireless communication transmission line, or a combination of the two. The communication network may include a wireless packet communication network, the Internet, a P2P network, a private line, a VPN, a power line communication line, and the like. The communication network may include a mobile communication network such as a mobile phone network. The communication network may include a wireless data communication network such as a wireless MAN (e.g., WiMAX (registered trademark)), a wireless LAN (e.g., WiFi (registered trademark)), Bluetooth (registered trademark), Zigbee (registered trademark), or NFC (Near Field Communication). The communication network may include a communication line for V2X, such as vehicle-to-vehicle communication and road-to-vehicle communication.
In the present embodiment, the vehicle exterior camera 522 images the situation outside the vehicle 100. Thereby, the external information of the vehicle 100 is acquired. In the present embodiment, the exterior microphone 524 collects sound outside the vehicle 100. Thereby, the external information of the vehicle 100 is acquired.
In the present embodiment, the in-vehicle camera 526 captures an image of the state of the interior of the vehicle interior space 120. Thereby, information indicating gestures of one or more riders is acquired. Each of the plurality of in-vehicle cameras 526 may image the state of a respective one of the plurality of subspaces, or a single in-vehicle camera 526 may image the states of a plurality of subspaces.
In the present embodiment, the in-vehicle microphone 528 collects sound inside the in-vehicle space 120. Thereby, information indicating the voice of one or more occupants is acquired. Examples of the information indicating the voice include information indicating the content of the voice, information indicating the volume of the voice, information indicating a change in the volume of the voice, and information indicating the speaking interval of the passenger.
Each of the plurality of in-vehicle microphones 528 may collect sound of a respective one of the plurality of subspaces, or a single in-vehicle microphone 528 may collect sound of a plurality of subspaces. The in-vehicle microphone 528 may also be a directional microphone.
In the present embodiment, the switch 532 receives instructions from one or more occupants. For example, each of the one or more switches 532 corresponds to a particular action with respect to the vehicle 100. Each of the one or more switches 532 may correspond to a particular rider.
In the present embodiment, the touch panel 534 receives instructions from one or more occupants. For example, a particular region of the touch panel 534 corresponds to a particular action with respect to the vehicle 100. Each of the one or more touch panels 534 may correspond to a particular rider.
In the present embodiment, the audio input unit 536 analyzes audio data acquired by the in-vehicle microphone 528 and receives instructions from one or more occupants. In the present embodiment, the gesture input unit 538 analyzes image data acquired by the in-vehicle camera 526 and receives an instruction from one or more occupants.
In the present embodiment, the speaker 542 outputs audio information to each of one or more occupants. Each of the one or more speakers 542 may correspond to a particular rider. Speaker 542 may have directivity.
In the present embodiment, the display 544 outputs images to each of one or more occupants. The image may be a moving image or a still image. The image may be an enlarged image or a reduced image. Each of the one or more displays 544 may correspond to a particular rider.
The input unit 512 and each unit of the input unit 512 may be examples of an instruction accepting unit. The output unit 514 and each unit of the output unit 514 may be examples of an adjusting section. The communication unit 516 may be an example of an instruction accepting unit. The exterior camera 522 may be an example of an external information acquisition section and an imaging device. The exterior microphone 524 may be an example of an external information acquisition section and a sound collection device. The in-vehicle camera 526 may be an example of a passenger information acquisition section and an imaging device. The in-vehicle microphone 528 may be an example of a passenger information acquisition section and a sound collection device.
Fig. 6 schematically shows an example of the internal configuration of the environment adjustment system 160. In the present embodiment, the environment adjustment system 160 includes an air conditioning unit 620, a light adjustment unit 630, a seat adjustment unit 640, and a running sound adjustment unit 650. In the present embodiment, the air conditioning unit 620 includes an air supply unit 622, an exhaust unit 624, and an air purification unit 626. In the present embodiment, the light adjustment unit 630 includes an illumination unit 632 and an external light adjustment unit 634. The units of the environment adjustment system 160 may be examples of the adjusting section.
In the present embodiment, the air conditioning unit 620 adjusts the air environment inside the vehicle interior space 120. The air conditioning unit 620 may adjust the air environment of each of a plurality of subspaces disposed inside the vehicle interior space 120. The air conditioning unit 620 may be an air conditioner, a window, or the like. The operation of each unit of the air conditioning unit 620 will be described in detail later.
In the present embodiment, the air supply unit 622 supplies air to the interior of the vehicle interior space 120. The air supply unit 622 can supply air treated by the air purification unit 626 to the interior of the vehicle interior space 120.
In the present embodiment, exhaust unit 624 exhausts the air inside vehicle interior space 120 to the outside of vehicle 100. The exhaust unit 624 may supply air inside the vehicle interior space 120 to the air purification unit 626.
In the present embodiment, the air purification unit 626 purifies the supplied air. For example, the air purification unit 626 purifies air supplied from the exhaust unit 624. The air purification unit 626 may supply the purified air to the air supply unit 622.
In the present embodiment, the light adjustment unit 630 adjusts the lighting environment of the vehicle interior space 120. The light adjustment unit 630 may adjust the lighting environment of each of a plurality of subspaces provided inside the vehicle interior space 120. The operation of each unit of the light adjustment unit 630 will be described in detail later.
In the present embodiment, the illumination unit 632 irradiates the interior of the vehicle interior space 120 with light. The illumination unit 632 may include one or more lamps. At least one of the one or more lamps may be a spotlight.
In the present embodiment, the external light adjustment unit 634 adjusts the amount of light entering the interior of the vehicle interior space 120 from the outside of the vehicle 100. The external light adjustment unit 634 may adjust the amount of visible light incident on the interior of the vehicle interior space 120 from the outside of the vehicle 100. Examples of the external light adjustment unit 634 include a light control glass and a movable light shielding member. Examples of the light control system of the light control glass include a liquid crystal system and an electrochromic system. Examples of the light shielding member include a curtain and a blind.
In the present embodiment, the seat adjustment unit 640 adjusts the seats of the vehicle 100. For example, the seat adjustment unit 640 adjusts at least one of the position and the posture of each seat. The posture of the seat may be, for example, the orientation of the seat, the inclination angle of the seat, or the raised or lowered state of the armrest. The seat adjustment unit 640 may control the operation of the movable portion of each seat. Details of the operation of the seat adjustment unit 640 will be described later.
The running sound adjustment unit 650 adjusts the transmission of running sound into the vehicle interior space 120. For example, the running sound adjustment unit 650 controls the operation of a noise cancellation device that generates a cancellation sound for canceling engine sound, motor sound, road noise, and the like. The running sound adjustment unit 650 may also control the operation of a vibration suppression device that suppresses transmission of tire vibration to the vehicle body. Details of the operation of the running sound adjustment unit 650 will be described later.
Fig. 7 schematically shows an example of the internal configuration of the control system 170. In the present embodiment, the control system 170 includes an operation management unit 720, a transition event detection unit 732, a mode determination unit 734, and an in-vehicle environment control unit 740. In the present embodiment, the in-vehicle environment control unit 740 includes a visual field environment control unit 742, a sound environment control unit 744, and an air environment control unit 746.
In the present embodiment, the operation management unit 720 manages the operation of the vehicle 100. For example, the operation management section 720 controls the drive system 130 to move the vehicle 100. The operation management section 720 may acquire information indicating the state of the drive system 130 from the drive system 130. The operation management portion 720 may acquire information indicating the current position of the vehicle 100 from the sensor system 140. The operation management portion 720 may acquire information indicating the destination of the vehicle 100 from the input-output system 150. The operation management unit 720 may transmit the above-described various information to the transition event detection unit 732.
In the present embodiment, the transition event detecting unit 732 detects an event (sometimes referred to as a transition event) related to a transition of the control mode of the environment adjustment system 160. Examples of the transition of the control mode include activation of the independent control mode, deactivation of the independent control mode, and switching between a plurality of independent control modes. The transition event detecting portion 732 may detect a transition event based on information acquired by each unit of the vehicle 100.
In one embodiment, when the input/output system 150 receives an instruction for activating, deactivating, or switching the independent control mode from at least one of the plurality of occupants, the transition event detection unit 732 acquires, from the input/output system 150, information indicating that the instruction has been input. The transition event detection unit 732 may also acquire information indicating the content of the instruction. The information indicating the content of the instruction may be an example of the information indicating that the instruction has been input. When the transition event detection unit 732 acquires the information indicating that the instruction has been input, the transition event detection unit 732 detects the occurrence of a transition event and transmits the information indicating the content of the instruction to the mode determination unit 734. The input of the above-described instruction may be an example of a transition event.
In another embodiment, the transition event detection unit 732 acquires, from the input/output system 150, information representing at least one of a gesture and a sound of at least one of the plurality of riders. The gesture includes a movement of a part of the body, a facial expression, and the like. Examples of the part of the body include the hands, feet, face, head, eyes, and mouth. The transition event detection unit 732 detects the occurrence of a predetermined transition event (sometimes referred to as a first event) based on the acquired information. For example, the transition event detection unit 732 analyzes at least one of the gesture and the sound, and detects the occurrence of the first event when a predetermined pattern is detected.
The transition event detection unit 732 may transmit information indicating that the first event has occurred to the mode determination unit 734. The transition event detection unit 732 may transmit information indicating the content or type of the first event to the mode determination unit 734.
Examples of the first event relating to validation of the promotion mode include (i) detection of a motion in which one occupant turns his or her eyes, face, or body toward another occupant, (ii) detection of a motion in which one occupant speaks to another occupant, (iii) detection of a motion in which one occupant touches another occupant, (iv) detection of a motion in which one occupant listens to another occupant, and (v) detection of an expression indicating that one occupant has difficulty understanding the speech of another occupant. The motion of one occupant speaking to another occupant may be a motion of attempting to speak to the other occupant. The motion of one occupant touching another occupant may be a motion of attempting to touch the other occupant.
Examples of the motion of attempting to speak to another occupant or to touch another occupant include (i) a motion of turning a hand or foot toward, or extending it toward, another occupant, (ii) a motion of producing a specific facial expression, (iii) a motion of lifting the body away from the seat surface or the seat back, and (iv) a motion of raising an armrest disposed on the seat. Examples of the motion of listening to another occupant include a motion of concentrating on the speech and a motion of giving back-channel responses such as nodding or interjecting.
Examples of the first event relating to validation of the promotion mode also include (i) a case where the volume of the speech of one occupant is greater than a predetermined value, (ii) a case where the content of the speech of one occupant includes the name or title of another occupant, and (iii) a case where the intonation pattern of the speech of one occupant matches or is similar to a predetermined pattern. Examples of the predetermined pattern relating to intonation include a pattern appearing when a person calls out to another person and a pattern appearing when a person is excited or has a positive emotion.
When the speech content of one rider includes the name or title of another rider, the promotion mode can be selected as the control mode for the subspace where the other rider exists, regardless of the state of the current control mode for the subspace where the other rider exists. That is, the promotion mode of one rider may override the suppression mode of the other rider.
When the name or title of another occupant is included in the speech content of one occupant and the number of occurrences or frequency of occurrences of the name or title is greater than a predetermined value, the promotion mode can be selected as the control mode for the subspace where the other occupant exists, regardless of the state of the current control mode for the subspace where the other occupant exists. When the speech content of one rider includes the name or title of the other rider and the volume of speech is larger than a predetermined value, the promotion mode can be selected as the control mode for the subspace where the other rider exists, regardless of the state of the current control mode for the subspace where the other rider exists.
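For illustration, the override behavior described above can be expressed as a simple predicate. In the following Python sketch the function name and the thresholds for the number of name occurrences and the speech volume are assumptions chosen only to make the example concrete.

```python
def promotion_override(speech_text: str, volume_db: float, other_rider_names: list,
                       name_count_threshold: int = 2, volume_threshold_db: float = 70.0) -> bool:
    """Return True when the promotion mode should be selected for the other rider's subspace
    regardless of its current control mode (thresholds are illustrative assumptions)."""
    mentions = sum(speech_text.count(name) for name in other_rider_names)
    if mentions == 0:
        return False
    # Override when the name appears often enough or is spoken loudly enough.
    return mentions >= name_count_threshold or volume_db > volume_threshold_db

print(promotion_override("Hey Bob, Bob, look at this!", volume_db=62.0, other_rider_names=["Bob"]))  # True
```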
Examples of the first event relating to validation of the suppression mode include (i) detection of a motion in which one occupant blocks interference from another person, (ii) detection that one occupant is sleeping or trying to sleep, (iii) detection that the degree of change in the body or facial expression of one occupant does not satisfy a predetermined criterion, (iv) detection that one occupant is sitting on the seat in a predetermined posture, and (v) detection of a negative emotion obtained by analyzing a gesture of one occupant. Examples of the motion of blocking interference from another person include (i) a motion of turning the face or body in a direction in which no other person is present, (ii) a motion of lying face down, (iii) a motion of covering the ears with a hand or an article, (iv) a motion of covering the face with a hand or an article, (v) a motion of covering the nose with a hand or an article, (vi) a motion of stopping a motion of another person, (vii) a motion of leaning deeply against the seat back, and (viii) a motion of lowering the armrest of the seat.
Examples of the first event relating to validation of the suppression mode also include (i) a case where the interval between utterances of one occupant is longer than a predetermined value, (ii) a case where the content of the speech of one occupant includes a word rejecting interference from another person, and (iii) a case where the intonation pattern of the speech of one occupant matches or is similar to a predetermined pattern. Examples of the predetermined pattern relating to intonation include a pattern appearing when a person rejects interference from another person and a pattern appearing when a person has a negative emotion.
Similarly, a first event relating to invalidation of the promotion mode may also be set. The first event relating to validation of the suppression mode may be used as the first event relating to invalidation of the promotion mode. Further, a first event relating to invalidation of the suppression mode may be set. The first event relating to validation of the promotion mode may also be used as the first event relating to invalidation of the suppression mode.
As the first event relating to switching between the plurality of promotion modes, a specific single event may be specified, or a combination of a plurality of events may be used. The combination of the plurality of events may be a specific combination or an arbitrary combination. For example, a weight or score is assigned to each event in advance, and when the total of the weights or scores of the detected events is greater than a predetermined threshold, a first event relating to switching between the plurality of promotion modes is detected. In this way, an arbitrary combination of a plurality of events can be used as the first event relating to switching between the plurality of promotion modes.
Similarly, as the first event relating to switching between the plurality of suppression modes, a specific single event may be specified, or a combination of a plurality of events may be used. The combination of the plurality of events may be a specific combination or an arbitrary combination. For example, a weight or score is assigned to each event in advance, and when the total of the weights or scores of the detected events is greater than a predetermined threshold, a first event relating to switching between the plurality of suppression modes is detected. In this way, an arbitrary combination of a plurality of events can be used as the first event relating to switching between the plurality of suppression modes.
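As one illustrative implementation of the weighted-combination approach described in the preceding two paragraphs, the following Python sketch assigns hypothetical weights to detected events and reports a switching event when the total exceeds a threshold; the event names, weights, and threshold are assumptions.

```python
# Illustrative weights assigned in advance to individual detected events.
EVENT_WEIGHTS = {
    "turns_toward_other": 1.0,
    "speaks_to_other": 2.0,
    "name_mentioned": 2.0,
    "loud_speech": 1.5,
}

def switching_event_detected(detected_events, threshold: float = 3.0) -> bool:
    """Report a first event for switching between modes when the total weight of the
    detected events exceeds a predetermined threshold (weights and threshold are examples)."""
    total = sum(EVENT_WEIGHTS.get(event, 0.0) for event in detected_events)
    return total > threshold

print(switching_event_detected(["speaks_to_other", "name_mentioned"]))  # 4.0 > 3.0 -> True
```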
In another embodiment, the transition event detection part 732 acquires information indicating the state of the drive system 130 from the operation management part 720. The transition event detecting part 732 detects the occurrence of a predetermined transition event (sometimes referred to as a second event) based on the acquired information. For example, the transition event detecting part 732 analyzes the state of the drive system 130, and detects the occurrence of a second event in the case where a predetermined pattern is detected.
The transition event detection unit 732 may transmit information indicating that the second event has occurred to the mode determination unit 734. The transition event detection unit 732 may transmit information indicating the content or type of the second event to the mode determination unit 734. Examples of the predetermined pattern include operation of a safety device, a change of the seat arrangement by an occupant, and opening or closing of a window by an occupant.
In another embodiment, the transition event detecting unit 732 acquires information indicating the external state of the vehicle 100 from the input/output system 150. The transition event detecting unit 732 detects the occurrence of a predetermined transition event (sometimes referred to as a third event) based on the acquired information. For example, the transition event detecting part 732 analyzes the external state of the vehicle 100, and detects the occurrence of a third event in the case where a predetermined pattern is detected.
The transition event detection unit 732 may transmit information indicating that the third event has occurred to the mode determination unit 734. The transition event detection unit 732 may transmit information indicating the content or type of the third event to the mode determination unit 734. Examples of the predetermined pattern include the inter-vehicle distance to another vehicle being less than a predetermined value, the approach of an emergency vehicle, and the presence of a specific place or building, such as a sightseeing spot, in the vicinity of the vehicle.
In the present embodiment, the mode determination unit 734 determines the control mode that is the control target of the environment adjustment system 160. The pattern determination unit 734 may transmit the determined control pattern to the in-vehicle environment control unit 740.
The mode determination unit 734 may determine activation, deactivation, or switching of the independent control mode. The mode determination unit 734 may determine one of the plurality of independent control modes as the control mode that is the control target. The mode determination unit 734 may determine the control mode that is the control target based on the type or content of the transition event detected by the transition event detection unit 732.
When conflicting events are detected for the control mode of a particular subspace, the mode determination unit 734 may determine which event takes precedence. For example, the mode determination unit 734 may determine which event takes precedence based on at least one of the type of each event, the combination of the events, and the detection frequency of each event.
For example, there are cases where the transition event detection unit 732 simultaneously detects (i) a first event X indicating, based on a gesture of the occupant A, that the control mode of the subspace in which the occupant B is present is to be set to the promotion mode, and (ii) a first event Y indicating, based on a gesture of the occupant B, that the control mode of the subspace in which the occupant B is present is to be set to the suppression mode. In this case, for example, when the first event X is an event of a predetermined type, the mode determination unit 734 determines to set the control mode of the subspace in which the occupant B is present to the promotion mode, regardless of the current control mode of that subspace and the types of other events related to that subspace (this is sometimes referred to as an override). The mode determination unit 734 may also determine the control mode of the subspace in which the occupant B is present based on the combination of the first event X and the first event Y.
Similarly, for example, when the first event Y is an event of a predetermined type, the mode determination unit 734 determines to set the control mode of the subspace in which the occupant B is present to the suppression mode, regardless of the types of other events related to that subspace. The mode determination unit 734 may also determine the control mode of the subspace in which the occupant B is present based on the combination of the first event X and the first event Y.
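One possible way to resolve such conflicting events is a priority table, as in the following Python sketch; the priority values, event names, and tie-breaking rule are illustrative assumptions and do not represent the only resolution scheme contemplated above.

```python
# Hypothetical priority table used when conflicting events target the same subspace;
# larger values take precedence. The ordering is an assumption made for illustration.
EVENT_PRIORITY = {
    ("promotion", "name_called_loudly"): 3,
    ("suppression", "trying_to_sleep"): 2,
    ("promotion", "turns_toward_other"): 1,
    ("suppression", "leans_back"): 1,
}

def resolve_conflict(events):
    """events: list of (target_mode, event_type) tuples detected at the same time.
    Return the target mode of the highest-priority event, or None on a tie."""
    best = max(events, key=lambda event: EVENT_PRIORITY.get(event, 0))
    best_priority = EVENT_PRIORITY.get(best, 0)
    opposing = [event for event in events if event[0] != best[0]]
    if opposing and best_priority == max(EVENT_PRIORITY.get(event, 0) for event in opposing):
        return None  # tie between opposing modes: keep the current control mode
    return best[0]

# Rider A calls rider B by name while rider B appears to be trying to sleep.
print(resolve_conflict([("promotion", "name_called_loudly"), ("suppression", "trying_to_sleep")]))  # promotion
```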
In one embodiment, the transition event detection unit 732 detects an input of an instruction regarding activation, deactivation, or switching of the independent control mode from at least one of the plurality of occupants. In this case, the mode determination unit 734 determines whether the independent control mode is activated, deactivated, or switched based on the instruction.
In another embodiment, the transition event detector 732 detects a first event. In this case, the mode determination unit 734 determines whether the independent control mode is activated, deactivated, or switched based on the type of the first event.
In another embodiment, the transition event detector 732 detects a second event. In this case, the mode determination unit 734 determines whether the independent control mode is validated, invalidated, or switched based on the type of the second event.
In another embodiment, the transition event detector 732 detects a third event. In this case, the mode determination unit 734 determines whether the independent control mode is validated, invalidated, or switched based on the type of the third event.
In still another embodiment, the mode determination unit 734 may determine to invalidate the independent control mode (i) when a predetermined period of time has elapsed after the independent control mode is validated, (ii) when the vehicle 100 has moved a predetermined distance after the independent control mode is validated, or (iii) when a distance between the position of the vehicle 100 and the destination of the vehicle 100 is less than a predetermined value. The mode determination unit 734 may acquire information indicating the current position of the vehicle 100 and information indicating the destination of the vehicle 100 from the operation management unit 720.
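The invalidation conditions (i) to (iii) above can be checked with a simple predicate. The following Python sketch is illustrative; the thresholds for elapsed time, travel distance, and distance to the destination are placeholder values, not values specified by the present embodiment.

```python
def should_invalidate(elapsed_s: float, distance_moved_km: float, distance_to_destination_km: float,
                      max_elapsed_s: float = 1800.0, max_distance_km: float = 20.0,
                      destination_radius_km: float = 1.0) -> bool:
    """Return True when the independent control mode should be invalidated.
    All thresholds are placeholder values, not values taken from the description."""
    return (elapsed_s > max_elapsed_s                          # (i) predetermined period elapsed
            or distance_moved_km > max_distance_km             # (ii) predetermined distance traveled
            or distance_to_destination_km < destination_radius_km)  # (iii) close to the destination

print(should_invalidate(elapsed_s=600.0, distance_moved_km=5.0, distance_to_destination_km=0.4))  # True
```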
In the present embodiment, the in-vehicle environment control unit 740 controls the environment of the in-vehicle space 120. In one embodiment, the in-vehicle environment control unit 740 acquires information indicating a control mode that is a control target of the environment adjustment system 160 from the mode determination unit 734. In-vehicle environment control unit 740 controls the operation of environment adjustment system 160 based on the information indicating the control mode.
In another embodiment, when the input/output system 150 receives, from the driver of the vehicle 100, an instruction for transmitting a message of the driver to another occupant, the in-vehicle environment control unit 740 acquires, from the input/output system 150, information indicating that the instruction has been received. In this case, the in-vehicle environment control unit 740 may control the operation of the environment adjustment system 160 so that the message of the driver is transmitted to the other occupant regardless of the control mode determined by the mode determination unit 734.
In the present embodiment, the visual field environment control unit 742 controls the visual field environment of the vehicle interior space 120. When the independent control mode is activated, the visual field environment control unit 742 may control the visual field environment for each subspace. The visual field environment control unit 742 controls the visual field environment for each subspace by controlling, for example, at least one of the one or more displays 544, the one or more light adjustment units 630, and the one or more seat adjustment units 640.
In one embodiment, when the mode determination unit 734 determines that the suppression mode is the control mode, the visual field environment control unit 742 controls the environment adjustment system 160 such that (i) the illuminance in at least one of the first subspace and the second subspace is reduced as compared to a case where the suppression mode is deactivated, (ii) the distance between the first seat disposed in the first subspace and the second seat disposed in the second subspace is increased as compared to a case where the suppression mode is deactivated, and/or (iii) the degree to which the back face of the first seat and the back face of the second seat face each other is increased as compared to a case where the suppression mode is deactivated.
In another embodiment, when the mode determination unit 734 determines that the promotion mode is the control mode, the visual field environment control unit 742 controls the environment adjustment system 160 such that (i) the illuminance in at least one of the first subspace and the second subspace is increased as compared to a case where the promotion mode is deactivated, (ii) the distance between the first seat disposed in the first subspace and the second seat disposed in the second subspace is decreased as compared to a case where the promotion mode is deactivated, and/or (iii) the degree to which the front face of the first seat and the front face of the second seat face each other is increased as compared to a case where the promotion mode is deactivated.
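The two preceding paragraphs describe symmetric adjustments of illuminance, seat separation, and seat orientation. The following is a minimal sketch of such an adjustment, assuming placeholder numeric offsets; an actual controller would drive the displays 544, the light adjusting sections 630, and the seat adjusting sections 640 through the environment adjustment system 160.

```python
from dataclasses import dataclass

@dataclass
class VisualFieldState:
    illuminance_lx: float        # illuminance in the subspace
    seat_separation_m: float     # distance between the first and second seats
    facing: str                  # "front-to-front", "back-to-back", or "neutral"

def apply_visual_mode(state: VisualFieldState, mode: str) -> VisualFieldState:
    """Adjust the visual field environment for the suppression or promotion mode."""
    if mode == "suppression":
        return VisualFieldState(
            illuminance_lx=max(state.illuminance_lx - 100.0, 0.0),  # (i) dim the subspace
            seat_separation_m=state.seat_separation_m + 0.3,        # (ii) move seats apart
            facing="back-to-back",                                  # (iii) backs face each other
        )
    if mode == "promotion":
        return VisualFieldState(
            illuminance_lx=state.illuminance_lx + 100.0,                # (i) brighten the subspace
            seat_separation_m=max(state.seat_separation_m - 0.3, 0.5),  # (ii) move seats closer
            facing="front-to-front",                                    # (iii) fronts face each other
        )
    return state
```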
In the present embodiment, the sound environment control unit 744 controls the sound environment of the vehicle interior space 120. When the independent control mode is activated, the sound environment control section 744 may control the sound environment for each subspace. The sound environment control section 744 controls the sound environment of each subspace by controlling at least one of the one or more seat adjusting sections 640, the one or more speakers, the one or more traveling sound adjusting sections 650, and the one or more air conditioning sections 620, for example.
In one embodiment, when the mode determination unit 734 determines that the suppression mode is the control mode, the sound environment control unit 744 controls the environment adjustment system 160 such that (i) a volume of a cancellation sound for canceling at least a portion of a sound spoken by at least one of the first and second occupants is increased as compared to a case where the suppression mode is deactivated, (ii) a volume of a masking sound for masking at least a portion of a sound spoken by at least one of the first and second occupants is increased as compared to a case where the suppression mode is deactivated, and/or (iii) a volume of at least one of a travel sound, an air conditioning sound, and an external sound in at least one of the first and second subspaces is increased as compared to a case where the suppression mode is deactivated.
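The three adjustments (i) to (iii) above can be expressed as raising a set of target volumes while the suppression mode is active. The sketch below assumes a normalized 0.0 to 1.0 volume scale and an illustrative 0.2 increment; neither value is taken from the embodiment.

```python
def suppression_sound_volumes(suppression_active: bool, base: dict) -> dict:
    """Return target volumes for the sound environment.

    `base` holds the volumes used while the suppression mode is deactivated.
    """
    if not suppression_active:
        return dict(base)
    step = 0.2
    return {
        "cancellation": min(base["cancellation"] + step, 1.0),  # (i) cancellation sound
        "masking": min(base["masking"] + step, 1.0),            # (ii) masking sound
        "ambient": min(base["ambient"] + step, 1.0),            # (iii) travel / air conditioning / external sound
    }

# Usage sketch
volumes = suppression_sound_volumes(True, {"cancellation": 0.0, "masking": 0.1, "ambient": 0.3})
```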
In another embodiment, the sound environment control unit 744 controls the environment adjustment system 160 to output sound from outside the vehicle 100 to the subspace corresponding to the driver's seat. When a specific occupant desires a sound environment in which the external sound can be heard, the sound environment control unit 744 may control the environment adjustment system 160 so as to output the external sound to the subspace corresponding to the seat of the specific occupant. The specific occupant may be an occupant other than the driver.
In another embodiment, when the mode determination unit 734 determines the promotion mode as the control mode, the sound environment control unit 744 controls the in-vehicle microphone 528 and the speaker 542 of the environment adjustment system 160 so as to output the sound uttered by one rider to the subspace where another rider is present. The other rider may be a specific rider designated by the one rider.
In the present embodiment, the air environment control unit 746 controls the air environment of the vehicle interior space 120. When the independent control mode is activated, the air environment control unit 746 may control the air environment for each subspace. The air environment control unit 746 controls the air environment for each subspace by controlling at least one of the one or more air conditioning sections 620 and the one or more seat adjusting sections 640. The air environment control unit 746 may control the air flow in the vehicle by adjusting at least one of the flow rate and direction of the air discharged from the air conditioning section 620 and the position of the seat controlled by the seat adjusting section 640.
In one embodiment, when the mode determination unit 734 determines that the suppression mode is the control mode, the air environment control unit 746 controls the environment adjustment system 160 to increase the amount of air discharged from at least one of the first subspace and the second subspace to at least one of the outside of the vehicle 100 and the air purification section 626, as compared to the case where the suppression mode is deactivated. In another embodiment, when the mode determination unit 734 determines the promotion mode as the control mode, the air environment control unit 746 controls the environment adjustment system 160 to increase the amount of air circulating inside the shared space as compared to the case where the promotion mode is deactivated.
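The air-flow contrast between the two modes can be stated as a pair of flow targets: more exhaust in the suppression mode, more recirculation in the promotion mode. The sketch below is a hypothetical illustration; the 1.5x factor and the flow units are assumed values.

```python
def air_flow_targets(mode: str, base_flow_m3h: float) -> dict:
    """Hypothetical flow targets for the air environment control unit 746.

    Suppression: more air is exhausted toward the outside of the vehicle 100
    or the air purification section 626. Promotion: more air is recirculated
    inside the shared space.
    """
    boost = 1.5
    exhaust = base_flow_m3h
    recirculate = base_flow_m3h
    if mode == "suppression":
        exhaust *= boost
    elif mode == "promotion":
        recirculate *= boost
    return {"exhaust_m3h": exhaust, "recirculate_m3h": recirculate}
```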
The operation management unit 720 may be an example of a drive information acquiring section. The transition event detection unit 732 may be an example of an instruction accepting unit, a rider information acquiring unit, a first event detecting unit, a driving information acquiring unit, a second event detecting unit, an external information acquiring unit, and a third event detecting unit. The mode determination unit 734 may be an example of a space management system. The mode determination unit 734 may also be an example of a control mode determination section. The in-vehicle environment control unit 740 and its respective units may be an example of the environment control section.
Fig. 8 schematically shows an example of mode transitions in the vehicle 100. As shown in Fig. 8, when the transition event detection unit 732 detects a transition event, the mode determination unit 734 determines the control mode, and the in-vehicle environment control unit 740 changes the control mode accordingly. In the present embodiment, when the control mode of the environment adjustment system 160 shifts from the second suppression mode to the normal mode, it passes through the first suppression mode. However, the transition of the control mode is not limited to the present embodiment.
In another embodiment, the control mode of the environmental adjustment system 160 may transition from the second suppression mode to the normal mode without going through the first suppression mode. For example, as described above, when a predetermined period elapses after the independent control mode is activated, when the vehicle 100 moves a predetermined distance, or when the vehicle 100 reaches the vicinity of a destination or a transit point, the control mode of the environment adjustment system 160 may be shifted from the second suppression mode to the normal mode without passing through the first suppression mode.
In another embodiment, the control mode of the environment adjustment system 160 may transition from the second suppression mode to the promotion mode without going through the first suppression mode and the normal mode. For example, when a specific type of event or a combination of specific events is detected, the control mode of the environment adjustment system 160 transitions from the second suppression mode to the promotion mode without going through the first suppression mode and the normal mode. Examples of the specific events include detection by the transition event detection unit 732 of an operation of a safety device or an approach of an emergency vehicle, detection by the transition event detection unit 732 that the volume of the voice of one occupant is larger than a predetermined value, detection by the transition event detection unit 732 that an utterance of one occupant includes a predetermined word, and reception by the input unit 512 of an instruction from the driver of the vehicle 100.
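Such forced transitions can be sketched as a small set of events that bypass the intermediate modes. The event names below are hypothetical labels for the specific events listed above and are not identifiers used in the embodiment.

```python
# Hypothetical set of specific events that force a direct transition to the
# promotion mode; the names are illustrative.
FORCED_PROMOTION_EVENTS = {
    "safety_device_operated",
    "emergency_vehicle_approaching",
    "occupant_voice_above_threshold",
    "predetermined_word_uttered",
    "driver_instruction_received",
}

def forced_transition(current_mode: str, detected_event: str) -> str:
    """Skip the first suppression mode and the normal mode on specific events."""
    if detected_event in FORCED_PROMOTION_EVENTS:
        return "promotion"
    return current_mode
```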
In another embodiment, the control mode of the environment adjustment system 160 may transition from the promotion mode to the first suppression mode without going through the normal mode. The control mode of the environment adjustment system 160 may also transition from the promotion mode to the second suppression mode without going through the normal mode and the first suppression mode.
In another embodiment, the environment adjustment system 160 may set a control mode transition rule for each time period. Examples of the time period include a weekday, a holiday, early morning, afternoon, daytime, evening, night, late night, and the like. The time period may also be specified by the user. For example, during the daytime, the control mode of the environment adjustment system 160 remains in the normal mode even if no event related to the promotion mode is detected for a certain period of time. On the other hand, at night, when no event related to the promotion mode is detected for a certain period of time, the control mode of the environment adjustment system 160 automatically shifts from the normal mode to the suppression mode.
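The daytime/night contrast described above can be illustrated as a time-period-dependent idle transition. In the sketch below, the hour boundaries defining "night" and the idle threshold are assumptions made for illustration only.

```python
from datetime import datetime

def idle_transition(current_mode: str, idle_s: float, now: datetime,
                    idle_threshold_s: float = 600.0) -> str:
    """Time-period-dependent automatic transition (illustrative only).

    During the daytime the mode stays normal even without promotion-related
    events; at night it shifts from normal to suppression after a period with
    no such events.
    """
    is_night = now.hour >= 21 or now.hour < 6
    if current_mode == "normal" and is_night and idle_s >= idle_threshold_s:
        return "suppression"
    return current_mode

# Usage sketch
next_mode = idle_transition("normal", idle_s=900.0, now=datetime(2024, 1, 1, 23, 0))
```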
Fig. 9 schematically shows an example of a space management method in the vehicle 100. Examples of methods of adjusting the visual environment and the auditory environment in the in-vehicle space 120 will be described with reference to Fig. 9. In the present embodiment, an example of the space management method in the vehicle 100 is described by taking as an example a case where communication between the rider 20 and the rider 40 is promoted or suppressed by adjusting the visual environment and the auditory environment. However, the space management method of the vehicle 100 is not limited to the present embodiment. In other embodiments, communication between the rider 20 and the rider 40 may be promoted or suppressed by adjusting only one of the visual environment and the auditory environment.
In the present embodiment, the vehicle 100 is provided with an in-vehicle camera 922 and an in-vehicle camera 924 inside the housing 210. The vehicle 100 is provided with an in-vehicle microphone 932 and an in-vehicle microphone 934 inside the housing 210. The vehicle 100 is provided with a display 942 and a display 944 inside the housing 210. The vehicle 100 is provided with a speaker 952 and a speaker 954 inside the housing 210.
According to the present embodiment, in the promotion mode, the image of the rider 20 captured by the in-vehicle camera 922 is displayed on the display 944. The image of the rider 40 captured by the in-vehicle camera 924 is displayed on the display 942. This promotes communication between the rider 20 and the rider 40.
In another embodiment, the image of the rider 20 captured by the in-vehicle camera 922 may also be displayed on the display of the communication terminal 42 of the rider 40. The image of the rider 40 captured by the in-vehicle camera 924 may be displayed on the display of the communication terminal 22 of the rider 20.
According to the present embodiment, in the promotion mode, the sound of the rider 20 collected by the in-vehicle microphone 932 is output from the speaker 954. In addition, the sound of the rider 40 collected by the in-vehicle microphone 934 is output from the speaker 952. This promotes communication between the rider 20 and the rider 40.
In another embodiment, the voice of the rider 20 collected by the in-vehicle microphone 932 may also be output from the voice output device of the communication terminal 42 of the rider 40. Text data of the voice or a sign language image may be displayed on the display of the communication terminal 42 of the rider 40. Similarly, the voice of the rider 40 collected by the in-vehicle microphone 934 may be output from the voice output device of the communication terminal 22 of the rider 20. Text data of the voice or a sign language image may be displayed on the display of the communication terminal 22 of the rider 20.
Fig. 10 schematically shows an example of a space management method in the vehicle 100. An example of an adjustment method of a visual environment in the in-vehicle space 120 will be described with reference to fig. 10.
According to the present embodiment, in the promotion mode, the positions and postures of the seats 212, 214, and 314 are changed. Specifically, the seats 212 and 214 face the seat 314. In addition, the distance between the seat 212 and the seats 214 and 314 is reduced. This promotes communication between the riders who use the seats 212, 214, and 314.
According to the present embodiment, in the suppression mode, the postures of the seat 214 and the seat 314 are changed. Specifically, the seat 214 and the seat 314 are oriented in different directions so that they no longer face each other. This suppresses communication between the occupants using the seats 214 and 314.
Fig. 11 schematically shows an example of a space management method in the vehicle 100. An example of a method of adjusting the visual environment in the in-vehicle space 120 will be described with reference to Fig. 11. In the present embodiment, the vehicle 100 includes a lamp 1112, a lamp 1114, a lamp 1116, and a lamp 1118 inside the housing 210. Further, a window 1122, a window 1124, a window 1126, and a window 1128 are disposed in the housing 210 of the vehicle 100. Light control glass, for example, is used for the windows 1122, 1124, 1126, and 1128.
According to the present embodiment, in the promotion mode, for example, the lamp 1114 irradiates the seat 214 with light, and the lamp 1118 irradiates the seat 314 with light. This allows the occupant using the seat 214 and the occupant using the seat 314 to clearly see each other's faces. As a result, communication between the occupants using the seats 214 and 314 is promoted.
According to the present embodiment, in the suppression mode, for example, the transmittance of the window 1124 to outside light is reduced, and the transmittance of the window 1128 to outside light is reduced. As a result, the occupant using the seat 214 and the occupant using the seat 314 cannot see each other's faces well, and communication between the occupants using the seats 214 and 314 is suppressed.
Fig. 12 schematically shows an example of a space management method in the vehicle 100. An example of a method of adjusting the olfactory environment in the vehicle interior space 120 will be described with reference to Fig. 12. In the present embodiment, the vehicle 100 includes an air supply pipe 1220, an air supply nozzle 1222, an air supply nozzle 1224, an air supply nozzle 1226, and an air supply fan 1228 inside the housing 210. The vehicle 100 also includes an exhaust pipe 1240, an exhaust nozzle 1242, an exhaust nozzle 1244, an exhaust nozzle 1246, and an exhaust fan 1248 inside the housing 210.
According to the present embodiment, in the promotion mode, air containing the same fragrance component is supplied from the air supply nozzle 1222 and the air supply nozzle 1224. This promotes communication between the occupants using the seats 212 and 214. In another embodiment, air taken in through the exhaust nozzle 1242 may be supplied to the subspace 240 from the air supply nozzle 1224. In addition, air taken in through the exhaust nozzle 1244 may be supplied from the air supply nozzle 1222 to the subspace 220.
According to the present embodiment, in the suppression mode, air of the subspace 220 is discharged to the outside of the vehicle 100 from the exhaust nozzle 1242. The purified air is supplied from the air supply nozzle 1222 to the vehicle interior space 120, and functions as an air curtain partitioning the subspace 220 and the subspace 240. Similarly, air of the subspace 240 is discharged from the exhaust nozzle 1244 to the outside of the vehicle 100. The purified air is supplied from the air supply nozzle 1224 to the vehicle interior space 120, and functions as an air curtain that partitions the subspace 220 and the subspace 240. This suppresses communication between the occupants using the seats 212 and 214.
Fig. 13 schematically shows an example of the seat 212. In the embodiments according to Figs. 9 to 12, the details of the vehicle 100 are described taking as an example a case where the subspace 220 and the subspace 240 are not physically separated from each other and the environments of the subspace 220 and the subspace 240 are controlled independently. The present embodiment differs from the embodiments of Figs. 9 to 12 in that a physical cover 1300 incorporated in the seat 212 is deployed, and the environment of a subspace 222 provided in the vicinity of the head of the occupant using the seat 212 is controlled independently.
As shown in Fig. 13, in the configuration 1320 in the normal mode, the cover 1300 is housed in the headrest of the seat 212, for example. On the other hand, in the configuration 1340 in the independent control mode, the cover 1300 is deployed so as to cover the subspace 222. In the present embodiment, the in-vehicle camera 922, the in-vehicle microphone 932, the display 942, the speaker 952, and the exhaust nozzle 1242 may be disposed inside the cover 1300. The deployment and storage of the cover 1300 may be performed automatically or manually. In other embodiments, the cover 1300 may be kept deployed at all times without being stored, or may be configured to be freely attached and detached.
Fig. 14 shows an example of a computer 3000 in which aspects of the present invention may be embodied in whole or in part. The vehicle 100 or a part thereof may be implemented by the computer 3000. For example, the control system 170 may be implemented by the computer 3000.
A program installed in the computer 3000 can cause the computer 3000 to function as operations associated with the apparatus according to the present embodiment or as one or more "units" of the apparatus, can cause the computer 3000 to execute the operations or the one or more "units", and/or can cause the computer 3000 to execute the process according to the present embodiment or steps of the process. Such a program may be executed by the CPU3012 to cause the computer 3000 to perform certain operations associated with some or all of the functional blocks of the flowcharts and block diagrams described herein.
The computer 3000 of the present embodiment includes a CPU3012, a RAM3014, a graphic controller 3016, and a display device 3018, which are connected to each other through a main controller 3010. The computer 3000 further includes a communication interface 3022, a hard disk drive 3024, a DVD-ROM drive 3026, and an input/output unit such as an IC card drive, which are connected to the main controller 3010 via the input/output controller 3020. The computer 3000 further includes a conventional input/output unit such as a ROM3030 and a keyboard 3042, which are connected to the input/output controller 3020 via an input/output chip 3040.
The CPU3012 operates according to programs stored in the ROM3030 and the RAM3014, thereby controlling the respective units. The graphics controller 3016 acquires image data generated by the CPU3012 into a frame buffer or the like provided in the RAM3014 or in the graphics controller 3016 itself, and causes the image data to be displayed on the display device 3018.
The communication interface 3022 communicates with other electronic apparatuses via a network. The hard disk drive 3024 stores programs and data used by the CPU3012 in the computer 3000. The DVD-ROM drive 3026 reads programs or data from the DVD-ROM 3001 or the like, and supplies the programs or data to the hard disk drive 3024 via the RAM 3014. The IC card drive reads programs and data from an IC card, and/or writes programs and data to the IC card.
The ROM3030 internally stores a startup program or the like executed by the computer 3000 when activated, and/or a program dependent on hardware of the computer 3000. The input/output chip 3040 may also connect various input/output units with the input/output controller 3020 via a parallel port, a serial port, a keyboard port, a mouse port, or the like.
A program is provided by a computer-readable storage medium such as the DVD-ROM 3001 or an IC card. The program is read from the computer-readable storage medium, installed in the hard disk drive 3024, the RAM3014, or the ROM3030, which are also examples of the computer-readable storage medium, and executed by the CPU3012. The information processing described in these programs is read by the computer 3000 and provides cooperation between the programs and the various types of hardware resources described above. An apparatus or a method may be constituted by realizing operations or processing of information in accordance with the use of the computer 3000.
For example, in the case of performing communication between the computer 3000 and an external device, the CPU3012 may execute a communication program loaded on the RAM3014, and instruct a communication process to the communication interface 3022 based on a process described in the communication program. The communication interface 3022 reads transmission data stored in a transmission buffer processing area provided in a recording medium such as the RAM3014, the hard disk drive 3024, the DVD-ROM 3001, or the IC card, and transmits the read transmission data to the network, or writes reception data received from the network into a reception buffer processing area provided in the recording medium, or the like, under the control of the CPU 3012.
In addition, the CPU3012 can cause all or a necessary part of a file or database held in an external recording medium such as a hard disk drive 3024, a DVD-ROM drive 3026 (DVD-ROM 3001), an IC card, or the like to be read into the RAM3014, and perform various types of processing on data on the RAM 3014. The CPU3012 may then write the processed data back to the external recording medium.
Various types of information such as programs, data, tables, and databases may be stored in a recording medium and subjected to information processing. The CPU3012 can execute, on the data read from the RAM3014, various types of processing described throughout the present disclosure and specified by an instruction sequence of a program, including various types of operations, information processing, condition judgment, conditional branching, unconditional branching, retrieval or replacement of information, and the like, and writes the result back to the RAM3014. In addition, the CPU3012 can retrieve information in a file, a database, or the like within the recording medium. For example, when a plurality of entries, each having an attribute value of a first attribute associated with an attribute value of a second attribute, are stored in the recording medium, the CPU3012 may retrieve, from the plurality of entries, an entry matching a condition in which the attribute value of the first attribute is specified, read the attribute value of the second attribute stored in the entry, and thereby acquire the attribute value of the second attribute associated with the first attribute satisfying the predetermined condition.
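The retrieval described above reduces to looking up an entry by its first-attribute value and reading the associated second-attribute value. The sketch below illustrates that lookup; the keys and the sample records are illustrative only and do not appear in the embodiment.

```python
def lookup_second_attribute(entries, first_value):
    """Return the second-attribute value associated with a first-attribute value.

    Each entry pairs a first-attribute value with a second-attribute value.
    """
    for entry in entries:
        if entry["attr1"] == first_value:
            return entry["attr2"]
    return None

# Usage sketch
records = [{"attr1": "seat_212", "attr2": "subspace_220"},
           {"attr1": "seat_214", "attr2": "subspace_240"}]
assert lookup_second_attribute(records, "seat_214") == "subspace_240"
```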
The programs or software modules described above may be stored on the computer 3000 or in a computer-readable storage medium near the computer 3000. A recording medium such as a hard disk or a RAM provided in a server system connected to a dedicated communication network or the internet may be used as a computer-readable storage medium, and the program may be provided to the computer 3000 via the network.
The present invention has been described above with reference to the embodiments, but the technical scope of the present invention is not limited to the scope described in the above embodiments. It will be apparent to those skilled in the art that various changes and improvements can be made to the above embodiments. The matters described with respect to a specific embodiment can be applied to other embodiments to the extent that no technical contradiction arises. Each component may have the same features as another component having the same name but a different reference numeral. It is apparent from the description of the claims that embodiments to which such changes or improvements are added can also be included in the technical scope of the present invention.
Note that the order of execution of processes such as operations, procedures, steps, and stages in the apparatuses, systems, programs, and methods shown in the claims, the specification, and the drawings can be realized in any order unless the order is explicitly indicated by "before", "prior to", or the like, and unless the output of a preceding process is used in a subsequent process. Even if operation flows in the claims, the specification, and the drawings are described using "first", "next", and the like for convenience, it does not mean that the operations must be performed in this order.
Description of the reference numerals
20 rider; 22 a communication terminal; 40 rider; 42 a communication terminal; 100 vehicle; 120 in-vehicle space; 130 a drive system; 140 a sensor system; 150 an input/output system; 160 an environment adjustment system; 170 a control system; 210 a housing; 212 seat; 214 seat; 220 subspace; 222 subspace; 240 subspace; 242 subspace; 312 seat; 314 seat; 320 subspace; 340 subspace; 412 a separator plate; 414 a door; 460 a first space; 480 a second space; 512 an input unit; 514 an output unit; 516 a communication unit; 522 an external camera; 524 an outboard microphone; 526 a vehicle interior camera; 528 an in-vehicle microphone; 532 a switch; 534 a touch panel; 536 a voice input section; 538 a gesture input section; 542 a speaker; 544 a display; 620 an air conditioning section; 622 an air supply section; 624 an exhaust section; 626 an air purification section; 630 a light adjusting section; 632 an illumination section; 634 an external light adjusting section; 640 a seat adjusting section; 650 a traveling sound adjusting section; 720 an operation management unit; 732 a transition event detection unit; 734 a mode determination unit; 740 an in-vehicle environment control unit; 742 a visual field environment control unit; 744 a sound environment control unit; 746 an air environment control unit; 922 an in-vehicle camera; 924 an in-vehicle camera; 932 an in-vehicle microphone; 934 an in-vehicle microphone; 942 a display; 944 a display; 952 a speaker; 954 a speaker; 1112 a lamp; 1114 a lamp; 1116 a lamp; 1118 a lamp; 1122 a window; 1124 a window; 1126 a window; 1128 a window; 1220 an air supply pipe; 1222 an air supply nozzle; 1224 an air supply nozzle; 1226 an air supply nozzle; 1228 an air supply fan; 1240 an exhaust pipe; 1242 an exhaust nozzle; 1244 an exhaust nozzle; 1246 an exhaust nozzle; 1248 an exhaust fan; 1300 a cover; 1320 configuration; 1340 configuration; 3000 a computer; 3001 DVD-ROM; 3010 a main controller; 3012 a CPU; 3014 RAM; 3016 a graphics controller; 3018 a display device; 3020 an input/output controller; 3022 a communication interface; 3024 a hard disk drive; 3026 a DVD-ROM drive; 3030 ROM; 3040 an input/output chip; 3042 a keyboard.