WO2022270234A1 - Device for mobile object and control method for mobile object - Google Patents

Device for mobile object and control method for mobile object

Info

Publication number
WO2022270234A1
Authority
WO
WIPO (PCT)
Prior art keywords
occupant
information
request
unit
occupants
Prior art date
Application number
PCT/JP2022/021886
Other languages
English (en)
Japanese (ja)
Inventor
昊舟 李
雅史 野原
夏子 宮崎
圭司 岡本
太郎 小林
あす郁 坂井
一博 松井
晋海 崔
Original Assignee
DENSO CORPORATION (株式会社デンソー)
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by DENSO CORPORATION
Priority to DE112022003245.6T priority Critical patent/DE112022003245T5/de
Priority to CN202280044906.6A priority patent/CN117580732A/zh
Publication of WO2022270234A1 publication Critical patent/WO2022270234A1/fr
Priority to US18/389,773 priority patent/US20240157897A1/en


Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F16/00 Information retrieval; Database structures therefor; File system structures therefor
    • G06F16/90 Details of database functions independent of the retrieved data types
    • G06F16/903 Querying
    • G06F16/9032 Query formulation
    • G06F16/90332 Natural language query formulation or dialogue systems
    • B PERFORMING OPERATIONS; TRANSPORTING
    • B60 VEHICLES IN GENERAL
    • B60R VEHICLES, VEHICLE FITTINGS, OR VEHICLE PARTS, NOT OTHERWISE PROVIDED FOR
    • B60R16/00 Electric or fluid circuits specially adapted for vehicles and not otherwise provided for; Arrangement of elements of electric or fluid circuits specially adapted for vehicles and not otherwise provided for
    • B60R16/02 Electric or fluid circuits specially adapted for vehicles and not otherwise provided for; Arrangement of elements of electric or fluid circuits specially adapted for vehicles and not otherwise provided for electric constitutive elements
    • B60R16/037 Electric or fluid circuits specially adapted for vehicles and not otherwise provided for; Arrangement of elements of electric or fluid circuits specially adapted for vehicles and not otherwise provided for electric constitutive elements for occupant comfort, e.g. for automatic adjustment of appliances according to personal settings, e.g. seats, mirrors, steering wheel
    • B PERFORMING OPERATIONS; TRANSPORTING
    • B60 VEHICLES IN GENERAL
    • B60R VEHICLES, VEHICLE FITTINGS, OR VEHICLE PARTS, NOT OTHERWISE PROVIDED FOR
    • B60R16/00 Electric or fluid circuits specially adapted for vehicles and not otherwise provided for; Arrangement of elements of electric or fluid circuits specially adapted for vehicles and not otherwise provided for
    • B60R16/02 Electric or fluid circuits specially adapted for vehicles and not otherwise provided for; Arrangement of elements of electric or fluid circuits specially adapted for vehicles and not otherwise provided for electric constitutive elements
    • B60R16/037 Electric or fluid circuits specially adapted for vehicles and not otherwise provided for; Arrangement of elements of electric or fluid circuits specially adapted for vehicles and not otherwise provided for electric constitutive elements for occupant comfort, e.g. for automatic adjustment of appliances according to personal settings, e.g. seats, mirrors, steering wheel
    • B60R16/0373 Voice control
    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01C MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
    • G01C21/00 Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00
    • G01C21/26 Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 specially adapted for navigation in a road network
    • G01C21/34 Route searching; Route guidance
    • G01C21/36 Input/output arrangements for on-board computers
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F16/00 Information retrieval; Database structures therefor; File system structures therefor
    • G06F16/90 Details of database functions independent of the retrieved data types
    • G06F16/903 Querying
    • G06F16/9035 Filtering based on additional data, e.g. user or group profiles
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/16 Sound input; Sound output
    • G06F3/167 Audio in a user interface, e.g. using voice commands for navigating, audio feedback
    • G PHYSICS
    • G10 MUSICAL INSTRUMENTS; ACOUSTICS
    • G10L SPEECH ANALYSIS TECHNIQUES OR SPEECH SYNTHESIS; SPEECH RECOGNITION; SPEECH OR VOICE PROCESSING TECHNIQUES; SPEECH OR AUDIO CODING OR DECODING
    • G10L17/00 Speaker identification or verification techniques
    • G10L17/06 Decision making techniques; Pattern matching strategies
    • G PHYSICS
    • G10 MUSICAL INSTRUMENTS; ACOUSTICS
    • G10L SPEECH ANALYSIS TECHNIQUES OR SPEECH SYNTHESIS; SPEECH RECOGNITION; SPEECH OR VOICE PROCESSING TECHNIQUES; SPEECH OR AUDIO CODING OR DECODING
    • G10L15/00 Speech recognition
    • G10L15/08 Speech classification or search
    • G10L15/18 Speech classification or search using natural language modelling
    • G PHYSICS
    • G10 MUSICAL INSTRUMENTS; ACOUSTICS
    • G10L SPEECH ANALYSIS TECHNIQUES OR SPEECH SYNTHESIS; SPEECH RECOGNITION; SPEECH OR VOICE PROCESSING TECHNIQUES; SPEECH OR AUDIO CODING OR DECODING
    • G10L17/00 Speaker identification or verification techniques

Definitions

  • The present disclosure relates to a device for a mobile object and a control method for a mobile object.
  • Patent Literature 1 discloses a technique for acquiring personal information corresponding to passengers aboard a vehicle from a personal information database and providing information based on the acquired personal information to passengers who board the vehicle again.
  • However, the demands of passengers change depending on the relationships among multiple passengers and their states. It has therefore been difficult to accurately estimate a passenger's request from that passenger's personal information alone, and consequently difficult to provide a comfortable in-vehicle experience that meets the occupants' needs.
  • One object of the present disclosure is to provide a device for a mobile object, and a control method for a mobile object, that more accurately estimate the needs of the occupants when multiple occupants are present and can thereby provide a comfortable cabin experience that better matches those needs.
  • A device for a mobile object of the present disclosure is a device usable in a mobile object, comprising: an occupant information identifying unit that distinguishes and identifies individual occupants of the mobile object from occupant information, which is information about the occupants of the mobile object detected by a sensor used in the mobile object; and a request estimating unit that estimates an occupant's request according to a combination of the occupant information of a plurality of occupants.
  • Similarly, a control method for a mobile object of the present disclosure is a method usable in a mobile object, comprising: an occupant information identifying step of distinguishing and identifying individual occupants of the mobile object from occupant information, which is information about the occupants of the mobile object detected by a sensor used in the mobile object; and a request estimating step of estimating an occupant's request according to a combination of the occupant information of a plurality of occupants.
  • According to the above, an occupant's request is estimated according to the combination of the occupant information of a plurality of occupants.
  • This makes it possible to more accurately estimate occupant requests, which change depending on the relationships among multiple occupants and their states.
  • Since the occupant information is information about the occupants detected by a sensor used in the mobile object, requests that match the actual situation can be estimated more accurately. As a result, more accurate estimation of occupant needs when multiple occupants are present makes it possible to provide a comfortable cabin experience that better meets those needs.
  • FIG. 1 is a diagram showing an example of a schematic configuration of the vehicle system 1.
  • A diagram showing an example of a schematic configuration of the HCU 20.
  • FIG. 4 is a flowchart showing an example of the flow of provision-related processing in the HCU 20.
  • a vehicle system 1 shown in FIG. 1 will be described as being used, for example, in an automobile (hereinafter simply referred to as a vehicle).
  • the vehicle system 1 includes an HMI (Human Machine Interface) system 2 , an air conditioning system 3 , a near field communication module (hereinafter NFCM) 4 , a wide area communication module (hereinafter WACM) 5 and a seat ECU 6 .
  • the HMI system 2, the air conditioning system 3, the NFCM 4, the WACM 5, and the seat ECU 6 are assumed to be connected to, for example, an in-vehicle LAN.
  • A vehicle equipped with the vehicle system 1 is hereinafter referred to as the own vehicle.
  • the air conditioning system 3 is a vehicle cooling and heating system.
  • The air conditioning system 3 acquires, from the HCU 20 described later, air conditioning request information including setting values related to air conditioning set by the user of the own vehicle, and adjusts the temperature, airflow, fragrance, and so on in the vehicle interior according to the acquired information.
  • the air conditioning system 3 includes an air conditioning control ECU 30 , an air conditioning unit 31 and an aroma unit 32 .
  • the air conditioner unit 31 generates warm air and cold air (hereinafter referred to as conditioned air).
  • the conditioned air is supplied into the passenger compartment from an air outlet provided in, for example, an instrument panel of the vehicle.
  • the aroma unit 32 has beads or the like impregnated with aroma oil such as essential oil containing aromatic components (hereinafter referred to as impregnated matter). Then, airflow generated by the air conditioner unit 31 is passed around the impregnated matter, thereby supplying the fragrance to the interior of the vehicle.
  • the aroma unit 32 may atomize the aroma oil.
  • the aromatic component atomized by the aroma unit 32 may be mixed with the airflow generated by the air conditioner unit 31 and supplied into the passenger compartment.
  • the air-conditioning unit 31 provides the occupants of the own vehicle with wind stimulation. Also, the air conditioner unit 31 gives warm and cold stimuli to the occupants of the own vehicle according to the difference in the temperature of the conditioned air. In other words, the air conditioner unit 31 provides tactile stimulation.
  • the aroma unit 32 stimulates the occupants of the own vehicle with the aroma. That is, the aroma unit 32 provides olfactory stimulation. Both the air conditioner unit 31 and the aroma unit 32 are presentation devices that present stimuli.
  • the air conditioning control ECU 30 is an electronic control device that controls the operations of the air conditioning unit 31 and the aroma unit 32 .
  • the air conditioning control ECU 30 is connected to the air conditioning unit 31 and the aroma unit 32 .
  • NFCM4 is a communication module for performing short-range wireless communication.
  • the NFCM 4 performs short-range wireless communication with the portable terminal of the occupant of the own vehicle when communication connection is established with the portable terminal.
  • Near-field wireless communication is, for example, wireless communication whose communication range is at most several tens of meters.
  • wireless communication conforming to Bluetooth (registered trademark) Low Energy may be used.
  • Mobile terminals include, for example, multifunctional mobile phones and wearable devices.
  • the WACM 5 transmits and receives information to and from a center outside the own vehicle via wireless communication. That is, wide area communication is performed.
  • the seat ECU 6 is an electronic control unit that executes various processes related to controlling the seat environment, such as adjusting the seat position of the seat of the vehicle.
  • the seat of the own vehicle is an electric seat whose slide position and reclining position can be electrically changed. If the seat of the vehicle is not an electric seat, the seat ECU 6 may be omitted.
  • Seats include a driver's seat, a passenger's seat, and a rear seat.
  • the electric seat may be only part of the driver's seat, front passenger's seat, and rear seat.
  • the slide position is the position of the seat in the longitudinal direction of the vehicle.
  • the recline position is the angle of the seat backrest.
  • the backrest of the seat can also be called a seat back.
  • the HMI system 2 acquires occupant information and presents stimuli to the occupants. Stimulation here also includes the provision of information. Details of the HMI system 2 are provided below.
  • the HMI system 2 includes an HCU (Human Machine Interface Control Unit) 20 , an indoor camera 21 , a microphone 22 , a lighting device 23 , a display device 24 and an audio output device 25 .
  • the interior camera 21 captures an image of a predetermined range inside the interior of the vehicle.
  • the indoor camera 21 images the range including the driver's seat, the passenger's seat, and the rear seats of the own vehicle.
  • a plurality of cameras may be used as the indoor camera 21, and the imaging range may be shared by the plurality of cameras.
  • the indoor camera 21 is composed of, for example, a near-infrared light source, a near-infrared camera, and a control unit for controlling them.
  • the indoor camera 21 captures an image of the occupant irradiated with near-infrared light by the near-infrared light source.
  • An image captured by the near-infrared camera is image-analyzed by the control unit.
  • the control unit detects the occupant's wakefulness, facial orientation, line-of-sight direction, posture, etc., based on the occupant's feature amount extracted by image analysis of the captured image.
  • the degree of arousal may be detected by, for example, the degree of opening and closing of the eyelids.
  • the microphone 22 collects the voice uttered by the occupant of the vehicle, converts it into an electrical voice signal, and outputs it to the HCU 20 .
  • the microphone 22 is provided for each seat so that it is possible to distinguish and collect the voices of the passengers in each seat.
  • the microphone 22 does not have to be provided for each seat.
  • a zoom microphone with a narrowed directivity may be used as the microphone 22 provided for each seat.
  • the lighting device 23 is provided at a position that can be visually recognized by the occupant, and stimulates the occupant with light emission. In other words, it provides a visual stimulus.
  • the illumination device 23 is a presentation device that presents stimuli.
  • a light-emitting device such as an LED may be used as the lighting device 23 . It is preferable that the illumination device 23 be capable of switching the color of the emitted light. Light emission of the illumination device 23 is controlled by the HCU 20 .
  • the display device 24 displays information.
  • the display device 24 is provided at a position that can be visually recognized by the occupant, and provides the occupant with display stimulation. In other words, it provides a visual stimulus.
  • the display device 24 is a presentation device that presents stimuli.
  • the display device 24 preferably displays at least an image.
  • the display device 24 may display text or the like in addition to images.
  • the display of the display device 24 is controlled by the HCU 20 .
  • As the display device 24, a meter MID (Multi Information Display), a CID (Center Information Display), a rear-seat display, a transparent display, or a transmissive skin display can be used.
  • the meter MID is a display device installed in front of the driver's seat inside the vehicle. As an example, the meter MID may be configured to be provided on the meter panel.
  • CID is a display device arranged in the center of the instrument panel of the vehicle.
  • the rear seat display is a display device for passengers in the rear seats of the own vehicle.
  • the rear-seat display may be provided on the seatbacks of the driver's seat and passenger's seat, on the ceiling, or the like, with the display surface facing the rear of the vehicle.
  • A transparent display is a transmissive display device. Examples of transparent displays include organic EL (OLED) displays.
  • the transparent display may be configured to be provided on the window of the vehicle.
  • A transmissive skin display is a display device that displays through a transmissive skin. The transmissive skin display may be provided on the door trim, seat back, floor, roof, or the like of the vehicle.
  • the audio output device 25 provides sound stimulation to the occupants. In other words, auditory stimulation is provided.
  • the audio output device 25 is a presentation device that presents stimuli. Sounds output from the audio output device 25 include music, environmental sounds, and the like. Music may include BGM. An environmental sound may be a sound that reproduces a specific environment. As the audio output device 25, for example, an audio speaker that outputs audio may be used.
  • the HCU 20 is mainly composed of a microcomputer equipped with a processor, memory, I/O, and a bus connecting them.
  • the HCU 20 executes a control program stored in the memory to execute various types of processing such as processing related to provision of the indoor environment of the own vehicle (hereinafter referred to as provision related processing).
  • This HCU 20 corresponds to the device for a mobile object.
  • Memory, as used herein, is a non-transitory tangible storage medium that stores computer-readable programs and data in a non-transitory manner.
  • Such a non-transitory tangible storage medium is implemented by a semiconductor memory, a magnetic disk, or the like.
  • a schematic configuration of the HCU 20 will be described below.
  • As shown in the schematic configuration diagram, the HCU 20 provides, as functional blocks, an occupant authentication unit 201, an authentication DB 202, a provision processing unit 203, a voice recognition unit 204, a personal DB 205, an occupant information identifying unit 206, an auxiliary information acquisition unit 207, a request estimation unit 208, and an indoor environment identification unit 209. Execution of the processing of these functional blocks by the computer corresponds to execution of the control method for a mobile object. Part or all of the functions executed by the HCU 20 may be configured as hardware using one or more ICs or the like. Some or all of the functional blocks provided by the HCU 20 may also be implemented by a combination of software executed by a processor and hardware members.
  • the occupant authentication unit 201 authenticates regular occupants of the own vehicle.
  • the authentication may be performed by matching with the information of the authorized passenger registered in the authentication DB 202 in advance.
  • a non-volatile memory may be used as the authentication DB 202 .
  • The authentication DB 202 may instead be provided in a center that can communicate via the WACM 5. It is preferable to use a plurality of authentication methods for the authentication: for example, authentication using a sensor of the own vehicle and authentication through cooperation with the portable terminal of an occupant of the own vehicle.
  • Authentication by cooperation with the portable terminal of the occupant of the own vehicle includes code collation using the identification information of each occupant held in the portable terminal.
  • the occupant authentication unit may acquire this identification information from the mobile terminal via the NFCM4.
  • the occupant authentication unit 201 may specify whether the occupant is a driver or a fellow passenger based on the occupant's boarding position.
  • the boarding position may be detected by a seat sensor, or may be estimated from a door whose opening/closing is detected. Alternatively, individual occupants may be identified from identification information held by the mobile terminal.
  • the occupant authentication unit 201 may use a plurality of authentication methods, for example, to improve the accuracy of authentication.
  • For higher accuracy, the condition for successful authentication may be that authentication succeeds by each of a plurality of authentication methods.
  • The occupant authentication unit 201 may also use a plurality of authentication methods for quick authentication, for example by accepting success by any one of the methods.
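The two policies above can be sketched as follows: requiring every method to succeed for higher accuracy, or accepting the first success for speed. This is a minimal illustration; the function names and the result dictionary are assumptions, not part of the patent.

```python
# Illustrative sketch of combining multiple authentication methods.
# The method names ("face_recognition", "terminal_code") are assumed examples:
# e.g., matching against the authentication DB 202 and code collation with
# the occupant's portable terminal via the NFCM 4.

def authenticate_strict(results):
    """Higher-accuracy policy: authentication holds only if every method succeeds."""
    return all(results.values())

def authenticate_fast(results):
    """Quick policy: authentication holds as soon as any one method succeeds."""
    return any(results.values())

results = {"face_recognition": True, "terminal_code": False}
print(authenticate_strict(results))  # False: strict policy rejects
print(authenticate_fast(results))    # True: fast policy accepts
```

In practice the strict policy trades latency for assurance; a real system might run the methods concurrently and apply either rule to the incoming results.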
  • the provision processing unit 203 provides various indoor environments by controlling various presentation devices of the own vehicle. For example, when the passenger authentication unit 201 has successfully authenticated the passenger, the provision processing unit 203 may provide an indoor environment in which hospitality is produced. Examples of indoor environments with hospitality effects include seat position adjustment for each passenger, image display related to hospitality effects, lighting related to hospitality effects, and audio output related to hospitality effects.
  • the adjustment of the seat position for each occupant may be realized by instructing the seat ECU 6 .
  • the identification of the seat position for each passenger may be realized by storing in advance the identification information of the portable terminal and the seat position for each passenger in association with each other. Lighting may be realized by controlling the lighting device 23 .
  • Image display may be realized by controlling the display device 24 .
  • Audio output may be realized by controlling the audio output device 25 .
  • the voice recognition unit 204 performs voice recognition on the voice collected by the microphone 22, and recognizes the content of the utterance of the passenger.
  • the speech recognition unit 204 may distinguish and specify the utterance content of each passenger based on the difference in the microphone 22 that collects the sound.
  • the personal DB 205 pre-stores information about each passenger.
  • a non-volatile memory may be used as the personal DB 205 .
  • Information about each passenger includes information for identifying each passenger (hereinafter referred to as passenger identification information).
  • the information on each passenger includes at least one of information on individual passenger's preferences and information on past action history (hereinafter referred to as auxiliary information).
  • personal DB 205 may be provided in a center capable of communication via WACM 5 .
  • each occupant may be associated with attributes of the occupant, biological information of the occupant, identification information of the mobile terminal of the occupant, and the like.
  • the attributes of the occupant referred to here may be a relationship with a predetermined person as a reference.
  • the predetermined person will be described as the person himself below.
  • The attributes of the occupants include the person himself/herself, the wife, a grandfather, a grandmother, a son over a certain age (hereinafter, son), a daughter over a certain age (hereinafter, daughter), an infant under a certain age (hereinafter, infant), a friend of the person, a friend of the wife, a friend of the son, a friend of the daughter, and the like.
  • the attributes of the occupant may be attributes other than those described here, or may be attributes more subdivided than the attributes described here. For example, if there are multiple grandfathers, grandmothers, sons, daughters, toddlers, and friends, they can be distinguished. For example, friend A, friend B, friend C, and so on.
  • the biometric information of the occupant referred to here includes a feature amount extracted from a face image, a voiceprint, and the like.
  • Preference information: information on the preferences of the individual occupant.
  • Action history information: information on the past action history of the individual occupant.
  • the occupant information identifying unit 206 identifies information about the occupants of the own vehicle detected by the sensors used in the own vehicle (hereinafter referred to as occupant information) by distinguishing between individual occupants of the own vehicle.
  • the processing in the occupant information specifying section 206 corresponds to the occupant information specifying step.
  • the sensors referred to here include the indoor camera 21 and the microphone 22 .
  • The occupant information includes the utterance content recognized by the voice recognition unit 204.
  • the occupant information includes information derived from an image of the occupant detected by the indoor camera 21 (hereinafter referred to as image-derived information).
  • The image-derived information includes at least one of the occupant's face image detected by the indoor camera 21, the feature amount extracted from that face image, and the occupant state detected by the indoor camera 21.
  • the occupant state includes the occupant's arousal level, the occupant's facial orientation, the occupant's line of sight direction, and the occupant's posture.
  • The occupant information specifying unit 206 may specify which occupant's utterance content it is by referring, based on the voiceprint, to the attribute specifying information stored in the personal DB 205.
  • Similarly, for the image-derived information, the occupant information specifying unit 206 may identify which occupant the information belongs to by referring, based on the feature amount extracted from the face image, to the attribute specifying information stored in the personal DB 205. The occupant's boarding position specified by the occupant authentication unit 201 may also be used. In addition, if a microphone 22 is provided for each seat, the utterance content of the occupant at each boarding position can be specified from which microphone 22 collected the recognized sound. Note that the method of distinguishing individual occupants and specifying the utterance content and image-derived information is not limited to the above.
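As a rough sketch of this matching, an occupant can be identified by comparing a detected feature vector (extracted from a voiceprint or face image) against entries registered in a personal DB. Everything below (the feature values, names, and threshold) is invented for illustration.

```python
import math

# Toy stand-in for the personal DB 205: registered feature vectors per occupant.
PERSONAL_DB = {
    "driver": [0.9, 0.1, 0.4],
    "wife":   [0.2, 0.8, 0.5],
    "son":    [0.5, 0.5, 0.9],
}

def identify_occupant(feature, threshold=0.5):
    """Return the registered occupant whose stored feature vector is nearest
    to the detected one, or None if no entry is close enough."""
    best, best_dist = None, float("inf")
    for name, stored in PERSONAL_DB.items():
        dist = math.dist(feature, stored)  # Euclidean distance
        if dist < best_dist:
            best, best_dist = name, dist
    return best if best_dist <= threshold else None

print(identify_occupant([0.85, 0.15, 0.4]))  # "driver" (closest registered entry)
```

A real system would use high-dimensional embeddings and a calibrated threshold, but the lookup structure is the same.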
  • the interior camera 21 detects the occupant's arousal level, face orientation, line-of-sight direction, and posture as the occupant state, but this is not necessarily the case.
  • the psychological state of the occupant may be estimated as the occupant state based on the feature amount of the facial image of the occupant by the indoor camera 21 .
  • Alternatively, the occupant information identifying unit 206 may be configured to have the function of detecting the occupant's arousal level, face orientation, line-of-sight direction, posture, and psychological state.
  • the auxiliary information acquisition unit 207 acquires the aforementioned auxiliary information.
  • the auxiliary information acquisition unit 207 may acquire auxiliary information from the personal DB 205 . If the personal DB 205 is provided at a center outside the vehicle, the auxiliary information acquisition unit 207 may acquire auxiliary information from this center via the WACM 5 .
  • the request estimation unit 208 estimates the occupant's request according to the combination of the occupant information of the multiple occupants.
  • the processing in this request estimating unit 208 corresponds to the request estimating step.
  • the request estimating unit 208 may estimate requests for all of the multiple occupants of the own vehicle, or may estimate requests for some of the occupants.
  • The request estimating unit 208 may use a learning device to estimate an occupant's request according to a combination of the occupant information of the plurality of occupants, taking that occupant information as input.
  • As the learning device, one obtained by machine learning that receives a combination of occupant information of a plurality of occupants as input and outputs the occupant's request according to this combination may be used.
  • the request estimating unit 208 may estimate the occupant's request based on the correspondence relationship between the combination of occupant information of a plurality of occupants and the occupant's request estimated from the combination. This correspondence relationship may be obtained by interviewing a plurality of subjects.
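Such a correspondence relationship might be realized as a simple lookup table, as sketched below; the combinations of occupant states and the resulting requests are invented examples standing in for data gathered from subject interviews.

```python
# Hand-built correspondence table: a combination of (occupant, state) pairs
# maps to an estimated request. Entries here are illustrative only.
CORRESPONDENCE = {
    (("driver", "awake"), ("infant", "asleep")): "reduce audio volume",
    (("driver", "awake"), ("passenger", "drowsy")): "lower cabin temperature",
}

def estimate_request(occupant_states):
    """Look up the request for a combination of occupant states.
    Sorting makes the lookup independent of detection order."""
    key = tuple(sorted(occupant_states))
    return CORRESPONDENCE.get(key, "no request estimated")

print(estimate_request([("infant", "asleep"), ("driver", "awake")]))
# -> "reduce audio volume": the same combination matches regardless of order
```

The point of the table is that the key is the combination, not any single occupant's state, mirroring the claim that requests depend on relationships among multiple occupants.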
  • the request estimating unit 208 uses the learning device described above to estimate the occupant's request, as an example.
  • the request estimating unit 208 may estimate the occupant's request according to the combination of the utterance contents of the multiple occupants based on the utterance contents of the multiple occupants of the own vehicle identified by the occupant information identifying unit 206 .
  • As the learning device, it is preferable to use one trained by machine learning to estimate the occupant's request from the conversation content, that is, the flow of the utterance contents of the plurality of occupants, taking the order of the utterance contents as input.
  • With this learning device, even utterance content that has not been learned can be handled: the occupant's request can be estimated from the combination of the utterance contents of the plurality of occupants based on the similarity of elements of the utterance content.
  • It is also preferable that the request estimating unit 208 estimate the background of the conversation from the conversation content, that is, the flow of the utterances of the plurality of occupants identified by the occupant information identifying unit 206, and estimate the occupant's request from that background. In this case, the estimation may be performed step by step: first estimating the background from the content of the conversation, and then estimating the occupant's request from the estimated background. This makes it possible to estimate occupant requests that are difficult to estimate from an understanding of the utterance contents alone. The background referred to here can also be read as context or circumstance.
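The stepwise estimation just described (utterance contents, then background, then request) might look like the following sketch, where simple keyword rules stand in for the learning device; the rules and strings are assumptions built around the conversation examples given elsewhere in this description.

```python
# Two-stage sketch: estimate a background (context) from the conversation,
# then estimate the occupants' request from that background.
# The keyword rules below are illustrative stand-ins for a trained model.

def estimate_background(utterances):
    """utterances: list of (speaker, text) pairs in utterance order."""
    text = " ".join(content for _, content in utterances)
    if "firefly" in text and "sorry" in text:
        return "regret that the fireflies cannot be seen"
    if "noon" in text and "crowded" in text:
        return "want to find a restaurant before shops get crowded"
    return None

def estimate_request_from_background(background):
    if background is None:
        return None
    if "firefly" in background:
        return "present firefly-related content in the cabin"
    if "restaurant" in background:
        return "search for restaurants near the current position"
    return None

conversation = [
    ("wife", "It's almost noon."),
    ("husband", "Every shop is crowded after noon."),
]
background = estimate_background(conversation)
print(estimate_request_from_background(background))
# -> "search for restaurants near the current position"
```

Separating the two stages means a request can be inferred even when no utterance states it directly, which is exactly the benefit the passage claims for background estimation.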
  • It is also preferable that the request estimating unit 208 estimate the requests of the occupants of the own vehicle using the auxiliary information acquired by the auxiliary information acquisition unit 207 in addition to the occupant information.
  • As the learning device, one trained by machine learning to estimate the occupant's request with the auxiliary information as an additional input can be used. This makes it possible to estimate the occupant's request with higher accuracy by reflecting the occupant's preferences and past action history.
  • As an example, consider the following A case. The request estimation unit 208 estimates from the order of the utterances and their contents that the wife in the front passenger seat is talking to the son and daughter in the back seat. This may be estimated not only from the utterance contents but also from the wife's face orientation identified by the occupant information identifying unit 206. From the character strings "can't see the firefly" and "sorry," it can be inferred that the wife feels regret and that the reason is that the fireflies cannot be seen. The son's "It's true" and the daughter's "Yeah!" can likewise be used in the estimation.
  • another example is case B below. It is assumed that the multiple occupants who made the utterances are the wife in the driver's seat and the wife's friend in the passenger seat.
  • the utterance order is the wife, the wife's friend, and then the wife again.
  • the content of the wife's first utterance is "The apple pie shop that has recently opened at ZZ Mall is very delicious. I had my family try it recently and it was very popular."
  • the utterance content of the wife's friend is "Hey! I'm curious! I love apple pie. I want to try it! What is the name of the store?"
  • the content of the wife's second utterance is "Well, what was the name? I forgot. I think it was a fairly long English name."
  • the request estimating unit 208 infers that the wife and her friend are having a conversation from the order of utterances and the content of the utterances.
  • from the wife's character string "The apple pie shop that has recently opened at ZZ Mall is very delicious" and the friend's character strings "I love apple pie" and "I want to try it!", it is presumed that the friend wants to eat the apple pie.
  • in addition, from the wife's character string "Well, what was the name? I forgot", the background that the wife wants to recall the name of the apple pie shop in ZZ Mall is presumed. From this estimated result, the occupant request of wanting to know the name of the apple pie store in ZZ Mall is estimated.
  • Another example is the following C case.
  • the multiple occupants who made the utterances are assumed to be the person in the driver's seat and his wife in the passenger's seat.
  • the utterance order is the wife, then the person himself/herself.
  • the content of the wife's utterance is assumed to be "It's almost noon.”
  • the contents of the utterance of the person himself/herself are assumed to be "every shop is crowded after noon.”
  • the request estimation unit 208 estimates that the wife and the person are having a conversation from the order of utterances and the content of the utterances.
  • the background of wanting to quickly find a restaurant can be inferred from the wife's character string "It's almost noon" and the driver's character string "every shop is crowded after noon". Based on this estimated result, the occupant request of wanting to eat at a restaurant near the current position is estimated.
  • when the occupant preference information among the auxiliary information acquired by the auxiliary information acquisition unit 207 is also used, the following may be done. For example, when preference information indicating that both the wife and the driver like ramen can be acquired, the occupant request of wanting to eat at a ramen restaurant near the current position may be estimated.
  • the request estimating unit 208 may estimate the occupant's request according to a combination of the occupant states of the plurality of occupants, based on the aforementioned occupant states of the occupants of the own vehicle identified by the occupant information identifying unit 206. This makes it possible to estimate the occupant's request more accurately even in situations where no conversation occurs.
  • the request estimating unit 208 preferably estimates the background of the occupant states based on the occupant states of the plurality of occupants of the own vehicle identified by the occupant information identifying unit 206, and estimates an occupant request that matches that background.
  • in other words, the background may be estimated from the combination of the occupant states of the plurality of occupants, and the occupant's request may then be estimated from the estimated background. This makes it possible to estimate the occupant's request with higher accuracy.
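As a sketch, the combination of occupant states could be mapped to a background with rules like the following. The state flags ("arousal", "looking_rearward", etc.) and background labels are hypothetical stand-ins for what the request estimating unit 208 would learn or encode:

```python
def estimate_background_from_states(states):
    """Infer a background from a combination of occupant states.

    states maps an occupant id to a dict of observed state flags
    (hypothetical names for the states identified by unit 206).
    """
    infant = states.get("infant", {})
    wife = states.get("wife", {})
    if infant.get("arousal") == "low":
        # Infant drowsy but explicitly not asleep, wife silent:
        # she likely wants the infant to fall asleep.
        if infant.get("asleep") is False and wife.get("silent"):
            return "wife_wants_infant_to_fall_asleep"
        # Wife turning around or checking the mirror: she is worried.
        if wife.get("looking_rearward") or wife.get("gaze") == "rear_view_mirror":
            return "wife_worried_about_infant"
    return "unknown"
```

The two branches mirror the contrast drawn in the examples that follow: the same infant state leads to different backgrounds depending on the wife's state.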
  • the occupant information used by the request estimating unit 208 to estimate the occupant's request may be a combination of the utterance content and the occupant state.
  • as an example (case D), it is assumed that the plurality of occupants whose occupant states are identified are the wife in the driver's seat and an infant in the rear seat.
  • the occupant state of the infant is assumed to be a state of low arousal.
  • a state of low wakefulness may be a state of being asleep, or a state of drowsiness above a certain level.
  • the wife's occupant state is assumed to be facing toward the rear.
  • the request estimator 208 infers, from the combination of these occupant states, the background that the wife is concerned about whether the infant is asleep or unwell. Based on this presumed background, the wife's request of wanting to clearly confirm the infant's condition is presumed.
  • Another example is the E case below.
  • a plurality of occupants whose occupant states are identified are the wife in the driver's seat and the infant in the rear seat.
  • the occupant state of the infant is assumed to be a state of low arousal.
  • the wife's occupant state is her line of sight to the rear-view mirror and her confused state of mind.
  • the request estimation unit 208 estimates, from the combination of these occupant states, the background that the wife is concerned about whether the infant is asleep or unwell but cannot confirm this with the rear-view mirror because the vehicle interior is dark. Based on this presumed background, the wife's request of wanting to clearly confirm the infant's condition is presumed.
  • as another example (case F), it is assumed that the plurality of occupants whose occupant states are identified are the wife in the driver's seat and an infant in the rear seat.
  • the occupant state of the infant is assumed to be a state in which the infant is not asleep but has a low wakefulness. It is assumed that the wife's occupant state is her line of sight to the rearview mirror and silence. Silence may be identified from the presence or absence of utterance content, or may be identified from the open/closed state of the mouth in the face image.
  • the request estimation unit 208 estimates, from the combination of these occupant states, the background that the wife wants to put the infant to sleep. Based on this presumed background, the wife's request of wanting to create an in-vehicle environment in which the infant can easily fall asleep is presumed.
  • when the occupant preference information among the auxiliary information acquired by the auxiliary information acquisition unit 207 is also used, the following may be done. For example, if preference information about a song that is always played when putting the infant to sleep can be obtained, the wife's request of wanting to play this song in the car may be estimated.
  • the indoor environment specifying unit 209 specifies the indoor environment of the own vehicle that is estimated to satisfy the occupant's request estimated by the request estimation unit 208. The provision processing unit 203 then causes the indoor environment specified by the indoor environment specifying unit 209 to be provided.
  • the provision processing unit 203 provides the indoor environment specified by the indoor environment specifying unit 209 through visual content, auditory content, lighting, in-vehicle air conditioning, fragrance, conversation with the in-vehicle AI, and the like, singly or in combination.
  • the provision processing unit 203 may cause the display device 24 to provide the visual content.
  • the provision processing unit 203 may cause the audio output device 25 to provide audio content and conversation with the in-vehicle AI.
  • the provision processing unit 203 may cause the lighting device 23 to provide lighting.
  • the provision processing unit 203 may cause the air conditioning unit 31 to provide the in-vehicle air conditioning via the air conditioning control ECU 30.
  • the provision processing unit 203 may cause the aroma unit 32 to provide the scent via the air conditioning control ECU 30.
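The routing of a specified environment to the individual devices listed above can be sketched as a simple dispatch. The channel names and the callable interface are assumptions for illustration; the real devices (display device 24, audio output device 25, lighting device 23, air conditioning unit 31 and aroma unit 32 via the air conditioning control ECU 30) would sit behind them:

```python
def dispatch_environment(spec, devices):
    """Route each element of an environment specification to the device
    that realizes it. `devices` maps a channel name to a callable; the
    channel names are hypothetical identifiers."""
    handled = []
    for channel, setting in spec.items():
        actuator = devices.get(channel)
        if actuator is not None:
            actuator(setting)           # hand the setting to the device
            handled.append(channel)
    return handled                      # channels actually realized
```

Channels with no registered device are simply skipped, which allows the same specification format to be used across vehicles with different equipment.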
  • the indoor environment specifying unit 209 may use a learner that performs machine learning to estimate the in-vehicle environment that satisfies the occupant's request, based on the occupant's request estimated by the request estimating unit 208.
  • the learner may be a learner obtained by machine learning in which the request of the passenger is input and the in-vehicle environment that satisfies the request is output.
  • the indoor environment specifying unit 209 may estimate an in-vehicle environment that satisfies the occupant's request based on the correspondence relationship between the occupant's request and the in-vehicle environment that satisfies the request. This correspondence relationship may be obtained by interviewing a plurality of subjects.
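Such a correspondence relationship can be sketched as a lookup table from estimated request to environment specification. The entries and key names below are illustrative, not the actual correspondence obtained from subject interviews:

```python
# Hypothetical correspondence table from occupant request to an
# in-vehicle environment specification.
REQUEST_TO_ENVIRONMENT = {
    "show_firefly_atmosphere": {
        "display": "firefly_video",
        "lighting": "dim",
        "hvac_target_c": 22,
    },
    "confirm_infant_condition": {
        "display": "rear_camera_feed",
        "audio": "status_voice_low_volume",
    },
    "help_infant_sleep": {
        "audio": "lullaby_low_volume",
        "hvac_target_c": 24,
    },
}

def specify_environment(request):
    """Return the environment spec matching the estimated request,
    or an empty spec when no entry exists."""
    return REQUEST_TO_ENVIRONMENT.get(request, {})
```

A learner, as described above, could replace this static table while keeping the same request-in, environment-out interface.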
  • in the aforementioned case A, the indoor environment specifying unit 209 may estimate an indoor environment in which a firefly image is displayed. Furthermore, the indoor environment specifying unit 209 may estimate an indoor environment that reproduces the atmosphere of seeing the fireflies last year. The provision processing unit 203 may then display a firefly image, or reproduce the atmosphere of when the fireflies were seen last year.
  • the provision processing unit 203 may cause the display device 24 to display the firefly image.
  • the provision processing unit 203 may acquire a firefly image from the center via the WACM 5 and cause the display device 24 to display the acquired firefly image.
  • the provision processing unit 203 may output the audio from the audio output device 25 .
  • the provision processing unit 203 may control the air-conditioning temperature of the air-conditioning unit 31 via the air-conditioning control ECU 30 so that the room temperature matches the temperature at the firefly habitat.
  • the provision processing unit 203 may also adjust the light in the passenger compartment so that the brightness matches the time when the fireflies were seen last year. Dimming may be realized by controlling the lighting device 23 or by controlling the room light.
  • in the aforementioned case B, the indoor environment identifying unit 209 may estimate an indoor environment in which a voice suggesting the name of the apple pie store in ZZ Mall is output.
  • the provision processing unit 203 may then output a voice suggesting the name of the apple pie store in ZZ Mall.
  • the provision processing unit 203 may specify XX, the name of the apple pie store in ZZ Mall, by searching the Internet via the WACM 5.
  • as an example of voice output, the provision processing unit 203 may cause the audio output device 25 to output a voice such as "Is the store you were talking about earlier possibly XX?" at the timing of a gap in the conversation between the wife and her friend.
  • in addition, the provision processing unit 203 may cause a display device 24 such as the CID to display information on the apple pie shop in ZZ Mall. In this way, when the occupants are conversing with each other, the conversation between the occupants may be assisted.
  • in the aforementioned case C, the indoor environment specifying unit 209 may estimate an indoor environment in which a voice proposing a restaurant near the current position is output. The provision processing unit 203 may then output a voice proposing a restaurant near the current position, and may specify restaurants near the current position by searching the Internet via the WACM 5. As an example of voice output, the provision processing unit 203 may cause the audio output device 25 to output a voice such as "There is a restaurant called YY 100 meters away" at the timing of a gap in the conversation between the driver and his wife. In addition, the provision processing unit 203 may cause a display device 24 such as the CID to display information on the proposed restaurant or display map information indicating the location of the restaurant.
  • in the aforementioned cases D and E, the indoor environment specifying unit 209 may estimate an indoor environment in which the infant's condition is displayed on a display device 24 that is easy for the wife to see.
  • the provision processing unit 203 then causes the infant's condition to be displayed on a display device 24 that is easy for the wife to see.
  • the provision processing unit 203 may cause the display device 24 such as the CID and the meter MID to display the image of the infant captured by the indoor camera 21 .
  • in conjunction with the display of the infant's image, the provision processing unit 203 may cause the audio output device 25 to output, at a low volume, a voice indicating that the infant has fallen asleep.
  • the low volume referred to here may be a volume that is estimated not to disturb the sleep of the infant.
  • in the aforementioned case F, it is presumed that the wife wants to create an in-vehicle environment in which the infant can easily fall asleep.
  • the indoor environment specifying unit 209 may estimate an indoor environment in which the infant can easily sleep.
  • the provision processing unit 203 may then provide an indoor environment in which the infant can easily fall asleep. If there is a song that is always played when the infant sleeps, the provision processing unit 203 may output that song from the audio output device 25 at a reduced volume. If there is no such song, the volume of the music output from the audio output device 25 may simply be lowered.
  • in addition, the air conditioning unit 31 may be controlled via the air conditioning control ECU 30 to adjust the room temperature and the air volume so that the infant can relax.
  • a seat heater may also be used to adjust the temperature. The same may be done when the occupant who wants to sleep, or for whom sleep is requested, is a fellow passenger other than an infant. In this case, however, an indoor environment that facilitates sleep is not provided in the indoor area where the driver is present.
  • the provision processing unit 203 may provide an indoor environment that the passenger can easily enjoy. For example, the provision processing unit 203 may search for music that the passenger likes based on the preference information in the personal DB 205 and cause the audio output device 25 to output that music. In this case, in order not to offend the passenger, it is preferable to ask the passenger for permission before playing the music.
  • the provision processing unit 203 may provide an indoor environment that easily awakens the occupant's drowsiness.
  • the provision processing unit 203 may control the air conditioning unit 31 via the air conditioning control ECU 30 to blow cool air.
  • the provision processing unit 203 may control the aroma unit 32 via the air conditioning control ECU 30 to emit a scent that has an awakening effect.
  • the provision processing unit 203 may cause the audio output device 25 to output up-tempo music.
  • the provision processing unit 203 may also cause the audio output device 25 to output a sound for calling attention.
  • Provision-related processing in the HCU 20: next, an example of the flow of provision-related processing in the HCU 20 will be described using the flowchart of FIG. 3. The flowchart of FIG. 3 may be configured to start when, for example, the passenger authentication unit 201 authenticates a regular passenger of the own vehicle.
  • in step S1, the provision processing unit 203 provides an indoor environment with hospitality.
  • in step S2, the occupant information identifying unit 206 identifies the occupant information detected by the sensors used in the own vehicle while distinguishing the individual occupants of the own vehicle.
  • in step S3, if there is auxiliary information for the occupants identified in S2 (YES in S3), the process proceeds to step S4; otherwise (NO in S3), the process proceeds to step S5. In step S4, the auxiliary information acquisition unit 207 acquires the auxiliary information about the occupants identified in S2, and the process proceeds to step S5.
  • in step S5, the request estimating unit 208 estimates the occupant's request according to the combination of the occupant information of the plurality of occupants, based on the occupant information of the plurality of occupants of the own vehicle identified by the occupant information identifying unit 206.
  • in step S6, if the occupant's request could be estimated (YES in S6), the process proceeds to step S7; if not (NO in S6), the process proceeds to step S9.
  • in step S7, the indoor environment specifying unit 209 specifies the indoor environment of the own vehicle estimated to satisfy the occupant's request estimated in S5.
  • in step S8, the provision processing unit 203 causes the indoor environment specified in S7 to be provided.
  • in step S9, if it is time to end the provision-related processing (YES in S9), the provision-related processing ends; otherwise (NO in S9), the process returns to S2 and repeats.
  • An example of the end timing of the provision-related processing is when the power switch of the own vehicle is turned off.
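The S1 to S9 flow described above can be sketched as a loop. The `hcu` object and its method names below are hypothetical stand-ins for the units described in this disclosure:

```python
def provision_related_processing(hcu, max_iterations=100):
    """Sketch of the S1-S9 provision-related processing flow.

    `hcu` bundles the units of the HCU 20 behind hypothetical methods.
    """
    hcu.provide_hospitality()                          # S1
    for _ in range(max_iterations):
        occupants = hcu.identify_occupants()           # S2
        aux = {o: hcu.get_auxiliary(o)                 # S3-S4
               for o in occupants if hcu.has_auxiliary(o)}
        request = hcu.estimate_request(occupants, aux) # S5
        if request is not None:                        # S6
            env = hcu.specify_environment(request)     # S7
            hcu.provide_environment(env)               # S8
        if hcu.power_switch_off():                     # S9: end timing
            break
```

The loop mirrors the flowchart: when no request can be estimated, S7 and S8 are skipped, and the process returns to S2 until the end timing (for example, the power switch turning off) is reached.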
  • according to the configuration described above, the occupant's request is estimated according to the combination of the occupant information of the plurality of occupants, so it is possible to more accurately estimate an occupant request that changes depending on the relationship and states of the plurality of occupants.
  • since the occupant information is information about the occupants of the own vehicle detected by the sensors used in the own vehicle, it is possible to more accurately estimate an occupant request that matches the actual situation. As a result, more accurate estimation of occupant requests in the presence of multiple occupants makes it possible to provide a comfortable in-vehicle experience that better meets occupant needs.
  • the occupant information may be both the utterance content and the occupant state, but the configuration is not necessarily limited to this.
  • as the occupant information, only one of the utterance content and the occupant state may be used.
  • the vehicle system 1 is not limited to automobiles, and may be applied to any mobile body capable of accommodating a plurality of occupants.
  • for example, the vehicle system 1 may be configured to be used in mobile bodies such as railway vehicles, aircraft, and ships.
  • controller and techniques described in this disclosure may also be implemented by a special purpose computer comprising a processor programmed to perform one or more functions embodied by a computer program.
  • the apparatus and techniques described in this disclosure may be implemented by dedicated hardware logic circuitry.
  • the apparatus and techniques described in this disclosure may be implemented by one or more special purpose computers configured by a combination of a processor executing a computer program and one or more hardware logic circuits.
  • the computer program may also be stored as computer-executable instructions on a computer-readable non-transitional tangible recording medium.


Abstract

An HCU (20) usable in a vehicle includes: an occupant information identification unit (206) that identifies occupant information, which is information about an occupant of an own vehicle detected by a sensor used in the own vehicle, while distinguishing between the individual occupants of the own vehicle; and a request estimation unit (208) that estimates, based on the occupant information of a plurality of occupants of the own vehicle identified by the occupant information identification unit (206), an occupant request that depends on a combination of the occupant information of the plurality of occupants.
PCT/JP2022/021886 2021-06-25 2022-05-30 Device for mobile object and control method for mobile object WO2022270234A1 (fr)

Priority Applications (3)

Application Number Priority Date Filing Date Title
DE112022003245.6T DE112022003245T5 (de) 2021-06-25 2022-05-30 Vorrichtung für mobiles objekt und steuerungsverfahren für mobiles objekt
CN202280044906.6A CN117580732A (zh) 2021-06-25 2022-05-30 移动体用装置以及移动体用控制方法
US18/389,773 US20240157897A1 (en) 2021-06-25 2023-12-19 Device for mobile object and control method for mobile object

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
JP2021106090A JP2023004437A (ja) 2021-06-25 2021-06-25 移動体用装置及び移動体用制御方法
JP2021-106090 2021-06-25

Related Child Applications (1)

Application Number Title Priority Date Filing Date
US18/389,773 Continuation US20240157897A1 (en) 2021-06-25 2023-12-19 Device for mobile object and control method for mobile object

Publications (1)

Publication Number Publication Date
WO2022270234A1 true WO2022270234A1 (fr) 2022-12-29

Family

ID=84544521

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2022/021886 WO2022270234A1 (fr) 2021-06-25 2022-05-30 Dispositif pour objet mobile et procédé de commande pour objet mobile

Country Status (5)

Country Link
US (1) US20240157897A1 (fr)
JP (1) JP2023004437A (fr)
CN (1) CN117580732A (fr)
DE (1) DE112022003245T5 (fr)
WO (1) WO2022270234A1 (fr)

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2008126766A (ja) * 2006-11-17 2008-06-05 Fuji Heavy Ind Ltd Vehicle motion control device
JP2012133530A (ja) * 2010-12-21 2012-07-12 Denso Corp In-vehicle device
JP2018133696A (ja) * 2017-02-15 2018-08-23 株式会社デンソーテン In-vehicle device, content providing system, and content providing method
JP2020157944A (ja) * 2019-03-27 2020-10-01 本田技研工業株式会社 Vehicle device control device, vehicle device control method, and program
JP2021032698A (ja) * 2019-08-23 2021-03-01 株式会社デンソーテン In-vehicle device

Family Cites Families (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP6869570B1 (ja) 2019-12-26 2021-05-12 株式会社トライサウンド 電子撥弦楽器用プラグ及び電子撥弦楽器用プラグ付きケーブル


Also Published As

Publication number Publication date
CN117580732A (zh) 2024-02-20
JP2023004437A (ja) 2023-01-17
DE112022003245T5 (de) 2024-04-18
US20240157897A1 (en) 2024-05-16

Similar Documents

Publication Publication Date Title
JP5152570B2 (ja) User hospitality system for automobile
JP4332813B2 (ja) User hospitality system for automobile
US10040423B2 (en) Vehicle with wearable for identifying one or more vehicle occupants
US10155524B2 (en) Vehicle with wearable for identifying role of one or more users and adjustment of user settings
US20170151918A1 (en) Vehicle with wearable to provide intelligent user settings
CN105015445B (zh) Method and system for person-related assistance to a driver of a vehicle
JP4525925B2 (ja) User hospitality system for automobile
JP4793085B2 (ja) Drowsy driving prevention device for vehicle
JP6466385B2 (ja) Service providing device, service providing method, and service providing program
US10189434B1 (en) Augmented safety restraint
CN110774996B (zh) Method, apparatus, and system for adjusting in-vehicle environment
WO2021254141A1 (fr) Vehicle interaction method and vehicle
JP2019158975A (ja) Utterance system
JP2018027731A (ja) In-vehicle device, control method for in-vehicle device, and content providing system
WO2022270234A1 (fr) Device for mobile object and control method for mobile object
JP2020103462A (ja) Emotion estimation device, environment providing system, vehicle, emotion estimation method, and information processing program
JP4968532B2 (ja) User hospitality system for automobile
JP7286368B2 (ja) Vehicle device control device, vehicle device control method, and program
JP2021165948A (ja) Information processing device, information processing system, and information processing method
JP6785889B2 (ja) Service providing device
CN114734912A (zh) Method and apparatus for in-cabin reminding via ambient light, electronic device, and storage medium
CN114194122A (zh) Safety prompt system and automobile
JP2008230280A (ja) In-vehicle control device
WO2023021872A1 (fr) Personal information system
WO2022124090A1 (fr) Occupant state determination device and occupant state determination method

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 22828153

Country of ref document: EP

Kind code of ref document: A1

WWE Wipo information: entry into national phase

Ref document number: 202280044906.6

Country of ref document: CN

WWE Wipo information: entry into national phase

Ref document number: 112022003245

Country of ref document: DE

122 Ep: pct application non-entry in european phase

Ref document number: 22828153

Country of ref document: EP

Kind code of ref document: A1