WO2021172038A1 - Information processing device, information processing method, and program
- Publication number
- WO2021172038A1 (PCT/JP2021/005166)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- occupant
- seat
- sensor
- sensing
- observation
- Prior art date
Classifications
- B—PERFORMING OPERATIONS; TRANSPORTING
- B60—VEHICLES IN GENERAL
- B60N—SEATS SPECIALLY ADAPTED FOR VEHICLES; VEHICLE PASSENGER ACCOMMODATION NOT OTHERWISE PROVIDED FOR
- B60N2/00—Seats specially adapted for vehicles; Arrangement or mounting of seats in vehicles
- B60N2/005—Arrangement or mounting of seats in vehicles, e.g. dismountable auxiliary seats
- B60N2/01—Arrangement of seats relative to one another
- B—PERFORMING OPERATIONS; TRANSPORTING
- B60—VEHICLES IN GENERAL
- B60N—SEATS SPECIALLY ADAPTED FOR VEHICLES; VEHICLE PASSENGER ACCOMMODATION NOT OTHERWISE PROVIDED FOR
- B60N2/00—Seats specially adapted for vehicles; Arrangement or mounting of seats in vehicles
- B60N2/90—Details or parts not otherwise provided for
- B—PERFORMING OPERATIONS; TRANSPORTING
- B60—VEHICLES IN GENERAL
- B60R—VEHICLES, VEHICLE FITTINGS, OR VEHICLE PARTS, NOT OTHERWISE PROVIDED FOR
- B60R11/00—Arrangements for holding or mounting articles, not otherwise provided for
- B60R11/04—Mounting of cameras operative during drive; Arrangement of controls thereof relative to the vehicle
- B—PERFORMING OPERATIONS; TRANSPORTING
- B60—VEHICLES IN GENERAL
- B60R—VEHICLES, VEHICLE FITTINGS, OR VEHICLE PARTS, NOT OTHERWISE PROVIDED FOR
- B60R16/00—Electric or fluid circuits specially adapted for vehicles and not otherwise provided for; Arrangement of elements of electric or fluid circuits specially adapted for vehicles and not otherwise provided for
- B60R16/02—Electric or fluid circuits specially adapted for vehicles and not otherwise provided for; Arrangement of elements of electric or fluid circuits specially adapted for vehicles and not otherwise provided for electric constitutive elements
Definitions
- This technology relates to information processing devices, information processing methods, and programs, and in particular to an information processing device, an information processing method, and a program designed to reduce the processing burden caused by unnecessary information.
- Patent Document 1 discloses a technique of detecting the number of passengers based on an image of the interior of a vehicle taken by a camera and optimizing a method of editing an image of the interior of the vehicle according to the number of passengers.
- Patent Document 2 discloses a technique for presenting information according to the number of people in a vehicle interior and the like.
- This technology was made in view of such a situation, and makes it possible to reduce the processing burden caused by unnecessary information.
- The information processing device of the present technology is an information processing device having a processing unit that sets the sensing range of an observation sensor, which observes a user who uses equipment provided in a vehicle, based on the state of that equipment. The program of the present technology is a program for operating a computer as such an information processing device.
- The information processing method of the present technology is an information processing method in which the processing unit of an information processing device having the processing unit sets the sensing range of an observation sensor, which observes a user who uses equipment provided in a vehicle, based on the state of that equipment.
- In the present technology, the sensing range of an observation sensor that observes a user who uses equipment provided in a vehicle is set based on the state of that equipment.
- The figures are briefly described as follows.
- FIG. 1 is a block diagram showing a configuration example of an embodiment of the information processing system to which the present technology is applied.
- A block diagram showing a functional configuration example of the information processing device.
- A flowchart showing a processing example carried out by the processing unit 41.
- A flowchart showing a processing example of the sensing setting.
- A diagram illustrating the arrangement of the seats and the occupant observation sensors.
- A diagram explaining the sensing range of the occupant observation sensor 62 in Example 1.
- Another diagram explaining the sensing range of the occupant observation sensor 62 in Example 1.
- A diagram illustrating a depth image (observation information).
- Another diagram illustrating a depth image (observation information).
- A flowchart showing a processing example performed by the sensing setting unit 44 in Example 1.
- A flowchart showing a processing example when the detection frequency is changed according to the sensing range.
- A diagram explaining the sensing range in Example 2.
- A diagram illustrating a depth image (observation information).
- A diagram explaining the sensing range in Example 3.
- A diagram explaining the sensing range in a state in which the seat belt is worn, relative to the state of the preceding figure.
- A diagram illustrating observation information (a depth image).
- A diagram explaining the arrangement of the occupant observation sensors in Example 5.
- A diagram showing the sensing ranges of the occupant observation sensors 161 to 163.
- A diagram showing the sensing range when the seat is displaced greatly.
- A diagram explaining the arrangement of the occupant observation sensor in Example 6.
- A diagram showing the sensing range of the occupant observation sensor of the preceding figure.
- A diagram showing the sensing range when the seat is largely displaced from its standard position.
- A diagram explaining the arrangement of the occupant observation sensors in Example 7.
- Another diagram explaining the arrangement of the occupant observation sensors in Example 7.
- the information processing system 11 detects, for example, the state of an occupant in a vehicle, and performs processing or control related to various devices or equipment of the vehicle in response to the state of the occupant.
- the information processing system 11 includes an information processing device 21, an in-vehicle system 22, and a sensor 23.
- the information processing device 21 mainly detects the state of the occupant and executes processing (response processing) corresponding to the state of the occupant.
- the response process includes a process of instructing a part of the control unit of the vehicle-mounted system 22 mounted on the automobile to execute a predetermined process or control.
- The information processing device 21 of FIG. 1 is illustrated as a configuration that includes a function of executing a predetermined application, such as moving image playback, designated by an occupant and displaying the presentation information of the application on a display or the like in the vehicle. Therefore, the response processing performed by the information processing device 21 includes response processing of the application executed by the information processing device 21.
- The in-vehicle system 22 includes control units (ECUs: Electronic Control Units) of a control system, a body system, and an information system included in the automobile, and is a system in which the control units are communicably connected via an in-vehicle network.
- the control system performs engine control, steering control, brake control, and the like.
- the body system performs seat control, door control, mirror control, air conditioner control, and the like.
- the information system is composed of an audio system, a navigation system, a back monitor, and the like.
- a part of the control unit of the in-vehicle system 22 has a function of executing a predetermined control according to an instruction from the information processing device 21.
- the sensor 23 is a sensor installed in an automobile for the purpose of being used by the in-vehicle system 22.
- the sensor 23 represents an arbitrary sensor used in the in-vehicle system 22, and is not limited to a specific type and a specific number.
- The information processing device 21 includes a CPU (Central Processing Unit) 31, a RAM (Random Access Memory) 32, a ROM (Read Only Memory) 33, an input unit 34, an output unit 35, a storage unit 36, a sensor 37, and a communication unit 38.
- the CPU 31, RAM 32, and ROM 33 are connected to each other by a bus 39.
- An input / output interface 40 is further connected to the bus 39.
- An input unit 34, an output unit 35, a storage unit 36, a sensor 37, and a communication unit 38 are connected to the input / output interface 40.
- the CPU 31 loads and executes, for example, the program stored in the ROM 33 or the storage unit 36 into the RAM 32 via the bus 39 or via the input / output interface 40 and the bus 39.
- the CPU 31 executes the sensing setting process, the occupant state detection process, the operation recognition process, the operation response process, and the like, which will be described later.
- RAM 32 is a volatile memory and temporarily stores data such as programs processed by CPU 31.
- the ROM 33 is a non-volatile memory, and stores data such as a program loaded in the RAM 32 and executed by the CPU 31.
- the input unit 34 includes a switch, a microphone, and the like.
- The data input from the input unit 34 is supplied to the CPU 31 via the input / output interface 40 and the bus 39.
- the output unit 35 includes a projector, a display, a speaker, and the like.
- the projector is installed in the vehicle and projects image information, text information, and the like on the ceiling, headrest, door glass, windshield, and the like.
- the output unit 35 may include one or more projectors that project information onto one or more parts.
- the display may be an image display (liquid crystal display, organic EL (Electro-Luminescence), etc.) installed on a dashboard, ceiling, headrest, door glass, windshield, or the like.
- the information processing device 21 of FIG. 1 does not have to have a function of displaying information, and may not have an output unit 35. Further, the function of displaying information may be included in the in-vehicle system 22.
- the storage unit 36 includes a hard disk, a non-volatile memory, and the like.
- the storage unit 36 stores an application program executed by the CPU 31, data referenced when an arbitrary program is executed by the CPU 31, and the like.
- The sensor 37 includes an occupant observation sensor (described later) that observes an occupant (a user of the equipment) and a seat-related sensor (described later) that detects the state of each seat (the equipment) of the automobile.
- the sensor 37 is representative of the sensor included in the information processing device 21, and is not limited to a specific type and a specific number.
- the data acquired by the sensor 37 is supplied to the CPU 31 via the input / output interface 40 and the bus 39.
- the communication unit 38 includes a network interface and the like.
- the communication unit 38 is connected to the in-vehicle system 22 so as to be able to communicate by wire or wirelessly.
- The CPU 31 supplies an instruction signal or the like, which instructs execution of a predetermined process or control, to a control unit included in the in-vehicle system 22 via the communicably connected communication unit 38.
- the CPU 31 can acquire the data obtained by the sensor 23 used in the in-vehicle system 22 via the in-vehicle system 22 and the communication unit 38 which are connected by communication. Therefore, the information processing device 21 can use the sensor 23 used in the in-vehicle system 22.
- The communication unit 38 may have a communication interface such as a wireless LAN (Local Area Network), Bluetooth (registered trademark), or mobile communication, and may be connectable to a terminal such as a smartphone or to the Internet.
- FIG. 2 is a block diagram showing a functional configuration example of the information processing device 21 that functionally represents the information processing device 21 of FIG.
- the information processing device 21 of FIG. 2 has an output unit 35, a processing unit 41, a seat-related sensor 42, and an occupant observation sensor 43.
- the processing unit 41 sets the range observed by the occupant observation sensor 43 (hereinafter referred to as the sensing range) and the like based on the detection information (hereinafter referred to as the seat information) detected by the seat-related sensor 42. Further, the processing unit 41 detects the state of the occupant based on the detection information (hereinafter referred to as observation information) detected by the occupant observation sensor 43, and executes the response processing corresponding to the state of the occupant. The processing in the processing unit 41 will be described later.
- the seat-related sensor 42 is a sensor provided as the sensor 23 or the sensor 37 in FIG. 1, and is a sensor that detects the state of the seat.
- The condition of the seat also includes the condition of ancillary equipment related to the seat (reclining mechanism, seat belt, steering wheel, etc.). Therefore, the seat-related sensor 42 includes a seating sensor, a reclining sensor, a seatbelt sensor, a seat position sensor, a handle sensor, and the like provided for each seat. The seat-related sensors 42 are provided for each seat (except for sensors that apply only to some seats), and the processing unit 41 acquires seat information from the seat-related sensors 42 of all the seats.
- the seat-related sensor 42 is basically a sensor provided in the automobile regardless of the present technology, and the presence / absence, position, orientation, posture, etc. of the occupant are estimated using these sensors.
- the seat-related sensor 42 may be a sensor installed separately from the sensor provided in the automobile.
- the seat-related sensor 42 may be a motion sensor.
- the occupant observation sensor 43 is a sensor provided as the sensor 23 or the sensor 37 in FIG. 1, and is a sensor for observing the occupant.
- the occupant observation sensor 43 is, for example, an image sensor (camera) or a depth sensor that acquires spatial information.
- the image sensor may be a visible light camera or an infrared camera.
- the depth sensor may be a sensor that acquires three-dimensional information (depth information) of the shooting space by a stereo camera, or may be a ToF (time of flight) sensor or the like.
- the occupant observation sensor 43 is provided for each row with respect to the arrangement of seats (front row, middle row, rear row, etc.) in the vehicle, and one occupant observation sensor 43 observes occupants of a plurality of seats.
- a plurality of occupant observation sensors 43 are provided when the seats are arranged in two or more rows, and the processing unit 41 acquires observation information supplied from all the occupant observation sensors 43.
- one occupant observation sensor 43 may observe one or three or more people. Further, the occupant observation sensor 43 is not limited to a specific arrangement, and may be arranged for each seat. Further, the occupant observation sensor 43 may observe a part of the occupants in the vehicle.
- the occupant observation sensor 43 may be a sensor that acquires physical information (including biological information) such as the occupant's body temperature, heart rate, sleep state, wakefulness state, and emotions.
- the processing unit 41 includes a sensing setting unit 44, an occupant state detection unit 45, an operation recognition unit 46, and an operation response unit 47.
- the sensing setting unit 44 sets the sensing range, detection density, and detection frequency of the occupant observation sensor 43 based on the seat information acquired from the seat-related sensor 42.
- the sensing range of the occupant observation sensor 43 indicates the observation range effectively observed by the occupant observation sensor 43 or the range of observation information effectively detected.
- the sensing range of the occupant observation sensor 43 is set to a range that excludes as much as possible the range in which no occupant exists within the maximum observable observation range of the occupant observation sensor 43. For example, the sensing range becomes smaller as the number of occupants decreases.
- the detection density of the occupant observation sensor 43 represents the density of the observation points (detection points) of the observation information effectively detected by the occupant observation sensor 43.
- the detection density represents the pixel density of the captured image as observation information.
- the detection frequency of the occupant observation sensor 43 represents the frequency of detection effectively performed by the occupant observation sensor 43.
- the detection frequency represents the number of detections per unit time.
- The smaller the sensing range of the occupant observation sensor 43, that is, the smaller the amount of information acquired from the occupant observation sensor 43 in one detection, the larger one or both of the detection density and the detection frequency of the occupant observation sensor 43 are set. As a result, the accuracy with which the occupant state detection unit 45 (described later) detects the occupant state can be improved without increasing the processing load on the processing unit 41 and the like.
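- The following is a minimal sketch, in Python, of the trade-off described above: when the sensing range shrinks, the detection density and/or detection frequency can be raised while the data volume per unit time stays within a fixed processing budget. The function name and the budget model (range area multiplied by density and frequency) are illustrative assumptions, not elements disclosed in this application.
```python
def choose_density_and_frequency(range_area, max_range_area,
                                 ref_density, ref_frequency, boost_density=True):
    """Scale the density or the frequency inversely with the sensing-range area."""
    if range_area <= 0:
        return ref_density, ref_frequency   # nothing to observe: keep references
    headroom = max_range_area / range_area  # >= 1 when the sensing range shrank
    if boost_density:
        return ref_density * headroom, ref_frequency
    return ref_density, ref_frequency * headroom

# The sensing range was halved, so the density may be doubled while the data
# volume per unit time stays roughly constant.
print(choose_density_and_frequency(0.5, 1.0, 30.0, 10.0))   # (60.0, 10.0)
```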
- the sensing setting unit 44 sets the operation of each occupant observation sensor 43 based on the set sensing range, detection density, and detection frequency, and changes the processing operation of each occupant observation sensor 43.
- the occupant state detection unit 45 acquires the observation information supplied from the occupant observation sensor 43 and also acquires the information of the sensing range set by the sensing setting unit 44.
- the occupant state detection unit 45 detects the state of each occupant based on the observation information from each occupant observation sensor 43 and supplies the state to the operation recognition unit 46.
- The occupant state detection unit 45 extracts the physical feature points of each occupant from the continuously obtained observation information, detects the positions, orientations, and movements of the hands, arms, eyes, face, and the like, and thereby detects the behavior of the occupant (gestures, etc.).
- the occupant state detection unit 45 may detect the facial expression of the occupant and other physical information as the occupant state.
- Even when the body of one occupant is observed dividedly by a plurality of occupant observation sensors 43, the occupant state detection unit 45 can grasp, from the information on the sensing ranges supplied by the sensing setting unit 44, whether and across which occupant observation sensors 43 the observation information of that occupant is divided and acquired (described later).
- Based on the sensing range information from the sensing setting unit 44, the occupant state detection unit 45 integrates (links) the observation information of an occupant that has been acquired dividedly across the sensing ranges of a plurality of occupant observation sensors 43. As a result, the occupant state detection unit 45 can appropriately detect the occupant state even for an occupant observed dividedly by a plurality of occupant observation sensors 43.
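- As a hedged illustration of this integration step, the sketch below regroups observation fragments per occupant (per seat) using the sensing-range assignment of each sensor; the data layout (dictionaries keyed by sensor and seat identifiers) is an assumption made only for this example.
```python
from collections import defaultdict

def group_observations_by_seat(observations, range_assignments):
    """observations: {sensor_id: {seat_id: observation_fragment}}
    range_assignments: {sensor_id: set of seat_ids covered by that sensor's sensing range}
    Returns {seat_id: [fragments from every sensor whose sensing range covers that seat]}."""
    per_seat = defaultdict(list)
    for sensor_id, fragments in observations.items():
        covered = range_assignments.get(sensor_id, set())
        for seat_id, fragment in fragments.items():
            if seat_id in covered:
                per_seat[seat_id].append(fragment)
    return dict(per_seat)

# The occupant in seat 2 is seen partly by sensor 161 and partly by sensor 162.
obs = {161: {2: "upper body depth patch"}, 162: {2: "lower body depth patch"}}
ranges = {161: {1, 2}, 162: {2, 3}}
print(group_observations_by_seat(obs, ranges))
# {2: ['upper body depth patch', 'lower body depth patch']}
```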
- the operation recognition unit 46 recognizes an operation associated with the occupant state detected by the occupant state detection unit 45 in advance.
- For example, suppose the occupant state detection unit 45 detects the movement (state) of an occupant who holds a hand up between the driver's seat and the passenger seat and draws a circle with the fingers, and that information indicating this occupant state is supplied from the occupant state detection unit 45 to the operation recognition unit 46.
- the operation recognition unit 46 recognizes, for example, an operation of raising the volume of the audio system included in the in-vehicle system 22 as an operation associated with the state of the occupant in advance.
- the operation recognition unit 46 recognizes, for example, an operation of generating a warning sound, an operation of decelerating, or an operation of evacuating to the shoulder as an operation associated with the state of the occupant in advance.
- the operation recognition unit 46 supplies the operation recognized for the occupant's state from the occupant state detection unit 45 to the operation response unit 47.
- the operation response unit 47 executes response processing corresponding to the operation recognized by the operation recognition unit 46. For example, it is assumed that the operation recognition unit 46 recognizes an operation of raising the volume of the audio system as described above, and the operation recognition unit 46 supplies information indicating the operation to the operation response unit 47. The operation response unit 47 gives an instruction to raise the volume to the control unit of the audio system included in the in-vehicle system 22 as a response process corresponding to the operation.
- the operation recognition unit 46 recognizes the operation of decelerating as described above, and the information indicating the operation is supplied to the operation response unit 47.
- the operation response unit 47 gives a deceleration instruction to the vehicle control control unit included in the in-vehicle system 22 as a response process corresponding to the operation.
- The operation response unit 47 starts the designated application, executes the processing of the application, and displays the information presented by the application by supplying it to the output unit 35.
- the operation response unit 47 executes the processing of the application corresponding to the operation.
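- A minimal sketch of such an operation-to-response dispatch is shown below; the handler table and operation names are illustrative assumptions, since the application only states that the operation response unit 47 issues instructions (for example, raising the volume or decelerating) to the relevant control unit of the in-vehicle system 22.
```python
def send_instruction(target_unit, command):
    # Stand-in for communication over the in-vehicle network to a control unit.
    print(f"instruct {target_unit}: {command}")

OPERATION_HANDLERS = {
    "raise_volume": lambda: send_instruction("audio_control_unit", "volume_up"),
    "decelerate":   lambda: send_instruction("vehicle_control_unit", "decelerate"),
    "warn":         lambda: send_instruction("body_control_unit", "sound_warning"),
}

def respond_to_operation(operation):
    handler = OPERATION_HANDLERS.get(operation)
    if handler is not None:
        handler()

respond_to_operation("raise_volume")  # instruct audio_control_unit: volume_up
```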
- FIG. 3 is a flowchart showing a processing example executed by the processing unit 41 of FIG.
- In step S11, the sensing setting unit 44 of the processing unit 41 performs the sensing setting process based on the seat information from the seat-related sensor 42.
- the sensing setting unit 44 sets the sensing range, the detection density, and the detection frequency of the occupant observation sensor 43 by the sensing setting process. The process proceeds from step S11 to step S12.
- In step S12, the occupant state detection unit 45 of the processing unit 41 detects the occupant state based on the observation information from the occupant observation sensor 43.
- the occupant state detection unit 45 supplies the detected occupant state to the operation recognition unit 46.
- the process proceeds from step S12 to step S13.
- In step S13, the operation recognition unit 46 of the processing unit 41 recognizes the operation associated with the occupant state supplied from the occupant state detection unit 45 in step S12.
- the operation recognition unit 46 supplies the recognized operation to the operation response unit 47.
- the process proceeds from step S13 to step S14.
- In step S14, the operation response unit 47 of the processing unit 41 executes the response processing corresponding to the operation supplied from the operation recognition unit 46 in step S13.
- the process returns from step S14 to step S11, and repeats step S11 and subsequent steps.
- As described above, a predetermined operation is automatically recognized from the state of an occupant of the automobile, and response processing corresponding to the recognized operation is performed.
- For example, the occupant can operate the switches and screens of the navigation system and the audio system by movements of his or her hands and arms, without directly touching them.
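- The repeating loop of FIG. 3 (steps S11 to S14) can be summarized as in the following sketch, where the class and method names are placeholders standing in for the sensing setting unit 44, the occupant state detection unit 45, the operation recognition unit 46, and the operation response unit 47.
```python
def processing_loop(sensing_setting_unit, occupant_state_detector,
                    operation_recognizer, operation_responder,
                    seat_sensors, occupant_sensors, keep_running=lambda: True):
    while keep_running():
        # S11: set the sensing range, detection density and detection frequency.
        sensing = sensing_setting_unit.configure(seat_sensors.read())
        # S12: detect each occupant's state from the observation information.
        states = occupant_state_detector.detect(occupant_sensors.read(), sensing)
        # S13: recognize the operation associated in advance with that state.
        operation = operation_recognizer.recognize(states)
        # S14: execute the response processing for the recognized operation.
        if operation is not None:
            operation_responder.respond(operation)
```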
- the sensing setting unit 44 sets the sensing range of the occupant observation sensor 43 based on the seat information from the seat-related sensor 42.
- the sensing setting unit 44 detects the state of each seat based on the seat information from the seat-related sensor 42 installed for each seat in the vehicle.
- the seat-related sensor 42 includes a seating sensor, a reclining sensor, a seatbelt sensor, a seat position sensor, and the like.
- the seating sensor detects whether or not the occupant is seated (seat state or non-seat state). That is, the presence or absence of seating of each seat is detected as a state of whether or not each seat, which is a facility provided in the vehicle, is used.
- the state of the seat includes the form of the seat, and the reclining state (tilted state) or the non-reclining state (standing state) of the seat is detected by the reclining sensor.
- the state of the seat includes the state of ancillary equipment related to the seat, and the seatbelt sensor detects the wearing state or the non-wearing state of the seatbelt.
- the seat position and the orientation of the seat that can be moved back and forth are detected by the seat position sensor.
- the seat-related sensor 42 may include, for example, a handle sensor for detecting whether or not the handle is gripped for the driver's seat or the like.
- the position of the occupant existing in the vehicle is estimated from the state of each seat detected by the seat information from the seat-related sensor 42.
- For example, suppose the sensing setting unit 44 detects the state of each seat based only on the seat information from the seating sensor, which is one form of the seat-related sensor 42. When the seated state is detected for a seat, it is presumed that an occupant exists in the range around the seating surface of that seat. Conversely, when the non-seated state is detected, it is presumed that no occupant exists in the range around the seating surface of that seat.
- the sensing setting unit 44 may estimate the position of the occupant based on the state of equipment other than the seats provided in the vehicle.
- The sensing setting unit 44 uses partial sensing ranges to set the sensing range of each occupant observation sensor 43 so that the estimated existence range of each occupant, derived from the state of each seat, becomes the sensing range of that occupant observation sensor 43.
- A partial sensing range is the observation range used when an occupant seated in one seat is observed individually by the occupant observation sensor 43, or the range of observation information used when the observation information of an occupant seated in one seat is detected individually by the occupant observation sensor 43.
- the partial sensing range is predetermined for each seat and for each seat condition and is stored in the storage unit 36 or ROM 33 of FIG.
- each seat is associated and stored with a partial sensing range for observing an occupant seated in that seat.
- A different partial sensing range is associated with and stored for each state of the seat.
- the position of the partial sensing range may be changed according to the position of the seat or the like, and the size of the sensing range may be changed as appropriate.
- For example, when the sensing setting unit 44 detects the state of each seat based only on the seat information from the seating sensor, which is one form of the seat-related sensor 42, the sensing setting unit 44 enables the partial sensing range associated with each seat detected as being in the seated state, and disables the partial sensing range associated with each seat in the non-seated state.
- The sensing setting unit 44 sets, as the sensing range of each occupant observation sensor 43, the enabled (effective) partial sensing ranges within the maximum sensing range of that occupant observation sensor 43.
- the maximum sensing range of each occupant observation sensor 43 is a sensing range when all the partial sensing ranges included in the observation range of each occupant observation sensor 43 are enabled.
- the maximum observable observation range of each occupant observation sensor 43 may be set as the maximum sensing range.
- one partial sensing range may be divided into observation ranges of a plurality of occupant observation sensors 43.
- When such a partial sensing range is set as a sensing range, the portion of the partial sensing range that falls within the observation range of each of the plurality of occupant observation sensors 43 becomes the sensing range of that occupant observation sensor 43.
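- The sketch below illustrates, under assumed data structures, how pre-stored partial sensing ranges could be enabled per seat state and combined into one sensor's sensing range; the rectangle coordinates and the table of partial ranges are invented for illustration only.
```python
# Partial sensing ranges per (seat, seat state), as they might be stored in the
# storage unit 36 or ROM 33. The rectangles (x0, y0, x1, y1) are made-up values.
PARTIAL_RANGES = {
    ("54A", "seated"): (0, 0, 320, 480),
    ("54B", "seated"): (320, 0, 640, 480),
}

def sensing_range_for_sensor(seat_states, seats_in_view):
    """Union (bounding box) of the enabled partial ranges of the seats in view."""
    enabled = [PARTIAL_RANGES[(seat, seat_states.get(seat))]
               for seat in seats_in_view
               if (seat, seat_states.get(seat)) in PARTIAL_RANGES]
    if not enabled:                      # every seat empty: nothing to sense
        return None
    return (min(r[0] for r in enabled), min(r[1] for r in enabled),
            max(r[2] for r in enabled), max(r[3] for r in enabled))

# Only seat 54A is occupied, so the sensing range shrinks to its partial range.
print(sensing_range_for_sensor({"54A": "seated", "54B": "empty"}, ["54A", "54B"]))
# (0, 0, 320, 480)
```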
- the sensing setting unit 44 sets the detection density and the detection frequency of each occupant observation sensor 43 based on the sensing range set for each occupant observation sensor 43.
- the detection density of the occupant observation sensor 43 represents the density of observation points where observation information (observation data) is detected by the occupant observation sensor 43 within the sensing range of the occupant observation sensor 43.
- the detection density of the occupant observation sensor 43 is, for example, the pixel density of the captured image acquired by the image sensor when the occupant observation sensor 43 is an image sensor, and when the occupant observation sensor 43 is a depth sensor. It is the pixel density of the depth image acquired by the depth sensor.
- the detection frequency of the occupant observation sensor 43 is the number of times the observation information of the entire sensing range of the occupant observation sensor 43 is acquired by the occupant observation sensor 43 per unit time.
- the detection frequency of the occupant observation sensor 43 is, for example, the number of frames (frame rate) of the captured image acquired by the image sensor per unit time when the occupant observation sensor 43 is an image sensor, and the occupant observation sensor 43 determines. In the case of a depth sensor, it is the number of frames (frame rate) of the depth image acquired by the depth sensor per unit time.
- For example, in the state assumed to require the most processing capacity of the information processing device 21, that is, the state in which the sensing ranges of all the occupant observation sensors 43 are set to their maximum sensing ranges, the detection density and detection frequency of each occupant observation sensor 43 are set so that the processing amount does not exceed the limit of the processing capacity of the information processing device 21.
- the detection density and detection frequency of each occupant observation sensor 43 at this time are referred to as a reference detection density and a reference detection frequency.
- the reference detection density and the reference detection frequency do not necessarily have to match between the occupant observation sensors 43.
- When the sensing range of an occupant observation sensor 43 is smaller than its maximum sensing range, the sensing setting unit 44 sets at least one of the detection density and the detection frequency of that occupant observation sensor 43 to be larger than the reference detection density or the reference detection frequency. Suppose, for example, that the detection density is set to D times the reference detection density and the detection frequency is set to E times the reference detection frequency.
- the sensing range of the occupant observation sensor 43 is set to an appropriate range with less waste according to the position of the occupant existing in the vehicle. Further, according to the sensing range of the occupant observation sensor 43, the detection density or the detection frequency of the occupant observation sensor 43 is appropriately set within a range that does not exceed the processing capacity of the information processing device 21.
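- The following numerical sketch makes the implied constraint concrete: with the sensing range reduced to a fraction r of the maximum, the factors D (density) and E (frequency) can be raised as long as the processing amount, assumed here to be proportional to r times D times E, does not exceed the amount handled at the reference settings. The specific budget formula is an assumption for illustration.
```python
def max_density_factor(range_fraction, frequency_factor=1.0):
    """Largest D such that range_fraction * D * frequency_factor <= 1."""
    if range_fraction <= 0:
        return 1.0                     # no valid range: keep the reference value
    return 1.0 / (range_fraction * frequency_factor)

# Example 1 below: one of two equal partial ranges is enabled (r = 0.5),
# so the detection density can be raised to about 2 times the reference.
print(max_density_factor(0.5))   # 2.0
# Example 2 below: r is about two fifths (0.4), allowing roughly 2.5 times.
print(max_density_factor(0.4))   # 2.5
```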
- FIG. 4 is a flowchart showing a processing example of the sensing setting (process of step S11 in FIG. 3) performed by the sensing setting unit 44.
- In step S31, the sensing setting unit 44 detects the state of each seat based on the seat information from the seat-related sensor 42. The process proceeds from step S31 to step S32.
- In step S32, the sensing setting unit 44 sets the variable n representing the seat number to 1.
- The variable n represents the seat number when the seats in the vehicle are numbered 1 to N, and the seat whose seat number is n is referred to as the seat n.
- The process proceeds from step S32 to step S33.
- In step S33, the sensing setting unit 44 enables or disables the partial sensing range according to the state of the seat n detected in step S31. Specifically, when the seat n is in the seated state, the sensing setting unit 44 enables the partial sensing range corresponding to the seat n. When the seat n is in the non-seated state, the partial sensing range corresponding to the seat n is disabled. The process proceeds from step S33 to step S34.
- In step S34, the sensing setting unit 44 determines whether or not the variable n (seat n) is N.
- If it is determined in step S34 that the variable n is not N, the process proceeds to step S35, and the sensing setting unit 44 increments the value of the variable n. The process returns to step S33, and steps S33 and S34 are repeated.
- If it is determined in step S34 that the variable n is N, the process proceeds from step S34 to step S36.
- In step S36, the sensing setting unit 44 sets the sensing range of each occupant observation sensor 43 based on the partial sensing ranges enabled in step S33 (the effective partial sensing ranges).
- The process proceeds from step S36 to step S37.
- In step S37, the sensing setting unit 44 sets the detection density and detection frequency of each occupant observation sensor 43 based on the sensing range of each occupant observation sensor 43 set in step S36.
- When the process of step S37 is completed, the processing of this flowchart ends.
- the sensing range of the occupant observation sensor 43 is set to an appropriate range with few unnecessary ranges according to the position of the occupant existing in the vehicle. Further, according to the sensing range of the occupant observation sensor 43, the detection density or the detection frequency of the occupant observation sensor 43 is appropriately set within a range that does not exceed the processing capacity of the information processing device 21.
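- As a compact sketch of the flow of FIG. 4 (steps S31 to S37), the function below detects each seat's state, enables the corresponding partial sensing range per seat, derives each sensor's sensing range, and then sets the density and frequency; the helper names are placeholders for the processing described in the text, not identifiers from this application.
```python
def sensing_setting(seat_related_sensors, occupant_sensors,
                    lookup_partial_range, union_of, choose_density_and_frequency):
    # S31: detect the state of every seat from the seat-related sensors.
    seat_states = {seat: sensor.read_state()
                   for seat, sensor in seat_related_sensors.items()}

    # S32-S35: loop over the seats and enable the partial range matching each state.
    enabled = {}
    for seat, state in seat_states.items():
        partial_range = lookup_partial_range(seat, state)   # None when disabled
        if partial_range is not None:
            enabled[seat] = partial_range

    # S36: each sensor's sensing range combines the enabled partial ranges of
    # the seats it can observe.
    for sensor in occupant_sensors:
        in_view = [enabled[s] for s in sensor.seats_in_view if s in enabled]
        sensor.sensing_range = union_of(in_view) if in_view else None
        # S37: density and frequency follow from the resulting sensing range.
        sensor.detection_density, sensor.detection_frequency = \
            choose_density_and_frequency(sensor.sensing_range)
```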
- FIG. 5 is a diagram illustrating the arrangement of the seat and the occupant observation sensor in the first embodiment of the sensing setting process.
- a front row seat portion 53 and a rear row seat portion 54 are arranged in the vehicle interior 52 of the automobile 51.
- A seat 53A, which is the driver's seat, and a seat 53B, which is the passenger seat, are arranged in the front row seat portion 53.
- Seats 54A and 54B are arranged in the rear row seat portion 54.
- In the vehicle interior 52, an occupant observation sensor 61 and an occupant observation sensor 62, which are forms of the occupant observation sensor 43 of FIG. 2, are arranged.
- the occupant observation sensor 61 is arranged at a position near the center of the width of the entire front row seat portion 53 and above the position near the front edge of the seat surface of the front row seat portion 53.
- the occupant observation sensor 61 observes the occupant (driver) seated in the seat 53A and the occupant seated in the seat 53B from diagonally above the front side.
- the occupant observation sensor 62 is arranged at a position near the center of the width of the entire rear row seat portion 54 and above the position near the front edge of the seat surface of the rear row seat portion 54.
- the occupant observation sensor 62 observes the occupants seated in the seats 54A and 54B from diagonally above the front side.
- a seating sensor which is a form of the seat-related sensor 42, is installed in each of the seats 53A, 53B, 54A, and 54B.
- FIG. 6 is a diagram for explaining the sensing range of the occupant observation sensor 62 of FIG. 5 in the first embodiment.
- In FIG. 6, occupants P1 and P2 are seated in the seats 54A and 54B of the rear row seat portion 54 of FIG. 5, respectively.
- A seating sensor 91 and a seating sensor 92, which are one form of the seat-related sensor 42 of FIG. 2, are installed in the seats 54A and 54B, respectively.
- The observation range of the occupant observation sensor 62 includes a partial sensing range 101-1, which is the sensing range when only the occupant P1 seated in the seat 54A is observed (the partial sensing range corresponding to the seated seat 54A), and a partial sensing range 101-2, which is the sensing range when only the occupant P2 seated in the seat 54B is observed (the partial sensing range corresponding to the seated seat 54B).
- the maximum sensing range 101 that surrounds the entire partial sensing range 101-1 and 101-2 is a range that combines both the partial sensing range 101-1 and the partial sensing range 101-2.
- These partial sensing ranges 101-1 and 101-2 are predetermined ranges, and are associated with the seated states of the seats 54A and 54B, respectively, and stored in the storage unit 36 of FIG. 1 or the like.
- the sensing setting unit 44 acquires an on signal from the seating sensors 91 and 92. As a result, the sensing setting unit 44 detects that the seats 54A and 54B are in the seated state.
- the sensing setting unit 44 enables the partial sensing range 101-1 corresponding to the seat 54A in the seated state. Further, the sensing setting unit 44 enables the partial sensing range 101-2 corresponding to the seat 54B in the seated state.
- the sensing setting unit 44 sets the maximum sensing range 101, which is the combination of the valid partial sensing ranges 101-1 and 101-2, as the sensing range of the occupant observation sensor 62.
- the occupant state detection unit 45 acquires the observation information of the maximum sensing range 101 from the occupant observation sensor 43 and detects the states of the occupants P1 and P2. Further, the detection density and the detection frequency of the occupant observation sensor 62 at this time are set to a predetermined reference detection density and the reference detection frequency.
- FIG. 7 is a diagram for explaining the sensing range of the occupant observation sensor 62 of FIG. 5 in the first embodiment.
- FIG. 7 is different from the case of FIG. 6 in that no occupant is seated in the seat 54B of the rear row seat portion 54.
- the sensing setting unit 44 acquires an on signal from the seating sensor 91 and an off signal from the seating sensor 92. As a result, the sensing setting unit 44 detects that the seat 54A is in the seated state and the seat 54B is in the non-seat state.
- the sensing setting unit 44 enables the partial sensing range 101-1 corresponding to the seat 54A in the seated state.
- the sensing setting unit 44 invalidates the partial sensing range 101-2 corresponding to the case where the non-seated seat 54B is in the seated state.
- the sensing setting unit 44 sets the valid partial sensing range 101-1 as the sensing range of the occupant observation sensor 62.
- In this case, the sensing range of the occupant observation sensor 62 amounts to about half the information (area) of the maximum sensing range 101, which is the sensing range of the occupant observation sensor 62 when two occupants are present as in FIG. 6.
- the sensing range of the occupant observation sensor 62 is limited to an appropriate range according to the occupant's existence range, and the burden of processing with unnecessary information is reduced.
- the sensing setting unit 44 can set the detection density of the occupant observation sensor 62 to twice the reference detection density, for example, after limiting the sensing range.
- the reference detection density is the detection density when the sensing range of the occupant observation sensor 62 is the maximum sensing range as described above. As a result, the density of the observation information acquired by the occupant state detection unit 45 from the occupant observation sensor 43 is doubled.
- FIG. 8 is a diagram illustrating a depth image (observation information) in FIG. 6 in which the detection density of the occupant observation sensor 62 is set to the reference detection density when the occupant observation sensor 62 is used as a depth sensor.
- the occupant images P1A and P2A are depth images of the occupants P1 and P2 in FIG. 6, respectively.
- FIG. 9 is a diagram illustrating a depth image (observation information) of the scene of FIG. 7 when the occupant observation sensor 62 is a depth sensor and its detection density is set to twice the reference detection density.
- The occupant image P1B is a depth image of the occupant P1 in FIG. 7.
- The resolution of the occupant image P1B of FIG. 9 is higher than that of the occupant image P1A of FIG. 8. Therefore, by setting the detection density of the occupant observation sensor 62 to twice the reference detection density, the detection accuracy of the occupant state detection unit 45, which detects the occupant state based on the observation information from the occupant observation sensor 62, is improved.
- FIG. 10 is a flowchart showing a processing example performed by the sensing setting unit 44 in the first embodiment of FIGS. 5 to 9.
- In step S51, the sensing setting unit 44 detects the states of the seats 54A and 54B based on the seat information from the seating sensors 91 and 92, which are seat-related sensors 42. The process proceeds from step S51 to step S52.
- In step S52, the sensing setting unit 44 sets the variable n representing the seat number to 1.
- Here, the seat number of the seat 54A is 1, and the seat number of the seat 54B is 2.
- The process proceeds from step S52 to step S53.
- In step S53, the sensing setting unit 44 determines whether or not the seat n is in the seated state.
- The seat n represents the seat 54A when n is 1, and represents the seat 54B when n is 2.
- If it is determined in step S53 that the seat n is in the seated state, the process proceeds from step S53 to step S54, and the sensing setting unit 44 enables the partial sensing range corresponding to the seated state of the seat n (the seat n in the seated state).
- The partial sensing range corresponding to the seat n in the seated state is the partial sensing range 101-1 shown in FIGS. 6 and 7 when the variable n is 1.
- The partial sensing range corresponding to the seat n in the seated state is the partial sensing range 101-2 shown in FIGS. 6 and 7 when the variable n is 2.
- The process proceeds from step S54 to step S56.
- If it is determined in step S53 that the seat n is not in the seated state, the process proceeds from step S53 to step S55, and the sensing setting unit 44 disables the partial sensing range corresponding to the seated state of the seat n. The process proceeds from step S55 to step S56.
- In step S56, the sensing setting unit 44 determines whether or not the variable n is 2.
- If it is determined in step S56 that the variable n is not 2, the process proceeds to step S57, and the sensing setting unit 44 increments the value of the variable n. The process returns from step S57 to step S53.
- If it is determined in step S56 that the variable n is 2, the process proceeds from step S56 to step S58.
- In step S58, the sensing setting unit 44 sets the partial sensing range(s) enabled in step S54 as the sensing range of the occupant observation sensor 62. The process proceeds from step S58 to step S59.
- In step S59, the sensing setting unit 44 determines whether or not both of the partial sensing ranges 101-1 and 101-2 corresponding to the two seats 54A and 54B are valid.
- If it is determined in step S59 that both of the partial sensing ranges 101-1 and 101-2 corresponding to the two seats 54A and 54B are valid, the process proceeds from step S59 to step S60.
- In step S60, the sensing setting unit 44 sets the detection density of the occupant observation sensor 62 to 1 times the reference detection density. Since this flowchart does not assume a change in the detection frequency, the sensing setting unit 44 also sets the detection frequency of the occupant observation sensor 62 to 1 times the reference detection frequency. When the process of step S60 is completed, the processing of this flowchart ends.
- If it is determined in step S59 that it is not the case that both of the partial sensing ranges 101-1 and 101-2 are valid, that is, if at least one of the partial sensing ranges 101-1 and 101-2 is invalid, the process proceeds from step S59 to step S61.
- In step S61, the sensing setting unit 44 determines whether or not the partial sensing range 101-1 or 101-2 corresponding to one of the seats 54A and 54B is valid.
- If it is determined in step S61 that the partial sensing range 101-1 or 101-2 corresponding to one of the seats 54A and 54B is valid, the process proceeds from step S61 to step S62.
- In step S62, the sensing setting unit 44 sets the detection density of the occupant observation sensor 62 to twice the reference detection density, and sets the detection frequency of the occupant observation sensor 62 to 1 times the reference detection frequency.
- If it is determined in step S61 that the partial sensing range corresponding to one of the seats 54A and 54B is not valid either, both of the partial sensing ranges 101-1 and 101-2 are invalid, and no observation information is detected by the occupant observation sensor 62 in the first place. Therefore, the sensing setting unit 44 does not set the detection density and detection frequency of the occupant observation sensor 62. The processing of this flowchart then ends.
- As described above, the sensing range of the occupant observation sensor 62 is limited to an appropriate range according to the existence range of the occupant, and the burden of processing unnecessary information is reduced. Further, since the detection density of the occupant observation sensor 62 is set to twice the reference detection density when there is only one occupant, the detection accuracy of the occupant state detection unit 45, which detects the occupant state, and the recognition accuracy of the operation recognition unit 46, which recognizes the operation, are improved.
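- The decision rule of FIG. 10 for the rear-row occupant observation sensor 62 can be sketched as follows: both rear seats occupied gives the reference density, exactly one occupied halves the sensing range and doubles the density, and neither occupied leaves nothing to detect. The function and argument names are illustrative.
```python
def rear_row_density_factor(seat_54a_occupied, seat_54b_occupied):
    valid_ranges = int(seat_54a_occupied) + int(seat_54b_occupied)
    if valid_ranges == 2:
        return 1.0           # S60: reference detection density
    if valid_ranges == 1:
        return 2.0           # S62: twice the reference detection density
    return None              # no valid partial range: no observation at all

print(rear_row_density_factor(True, True))    # 1.0
print(rear_row_density_factor(True, False))   # 2.0
print(rear_row_density_factor(False, False))  # None
```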
- the sensing setting unit 44 may change the detection frequency of the occupant observation sensor 62 according to the sensing range of the occupant observation sensor 62.
- FIG. 11 is a flowchart showing a processing example performed by the sensing setting unit 44 in the first embodiment of FIGS. 5 to 9 when the detection frequency is changed according to the sensing range.
- Since steps S81 to S89 are common to steps S51 to S59 of FIG. 10, the description of steps S81 to S89 will be omitted.
- If it is determined in step S89 that both of the partial sensing ranges 101-1 and 101-2 corresponding to the two seats 54A and 54B are valid, the process proceeds from step S89 to step S90.
- In step S90, the sensing setting unit 44 sets the detection frequency of the occupant observation sensor 62 to 1 times the reference detection frequency. Since this flowchart does not assume a change in the detection density, the sensing setting unit 44 also sets the detection density of the occupant observation sensor 62 to 1 times the reference detection density. When the process of step S90 is completed, the processing of this flowchart ends.
- If it is determined in step S89 that it is not the case that both of the partial sensing ranges 101-1 and 101-2 are valid, that is, if at least one of the partial sensing ranges 101-1 and 101-2 is invalid, the process proceeds from step S89 to step S91.
- In step S91, the sensing setting unit 44 determines whether or not the partial sensing range 101-1 or 101-2 corresponding to one of the seats 54A and 54B is valid.
- If it is determined in step S91 that the partial sensing range 101-1 or 101-2 corresponding to one of the seats 54A and 54B is valid, the process proceeds from step S91 to step S92.
- In step S92, the sensing setting unit 44 sets the detection frequency of the occupant observation sensor 62 to twice the reference detection frequency, and sets the detection density of the occupant observation sensor 62 to 1 times the reference detection density.
- If it is determined in step S91 that the partial sensing range corresponding to one of the seats 54A and 54B is not valid either, both of the partial sensing ranges 101-1 and 101-2 are invalid, and no observation information is detected by the occupant observation sensor 62 in the first place. Therefore, the sensing setting unit 44 does not set the detection density and detection frequency of the occupant observation sensor 62. The processing of this flowchart then ends.
- As described above, the sensing range of the occupant observation sensor 62 is limited to an appropriate range according to the existence range of the occupant, and the burden of processing unnecessary information is reduced. Further, since the detection frequency of the occupant observation sensor 62 is set to twice the reference detection frequency when there is only one occupant, the detection accuracy of the occupant state detection unit 45, which detects the occupant state, and the recognition accuracy of the operation recognition unit 46, which recognizes the operation, are improved.
- The sensing setting unit 44 may change both the detection density and the detection frequency of the occupant observation sensor 62 according to the sensing range of the occupant observation sensor 62.
- the arrangement of the seat and the occupant observation sensor in the automobile and the setting of the sensing range of the occupant observation sensor when two occupants (adults) are seated in the rear row seat portion 54 are described in the first embodiment.
- The second embodiment differs from the first embodiment in that the seating sensors 91 and 92 of FIG. 6 detect not only the on / off of seating but also the occupant's weight. Since the other parts are the same as in the first embodiment, their description will be omitted.
- FIG. 12 is a diagram illustrating a sensing range of the occupant observation sensor 62 of FIG. 5 in the second embodiment.
- FIG. 12 differs from the case of FIG. 6 in that no occupant is seated in the seat 54B of the rear row seat portion 54 and that the occupant seated in the seat 54A is a lightweight occupant P3 whose weight is a predetermined weight g0 or less, which is presumed to correspond to a child. The occupants P1 and P2 in FIG. 6 are assumed to be standard occupants heavier than the predetermined weight g0.
- The sensing setting unit 44 acquires from the seating sensor 91 the facts that the seating sensor 91 is on and that the detected weight is g0 or less. Further, the sensing setting unit 44 acquires from the seating sensor 92 the fact that the seating sensor 92 is off.
- As a result, the sensing setting unit 44 detects that the seat 54A is in the seated state and the seat 54B is in the non-seated state. Further, since the detected weight is g0 or less, the sensing setting unit 44 detects that the occupant P3 seated in the seat 54A is a lightweight occupant. In this case, the sensing setting unit 44 detects that the state of the seat 54A is the seated state of a lightweight occupant.
- the sensing setting unit 44 enables the partial sensing range 101-3 corresponding to the case where the seat 54A is in the seated state of the lightweight occupant.
- the sensing setting unit 44 invalidates the partial sensing ranges 101-2 and 101-4 corresponding to the case where the non-seated seat 54B is in the seated state.
- The partial sensing range 101-3 is the sensing range of the occupant observation sensor 62 when observing only a lightweight occupant seated in the seat 54A. Comparing the partial sensing range 101-3 with the partial sensing range 101-1, which is the sensing range of the occupant observation sensor 62 when observing a standard occupant seated in the seat 54A, the vertical width of the partial sensing range 101-3 is particularly small, because a lightweight occupant is presumed to be shorter than a standard occupant.
- The partial sensing range 101-4 is the sensing range of the occupant observation sensor 62 when observing only a lightweight occupant seated in the seat 54B. Similar to the partial sensing range 101-3, the vertical width of the partial sensing range 101-4 is particularly small compared with the partial sensing range 101-2, which is the sensing range of the occupant observation sensor 62 when observing a standard occupant seated in the seat 54B.
- These partial sensing ranges 101-3 and 101-4 are predetermined ranges, associated respectively with the state in which the seat 54A is in the lightweight-occupant seated state and the state in which the seat 54B is in the lightweight-occupant seated state, and stored in the storage unit 36 of FIG. 1 or the like.
- The sensing setting unit 44 sets the enabled partial sensing range 101-3 as the sensing range of the occupant observation sensor 62. As a result, the sensing range becomes smaller than when the partial sensing range 101-1 is set as the sensing range of the occupant observation sensor 62.
- the sensing range of the occupant observation sensor 62 is, for example, about two-fifths of the maximum sensing range 101 (area).
- the sensing range of the occupant observation sensor 62 is limited to an appropriate range in consideration of the height of the occupant and the like, and the burden of processing with unnecessary information is reduced.
- the sensing setting unit 44 sets the detection density of the occupant observation sensor 62 to 2.5 times the reference detection density, for example, after limiting the sensing range. As a result, the density of the observation information acquired by the occupant state detection unit 45 from the occupant observation sensor 43 becomes 2.5 times.
- FIG. 13 is a diagram illustrating a depth image (observation information) of the scene of FIG. 12 when the occupant observation sensor 62 is a depth sensor and its detection density is set to 2.5 times the reference detection density.
- The lightweight occupant image P3A is a depth image of the lightweight occupant P3 in FIG. 12.
- Comparing the lightweight occupant image P3A of FIG. 13 with the occupant image P1B of FIG. 9, for which the detection density of the occupant observation sensor 62 was set to twice the reference detection density, the lightweight occupant image P3A has a higher resolution. Therefore, by setting the detection density of the occupant observation sensor 62 to 2.5 times the reference detection density, the occupant state detection accuracy of the occupant state detection unit 45 and the operation recognition accuracy of the operation recognition unit 46 are improved.
- FIG. 14 is a flowchart showing a processing example performed by the sensing setting unit 44 in the second embodiment of FIGS. 12 and 13.
- step S111 the sensing setting unit 44 detects the state of each of the seats 54A and 54B based on the seat information from the seating sensors 91 and 92, which are seat-related sensors. The process proceeds from step S111 to step S112.
- step S112 the sensing setting unit 44 sets the variable n representing the seat number to 1.
- the seat number of seat 54A is 1, and the seat number of seat 54B is 2.
- the process proceeds from step S112 to step S113.
- step S113 the sensing setting unit 44 detects whether or not the seat n is in the seated state.
- the seat n represents a seat 54A when n is 1, and represents a seat 54B when n is 2.
- step S113 If it is determined in step S113 that the seat n is in the seated state, the process proceeds from step S113 to step S114, and the sensing setting unit 44 determines whether or not the seat n is in the seated state of the lightweight occupant.
- step S114 If it is determined in step S114 that the seat n is not in the seated state of the lightweight occupant, the process proceeds from step S114 to step S115, and the sensing setting unit 44 puts the seat n in the seated state of the non-lightweight occupant (standard occupant). Enable the corresponding partial sensing range.
- the partial sensing range corresponding to the seated state of the non-lightweight occupant of the seat n when the variable n is 1 is the partial sensing range 101-1 shown in FIG.
- the partial sensing range corresponding to the seated state of the non-lightweight occupant of the seat n when the variable n is 2 is the partial sensing range 101-2 shown in FIG.
- the process proceeds from step S115 to step S118.
- If it is determined in step S114 that the seat n is in the seated state of a lightweight occupant, the process proceeds from step S114 to step S116, and the sensing setting unit 44 enables the partial sensing range corresponding to the seated state of a lightweight occupant for the seat n.
- the partial sensing range in which the seat n corresponds to the seated state of the lightweight occupant when the variable n is 1 is the partial sensing range 101-3 shown in FIG.
- the partial sensing range in which the seat n corresponds to the seated state of the lightweight occupant when the variable n is 2 is the partial sensing range 101-4 shown in FIG.
- the process proceeds from step S116 to step S118.
- If it is determined in step S113 that the seat n is not in the seated state, the process proceeds from step S113 to step S117, and the sensing setting unit 44 disables all the partial sensing ranges corresponding to the states of the seat n. The process proceeds from step S117 to step S118.
- In step S118, the sensing setting unit 44 determines whether or not the variable n is 2.
- If it is determined in step S118 that the variable n is not 2, the process proceeds to step S119, and the sensing setting unit 44 increments the value of the variable n. The process returns from step S119 to step S113.
- If it is determined in step S118 that the variable n is 2, the process proceeds from step S118 to step S120.
- In step S120, the sensing setting unit 44 sets the partial sensing ranges enabled in steps S115 and S116 as the sensing range of the occupant observation sensor 62.
- the process proceeds from step S120 to step S121.
- In step S121, the sensing setting unit 44 determines whether or not the partial sensing ranges corresponding to both of the two seats 54A and 54B are enabled.
- If it is determined in step S121 that the partial sensing ranges corresponding to both of the two seats 54A and 54B are enabled, the process proceeds from step S121 to step S122, and the sensing setting unit 44 determines whether or not both of the partial sensing ranges corresponding to the two seats 54A and 54B correspond to the seated state of a lightweight occupant.
- If it is determined in step S122 that the partial sensing ranges corresponding to the two seats 54A and 54B are not both partial sensing ranges corresponding to the seated state of a lightweight occupant, the process proceeds from step S122 to step S123.
- In step S123, the sensing setting unit 44 determines whether or not either of the partial sensing ranges corresponding to the two seats 54A and 54B is a partial sensing range corresponding to a lightweight occupant.
- If it is determined in step S123 that neither of the partial sensing ranges corresponding to the two seats 54A and 54B is a partial sensing range corresponding to a lightweight occupant (that is, when both of the seats 54A and 54B are in the seated state of a non-lightweight occupant), the process proceeds from step S123 to step S124.
- In step S124, the sensing setting unit 44 sets the detection density of the occupant observation sensor 62 to 1 time the reference detection density. Then, the processing of this flowchart ends.
- If it is determined in step S123 that either of the partial sensing ranges corresponding to the two seats 54A and 54B is a partial sensing range corresponding to a lightweight occupant (that is, when one seat is in the seated state of a lightweight occupant and the other seat is in the seated state of a non-lightweight occupant), the process proceeds from step S123 to step S125.
- In step S125, the sensing setting unit 44 sets the detection density of the occupant observation sensor 62 to approximately 1 time the reference detection density (more precisely, 10/9 times). Then, the processing of this flowchart ends.
- If it is determined in step S122 that both of the partial sensing ranges corresponding to the two seats 54A and 54B are partial sensing ranges corresponding to the seated state of a lightweight occupant, the process proceeds from step S122 to step S126.
- In step S126, the sensing setting unit 44 sets the detection density of the occupant observation sensor 62 to 1.3 times the reference detection density. Then, the processing of this flowchart ends.
- If it is determined in step S121 that the partial sensing ranges corresponding to both of the two seats 54A and 54B are not enabled, the process proceeds from step S121 to step S127.
- In step S127, the sensing setting unit 44 determines whether or not the partial sensing range corresponding to one of the seats 54A and 54B is enabled.
- If it is determined in step S127 that the partial sensing range corresponding to one of the seats 54A and 54B is enabled, the process proceeds from step S127 to step S128.
- In step S128, the sensing setting unit 44 determines whether or not the partial sensing range corresponding to the one seat 54A or 54B is a partial sensing range corresponding to a lightweight occupant.
- If it is determined in step S128 that the partial sensing range corresponding to the one seat 54A or 54B is not a partial sensing range corresponding to a lightweight occupant (that is, when one seat is in the seated state of a non-lightweight occupant and the other seat is in the non-seated state), the process proceeds from step S128 to step S129.
- In step S129, the sensing setting unit 44 sets the detection density of the occupant observation sensor 62 to twice the reference detection density. Then, the processing of this flowchart ends.
- If it is determined in step S128 that the partial sensing range corresponding to the one seat 54A or 54B is a partial sensing range corresponding to a lightweight occupant, the process proceeds from step S128 to step S130.
- In step S130, the sensing setting unit 44 sets the detection density of the occupant observation sensor 62 to 2.5 times the reference detection density. Then, the processing of this flowchart ends.
- If it is determined in step S127 that no partial sensing range corresponding to either of the seats 54A and 54B is enabled, all the partial sensing ranges are disabled and no observation information is detected by the occupant observation sensor 62. Therefore, the sensing setting unit 44 does not set the detection density or the detection frequency of the occupant observation sensor 62. Then, the processing of this flowchart ends.
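- The flow of FIG. 14 can be summarized as in the following sketch. This is a hedged Python illustration rather than code from the specification: the seat-state labels, the range identifiers, and the helper name are assumptions introduced only to make the branching of steps S111 to S130 concrete.

```python
# Hypothetical sketch of the FIG. 14 processing (second embodiment).
# Seat states are assumed to come from the seating sensors 91 and 92.

# Partial sensing ranges associated in advance with each seat state
# (stored, in the specification, in the storage unit 36).
PARTIAL_RANGES = {
    ("54A", "standard"):    "101-1",
    ("54B", "standard"):    "101-2",
    ("54A", "lightweight"): "101-3",
    ("54B", "lightweight"): "101-4",
}

def set_sensing(seat_states):
    """seat_states maps a seat name to 'standard', 'lightweight', or None (not seated).

    Returns the enabled partial sensing ranges and the detection density multiplier
    (or None), mirroring steps S111 to S130 of FIG. 14.
    """
    enabled = {}                                        # seat -> partial sensing range
    for seat in ("54A", "54B"):                         # S112-S119: loop over both seats
        state = seat_states.get(seat)
        if state is None:
            continue                                    # S117: all ranges of this seat disabled
        enabled[seat] = PARTIAL_RANGES[(seat, state)]   # S115 or S116

    # S120: the enabled partial ranges become the sensing range of the sensor 62.
    if len(enabled) == 2:                               # S121: both seats enabled
        lightweight = [s for s in enabled if seat_states[s] == "lightweight"]
        if len(lightweight) == 2:                       # S122 yes -> S126
            density = 1.3
        elif len(lightweight) == 1:                     # S123 yes -> S125
            density = 10 / 9                            # approximately 1 time
        else:                                           # S123 no -> S124
            density = 1.0
    elif len(enabled) == 1:                             # S127: only one seat enabled
        seat = next(iter(enabled))
        if seat_states[seat] == "lightweight":          # S128 yes -> S130
            density = 2.5
        else:                                           # S128 no -> S129
            density = 2.0
    else:                                               # S127 no: nothing to observe
        density = None                                  # density / frequency not set
    return enabled, density

# Example: seat 54A with a lightweight occupant, seat 54B unoccupied.
print(set_sensing({"54A": "lightweight", "54B": None}))  # ({'54A': '101-3'}, 2.5)
```
- The density values in the sketch follow the figures given above and reflect the idea that a smaller total sensing range permits a proportionally higher detection density within the processing capacity of the information processing device 21.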
- the sensing range of the occupant observation sensor 62 is appropriately set according to the position of the occupant existing in the vehicle and the height of the occupant. Further, the detection density of the occupant observation sensor 62 is appropriately set according to the sensing range of the occupant observation sensor 62 within a range not exceeding the processing capacity of the information processing device 21.
- Note that the sensing setting unit 44 may change the detection frequency of the occupant observation sensor 62 instead of the detection density according to the sensing range, or may change both the detection density and the detection frequency.
- Further, the vertical width of the sensing range may also be determined in consideration of, for example, the height of the occupant.
- In the third embodiment, the arrangement of the seats and the occupant observation sensors in the automobile is the same as in the case of FIG. 5 of the first embodiment.
- The third embodiment differs from the first embodiment in that a reclining sensor and a seatbelt sensor are used as forms of the seat-related sensor 42 of FIG. 2 in addition to the seating sensor.
- FIG. 15 is a diagram for explaining the sensing range of the occupant observation sensor 62 of FIG. 5 in the third embodiment.
- The seat 54B is the seat on the left side of the rear row seat portion 54 in FIG. 5. It is assumed that an occupant P4 is seated in the seat 54B, that the seat 54B is in the reclining state with the backrest tilted backward, and that the seat belt is not worn (non-wearing state).
- a seating sensor 92 is arranged on the seat 54B as in the first embodiment.
- a reclining sensor 93 is arranged on the seat 54B.
- a seatbelt sensor 94 is arranged at the seatbelt attachment / detachment portion of the seat 54B.
- The sensing setting unit 44 acquires sensor signals from each of the seating sensor 92, the reclining sensor 93, and the seatbelt sensor 94 as seat information from the seat-related sensor 42. As a result, the sensing setting unit 44 detects that the seat 54B is in the seated state, that the seat 54B is in the reclining state, and that the seat belt is not worn. It is assumed that the seat 54A of FIG. 5 (not shown in FIG. 15) is in the non-seated state.
- the sensing setting unit 44 enables the partial sensing range 101-5 associated with the detected state of the seat 54B.
- Since the seat 54A is in the non-seated state, all the partial sensing ranges associated with the seat 54A are disabled.
- the sensing setting unit 44 sets the valid partial sensing range 101-5 as the sensing range of the occupant observation sensor 62.
- the sensing range of the occupant observation sensor 62 is about half of the maximum sensing range.
- the sensing setting unit 44 sets, for example, the detection density of the occupant observation sensor 62 to twice the reference detection density.
- FIG. 16 is a diagram illustrating the sensing range of the occupant observation sensor 62 in a state where the seat belt is worn, in contrast to the state of FIG. 15.
- In this case, the sensing range can be limited to a height range near the seat surface as compared with the case of FIG. 15.
- the sensing setting unit 44 acquires sensor signals from each of the seating sensor 92, the reclining sensor 93, and the seatbelt sensor 94 as seat information from the seat-related sensor 42. As a result, the sensing setting unit 44 detects that the seat 54B is in the seated state, that the seat 54B is in the reclining state, and that the seat belt is in the wearing state.
- the sensing setting unit 44 enables the partial sensing range 101-6 associated with the detected state of FIG.
- the partial sensing range 101-6 is smaller than the partial sensing range 101-5 when the seatbelt is not fastened.
- As a result, the detection density or detection frequency of the occupant observation sensor 62 can be made higher than when the seat belt is not fastened, and it is possible to improve the detection accuracy of the occupant state by the occupant state detection unit 45 and the recognition accuracy of the operation by the operation recognition unit 46.
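- As a rough illustration of the third embodiment, the selection of a partial sensing range from the combination of seat states might look like the sketch below. The mapping table and function name are assumptions; only the range identifiers 101-5 and 101-6 and the states they correspond to are taken from the text.

```python
# Hypothetical sketch: the state detected from the seating sensor 92, reclining
# sensor 93, and seatbelt sensor 94 selects a pre-stored partial sensing range.
RANGE_FOR_STATE = {
    # (seated, reclining, seat belt worn) -> partial sensing range for seat 54B
    (True, True, False): "101-5",   # reclining, belt not worn (FIG. 15)
    (True, True, True):  "101-6",   # reclining, belt worn (FIG. 16): a smaller range
}

def select_range(seated: bool, reclining: bool, belt_worn: bool):
    if not seated:
        return None                 # all partial sensing ranges for this seat disabled
    return RANGE_FOR_STATE.get((seated, reclining, belt_worn))

print(select_range(True, True, False))   # -> '101-5'
print(select_range(True, True, True))    # -> '101-6'
```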
- FIG. 17 is a diagram illustrating a sensing range of the occupant observation sensors 61 and 62 in the fourth embodiment of the sensing setting process.
- The seat 53B is the seat on the left side of the front row seat portion 53 in FIG. 5. It is assumed that an occupant P5 is seated in the seat 53B, that the seat 53B is in the reclining state with the backrest tilted backward, and that the seat belt is not worn.
- a seating sensor 95 is arranged on the seat 53B as in the first embodiment.
- a reclining sensor 96 is arranged on the seat 53B.
- a seatbelt sensor 97 is arranged at the seatbelt attachment / detachment portion of the seat 53B.
- The sensing setting unit 44 acquires sensor signals from each of the seating sensor 95, the reclining sensor 96, and the seatbelt sensor 97 as seat information from the seat-related sensor 42. As a result, the sensing setting unit 44 detects that the seat 53B is in the seated state, that the seat 53B is in the reclining state, and that the seat belt is not worn. It is assumed that the seat 53A of FIG. 5 (not shown in FIG. 17) is in the non-seated state.
- the sensing setting unit 44 enables the partial sensing range 103 corresponding to the detected state of the seat 53B.
- Since the seat 53A is in the non-seated state, all the partial sensing ranges associated with the seat 53A are disabled.
- Here, the occupant P5 spans both the observation range of the occupant observation sensor 61 and the observation range of the occupant observation sensor 62. In this case, the whole of the occupant P5 cannot be observed by only one of the occupant observation sensor 61 and the occupant observation sensor 62.
- Therefore, the partial sensing range 103 corresponding to such a state of the seat 53B is set as a range included in the combination of the observation range of the occupant observation sensor 61 and the observation range of the occupant observation sensor 62. Of the partial sensing range 103, the portion included in the observation range of the occupant observation sensor 61 is separated as the partial sensing range 103A, and the portion included in the observation range of the occupant observation sensor 62 is separated as the partial sensing range 103B.
- The sensing setting unit 44 sets the separated partial sensing range 103A of the enabled partial sensing range 103 as the sensing range of the occupant observation sensor 61, and sets the other separated partial sensing range 103B of the enabled partial sensing range 103 as the sensing range of the occupant observation sensor 62.
- When the occupant state detection unit 45 acquires the observation information of the sensing ranges of the occupant observation sensor 61 and the occupant observation sensor 62, it acquires the observation information of the partial sensing range 103 by combining (integrating) the observation information of the partial sensing range 103A from the occupant observation sensor 61 with the observation information of the partial sensing range 103B from the occupant observation sensor 62.
- FIG. 18 is a diagram illustrating the observation information (depth images) acquired from each of the occupant observation sensor 61 and the occupant observation sensor 62 in the state of FIG. 17 when the occupant observation sensors 61 and 62 are used as depth sensors.
- the occupant image P5A is a depth image extracted from the partial sensing range 103A of the sensing range of the occupant observation sensor 61.
- the occupant image P5B is a depth image acquired from the partial sensing range 103B of the sensing range of the occupant observation sensor 62.
- the occupant state detection unit 45 acquires the occupant image P5A by extracting the depth image in the range corresponding to the partial sensing range 103A from the depth image acquired from the occupant observation sensor 61. Further, the occupant state detection unit 45 acquires the occupant image P5B by extracting the depth image in the range corresponding to the partial sensing range 103B from the depth image acquired from the occupant observation sensor 62. Then, the occupant state detection unit 45 acquires a depth image of the occupant P5 in the partial sensing range 103 of FIG. 17 by integrating (connecting) the occupant image P5A and the occupant image P5B extracted from each.
- In this way, by integrating the observation information obtained from each of the occupant observation sensors 61 and 62, observation information covering a wide range of the occupant can be obtained. Therefore, the occupant state detection unit 45 can appropriately detect the occupant state.
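- The combination (integration) of the two depth images described above could be sketched as follows, assuming that each depth sensor returns a full-frame image from which the region corresponding to its partial sensing range (103A or 103B) is cropped. The NumPy helpers and the pixel regions are illustrative assumptions.

```python
import numpy as np

def extract(depth_image: np.ndarray, region: tuple) -> np.ndarray:
    """Crop the rectangle (top, bottom, left, right) corresponding to a partial range."""
    top, bottom, left, right = region
    return depth_image[top:bottom, left:right]

def integrate(front_image: np.ndarray, rear_image: np.ndarray,
              region_103a: tuple, region_103b: tuple) -> np.ndarray:
    """Combine the crop from sensor 61 (range 103A) with the crop from sensor 62 (range 103B)."""
    p5a = extract(front_image, region_103a)   # corresponds to occupant image P5A
    p5b = extract(rear_image, region_103b)    # corresponds to occupant image P5B
    # Assume the two crops share the same width and adjoin front-to-rear, so that
    # stacking them vertically yields the depth image over the partial sensing range 103.
    return np.vstack([p5a, p5b])

# Toy example with random depth frames standing in for the two sensors' outputs.
front = np.random.rand(480, 640).astype(np.float32)   # from occupant observation sensor 61
rear = np.random.rand(480, 640).astype(np.float32)    # from occupant observation sensor 62
merged = integrate(front, rear, (300, 480, 200, 440), (0, 200, 200, 440))
print(merged.shape)   # (380, 240): stitched observation of occupant P5
```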
- FIG. 19 is a flowchart showing a processing example performed by the sensing setting unit 44 in the fourth embodiment of FIGS. 17 and 18.
- In step S151, the sensing setting unit 44 detects the state of each seat based on the seat information from the seat-related sensors 42 (the seating sensor 95, the reclining sensor 96, and the seatbelt sensor 97). The process proceeds from step S151 to step S152.
- In step S152, the sensing setting unit 44 sets the variable n representing the seat number to 1.
- the seat number of seat 53A is 1, and the seat number of seat 53B is 2.
- the process proceeds from step S152 to step S153.
- In step S153, the sensing setting unit 44 determines whether or not the seat n is in the seated state.
- the seat n represents a seat 53A when n is 1, and represents a seat 53B when n is 2.
- If it is determined in step S153 that the seat n is in the seated state, the process proceeds from step S153 to step S154, and the sensing setting unit 44 determines whether or not the seat n is in the reclining state.
- If it is determined in step S154 that the seat n is not in the reclining state, the process proceeds from step S154 to step S155, and the sensing setting unit 44 enables the partial sensing range corresponding to the seated, non-reclining state of the seat n. The process proceeds from step S155 to step S158.
- If it is determined in step S154 that the seat n is in the reclining state, the process proceeds from step S154 to step S156, and the sensing setting unit 44 enables the partial sensing range corresponding to the seated, reclining state of the seat n. The process proceeds from step S156 to step S158.
- If it is determined in step S153 that the seat n is not in the seated state, the process proceeds from step S153 to step S157, and the sensing setting unit 44 disables all the partial sensing ranges corresponding to the states of the seat n. The process proceeds from step S157 to step S158.
- In step S158, the sensing setting unit 44 determines whether or not the variable n is 2.
- If it is determined in step S158 that the variable n is not 2, the process proceeds to step S159, and the sensing setting unit 44 increments the value of the variable n. The process returns from step S159 to step S153.
- If it is determined in step S158 that the variable n is 2, the process proceeds from step S158 to step S160.
- In step S160, the sensing setting unit 44 sets the variable n to 1. The process proceeds from step S160 to step S161.
- In step S161, the sensing setting unit 44 determines whether or not the enabled partial sensing range corresponding to the seat n is a partial sensing range corresponding to the reclining state.
- If it is determined in step S161 that the enabled partial sensing range corresponding to the seat n is not a partial sensing range corresponding to the reclining state, the process proceeds from step S161 to step S162.
- In step S162, the sensing setting unit 44 sets the enabled partial sensing range as the sensing range of the occupant observation sensor that observes the occupant of the seat n (referred to here as the occupant observation sensor 61 of the seat n).
- the process proceeds from step S162 to step S165.
- If it is determined in step S161 that the enabled partial sensing range corresponding to the seat n is a partial sensing range corresponding to the reclining state, the process proceeds from step S161 to step S163.
- In step S163, of the enabled partial sensing range, the portion included in the observation range of the occupant observation sensor 61 of the seat n (the occupant observation sensor for observing the occupant of the seat n) is set as the sensing range of that occupant observation sensor 61.
- The process proceeds from step S163 to step S164, in which, of the enabled partial sensing range, the portion included in the observation range of the other occupant observation sensor (the partial sensing range 103B for the occupant observation sensor 62 in the example of FIG. 17) is set as the sensing range of that sensor. The process then proceeds to step S165.
- In step S165, the sensing setting unit 44 determines whether or not the variable n is 2.
- If it is determined in step S165 that the variable n is not 2, the process proceeds from step S165 to step S166, and the sensing setting unit 44 increments the value of the variable n and returns from step S166 to step S161.
- If it is determined in step S165 that the variable n is 2, the processing of this flowchart ends.
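- A hedged sketch of the FIG. 19 flow (steps S151 to S166) is given below. The data structures describing the partial sensing ranges and the sensor assignment are assumptions used only to make the per-seat branching concrete.

```python
# Hypothetical sketch of the FIG. 19 processing (fourth embodiment): a partial
# sensing range is enabled per seat according to its seated / reclining state, and
# a reclining range that spans two occupant observation sensors is split between them.

def configure(seat_states, ranges):
    """seat_states: seat name -> {'seated': bool, 'reclining': bool}.
    ranges: (seat, reclining?) -> description of the range and its per-sensor parts.
    Returns: sensor name -> list of sensing ranges assigned to it."""
    enabled = {}                                         # S153-S157 for each seat
    for seat, st in seat_states.items():
        if not st["seated"]:
            continue                                     # S157: ranges of this seat disabled
        enabled[seat] = ranges[(seat, st["reclining"])]  # S155 or S156

    assignment = {}                                      # S161-S164 for each seat
    for seat, rng in enabled.items():
        if not seat_states[seat]["reclining"]:
            # S162: the whole range goes to the sensor in charge of this seat.
            assignment.setdefault(rng["sensor"], []).append(rng["range"])
        else:
            # S163 / S164: the reclining range is split between the sensors whose
            # observation ranges it straddles (e.g. 103A for 61 and 103B for 62).
            for sensor, part in rng["parts"].items():
                assignment.setdefault(sensor, []).append(part)
    return assignment

ranges = {
    ("53A", False): {"sensor": "61", "range": "53A-normal"},
    ("53A", True):  {"parts": {"61": "53A-reclined-front", "62": "53A-reclined-rear"}},
    ("53B", False): {"sensor": "61", "range": "53B-normal"},
    ("53B", True):  {"parts": {"61": "103A", "62": "103B"}},
}
states = {"53A": {"seated": False, "reclining": False},
          "53B": {"seated": True, "reclining": True}}
print(configure(states, ranges))   # {'61': ['103A'], '62': ['103B']}
```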
- As described above, in the fourth embodiment, by integrating the observation information obtained from each of the occupant observation sensors 61 and 62, observation information covering a wide range of the occupant can be obtained, and the occupant state detection unit 45 can appropriately detect the occupant state.
- FIG. 20 is a diagram illustrating the arrangement of the occupant observation sensor in the fifth embodiment of the sensing setting process.
- the interior 152 of the automobile 151 is provided with three rows of front row seats 153, middle row seats 154, and rear row seats 155.
- a driver's seat 153A and a passenger seat 153B are arranged in the front row seat portion 153.
- Seats 154A and 154B are arranged in the middle row seat portion 154.
- Seats 155A and 155B are arranged in the back row seat portion 155.
- The seats 153A to 155A and 153B to 155B are each provided with seating sensors 191A to 193A and 191B to 193B, respectively, for detecting the presence or absence of seating.
- The seating sensors 191A to 193A and 191B to 193B are one form of the seat-related sensor 42 of FIG. 2.
- Further, the seats 153A to 155A and 153B to 155B are provided with seat position sensors 194A to 196A and 194B to 196B, respectively, which detect the position of the seat with respect to sliding movement in the front-rear or left-right direction and the orientation of the seat.
- The seat position sensors 194A to 196A and 194B to 196B are also one form of the seat-related sensor 42 of FIG. 2.
- the occupant observation sensors 161 to 163, which are the occupant observation sensors 43 of FIG. 2, are arranged on the ceiling inside the vehicle.
- the occupant observation sensors 161 to 163 are arranged near the center of the width of the ceiling, and are arranged at positions closer to the front side than directly above the front row seat portion 153, the middle row seat portion 154, and the rear row seat portion 155, respectively.
- the occupant observation sensors 161 to 163 are a form of the occupant observation sensor 43 of FIG.
- the occupant observation sensor 161 observes the occupants of the two seats 153A and 153B of the front row seat portion 153.
- the occupant observation sensor 162 observes the occupants of the two seats 154A and 154B of the middle row seat portion 154.
- the occupant observation sensor 163 observes the occupants of the seats 155A and 155B of the rear row seat portion 155.
- FIG. 21 is a diagram showing the sensing ranges of the occupant observation sensors 161 to 163 of FIG. 20.
- the sensing ranges 181, 182, and 183 indicate the maximum sensing ranges of the occupant observation sensors 161, 162, and 163, respectively.
- The partial sensing ranges 181-1, 182-1, and 183-1 within the maximum sensing ranges 181, 182, and 183 are the partial sensing ranges that become enabled when the seated state is detected by the seating sensors 191A to 193A of the right-side seats 153A to 155A, respectively.
- The partial sensing ranges 181-2, 182-2, and 183-2 within the maximum sensing ranges 181, 182, and 183 are the partial sensing ranges that become enabled when the seated state is detected by the seating sensors 191B to 193B of the left-side seats 153B to 155B, respectively.
- The sensing setting unit 44 detects whether or not each of the seats 153A to 155A and 153B to 155B is seated (the seated state or the non-seated state) by means of the seating sensors 191A to 193A and 191B to 193B of the seats 153A to 155A and 153B to 155B.
- The sensing setting unit 44 also detects the positions of the seats 153A to 155A and 153B to 155B by means of the seat position sensors 194A to 196A and 194B to 196B of the seats 153A to 155A and 153B to 155B.
- Based on the detected states (the presence or absence of seating and the position of each seat), the sensing setting unit 44 enables partial sensing ranges such that the occupants present in the vehicle can be observed while ranges in which no occupant exists are excluded from the sensing range as much as possible. Then, the sensing setting unit 44 sets the enabled partial sensing ranges as the sensing ranges of the occupant observation sensors 161, 162, and 163.
- the seats 153A to 155A and 153B to 155B may be configured to be movable. In this case, the partial sensing range corresponding to the seat position in the seated state is moved according to the seat position.
- In this case, each occupant can be observed within the observation range of one of the occupant observation sensors 161, 162, and 163.
- However, when the seats 153A to 155A and 153B to 155B are largely displaced from their standard positions, it may not be possible to observe an occupant within the observation range of a single occupant observation sensor. Therefore, it may be necessary to acquire the observation information of one occupant by dividing the observation among a plurality of occupant observation sensors, as sketched below.
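- One way to picture this handling of displaced seats is the sketch below, in which partial sensing ranges are modeled as one-dimensional intervals along the front-rear axis of the vehicle and split among the sensors whose coverage they overlap. The coordinates and sensor coverages are illustrative assumptions, not values from the specification.

```python
# Assumed coverage of the ceiling-mounted occupant observation sensors 161 to 163
# along the front-rear axis (illustrative coordinates).
SENSOR_COVERAGE = {"161": (0.0, 2.0), "162": (1.5, 3.5), "163": (3.0, 5.0)}

def shift_range(base_range, seat_offset):
    """Move a partial sensing range by the seat's displacement from its standard position."""
    lo, hi = base_range
    return (lo + seat_offset, hi + seat_offset)

def assign_to_sensors(partial_range):
    """Split a partial sensing range among the sensors whose coverage overlaps it."""
    lo, hi = partial_range
    parts = {}
    for sensor, (c_lo, c_hi) in SENSOR_COVERAGE.items():
        o_lo, o_hi = max(lo, c_lo), min(hi, c_hi)
        if o_lo < o_hi:
            parts[sensor] = (o_lo, o_hi)
    return parts

# Seat 154B displaced 1.0 rearward: its range (like the partial sensing range 186)
# now straddles sensors 162 and 163, so the observation is divided and later integrated.
print(assign_to_sensors(shift_range((1.8, 3.2), 1.0)))
# -> {'162': (2.8, 3.5), '163': (3.0, 4.2)}
```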
- FIG. 22 is a diagram showing a sensing range when the seat is largely displaced in FIG. 21.
- the seat 153B and the seat 154B are largely displaced rearward from the standard position.
- the seat 153B is out of the maximum sensing range 181 of the occupant observation sensor 161 and the seat 154B is partially out of the maximum sensing range 182 of the occupant observation sensor 162.
- Since the occupant observation sensor 161 cannot observe the occupant of the seat 153B, the occupant observation sensor 162 observes the occupant of the seat 153B.
- Since the occupant observation sensor 162 can only partially observe the occupant of the seat 154B, the occupant observation sensor 162 and the occupant observation sensor 163 each acquire observation information of the occupant of the seat 154B, and the acquired information is integrated.
- Specifically, the sensing setting unit 44 detects whether or not each of the seats 153A to 155A and 153B to 155B is seated (the seated state or the non-seated state) by means of the seating sensors 191A to 193A and 191B to 193B of the seats 153A to 155A and 153B to 155B.
- The sensing setting unit 44 also detects the positions of the seats 153A to 155A and 153B to 155B by means of the seat position sensors 194A to 196A and 194B to 196B of the seats 153A to 155A and 153B to 155B.
- Based on the detected states (the presence or absence of seating and the position of each seat), the sensing setting unit 44 enables partial sensing ranges such that the occupants present in the vehicle can be observed while ranges in which no occupant exists are excluded from the sensing range as much as possible.
- The sensing setting unit 44 may generate the partial sensing range corresponding to the position of the seat 153B based on that detected position, or may read out the corresponding data from the data of partial sensing ranges stored in advance in the storage unit 36 in association with each position to which the seat can be displaced.
- the partial sensing range is set for the seat 154B as well as for the seat 153B.
- the partial sensing range 186 is the partial sensing range for the seat 154B.
- the sensing setting unit 44 sets each of the effective partial sensing ranges as the sensing ranges of the occupant observation sensors 161 and 162, and 163.
- the partial sensing range 186 with respect to the seat 154B is included in the observation range of the occupant observation sensor 162 and the observation range of the occupant observation sensor 163.
- The sensing setting unit 44 sets the range 186A of the partial sensing range 186 that is included in the observation range of the occupant observation sensor 162 as the sensing range of the occupant observation sensor 162, and sets the range 186B of the partial sensing range 186 that is included in the observation range of the occupant observation sensor 163 as the sensing range of the occupant observation sensor 163.
- When the occupant state detection unit 45 acquires the observation information from the occupant observation sensors 161, 162, and 163, it obtains the observation information of the partial sensing range 186 by integrating the observation information from the occupant observation sensor 162 with the observation information from the occupant observation sensor 163.
- The sensing setting unit 44 sets the detection densities and detection frequencies of the occupant observation sensors 161, 162, and 163 based on the areas of the sensing ranges of the occupant observation sensors 161, 162, and 163.
- the sensing range of the occupant observation sensor 161 can be limited to the occupant range of the seat 153A. Therefore, the detection density can be made higher than that of the other occupant observation sensors 162 and 163, and the state of one occupant can be detected with high accuracy.
- In this way, by integrating the observation information obtained from each of the plurality of occupant observation sensors, observation information covering a wide range of the occupant can be obtained. Therefore, the occupant state detection unit 45 can appropriately detect the occupant state.
- FIG. 23 is a diagram illustrating the arrangement of the occupant observation sensor in the sixth embodiment of the sensing setting process.
- The automobile 151 of FIG. 23 has the front row seat portion 153, the middle row seat portion 154, the rear row seat portion 155, the seats 153A to 155A and 153B to 155B, the occupant observation sensors 161 to 163, the seating sensors 191A to 193A and 191B to 193B, and the seat position sensors 194A to 196A and 194B to 196B in common with the case of FIG. 21.
- The automobile 151 of FIG. 23 differs from the case of FIG. 21 in that the occupant observation sensors 211 and 212 are newly provided.
- the occupant observation sensors 211 and 212 in FIG. 23 are the occupant observation sensors 43 in FIG. 2, and are arranged near the center of the width of the ceiling in the vehicle.
- the occupant observation sensor 211 is arranged between the occupant observation sensor 161 and the occupant observation sensor 162, and is arranged above the seats between the front row seat portion 153 and the middle row seat portion 154.
- the occupant observation sensor 212 is arranged between the occupant observation sensor 162 and the occupant observation sensor 163, and is arranged above the seats between the middle row seat portion 154 and the rear row seat portion 155.
- FIG. 24 is a diagram showing the sensing ranges of the occupant observation sensors 161 to 163, 211, and 212 of FIG. 23.
- the sensing ranges 181 to 183 represent the sensing ranges (maximum sensing range) of the occupant observation sensors 161 to 163, respectively.
- the sensing ranges 231 and 232 represent the maximum sensing ranges of the occupant observation sensors 211 and 212, respectively.
- When the seats 153A to 155A and 153B to 155B are arranged at their standard positions as shown in FIG. 24, the occupants seated in the seats 153A to 155A and 153B to 155B are observed within the sensing range of one of the occupant observation sensors 161 to 163. Therefore, in this case, the occupant observation sensors 211 and 212 need not be used.
- On the other hand, when a seat among the seats 153A to 155A and 153B to 155B is displaced from its standard position and is located between the front row seat portion 153 and the middle row seat portion 154, or between the middle row seat portion 154 and the rear row seat portion 155, the occupant of that seat is observed by the occupant observation sensor 211 or 212.
- FIG. 25 is a diagram showing a sensing range when the seat is largely displaced from the standard position with respect to FIG. 24.
- the seat 153B and the seat 154B are largely displaced rearward from the standard position.
- seat 153B is located between the standard position of the front row seats 153 and the standard position of the middle row seats 154.
- Seat 154B is located between the standard position of the middle row seats 154 and the standard position of the rear row seats 155.
- the sensing setting unit 44 changes the occupant observation sensor for observing the occupants seated in the seats 153A to 155A and 153B to 155B by the following processing.
- The sensing setting unit 44 detects the positions of the seats 153A to 155A and 153B to 155B based on the seat information from the seat position sensors 194A to 196A and 194B to 196B of the seats 153A to 155A and 153B to 155B.
- Based on the detected positions of the seats 153A to 155A and 153B to 155B, the sensing setting unit 44 detects whether or not any seat is arranged between the standard position of the front row seat portion 153 and the standard position of the middle row seat portion 154 (between the seat rows), or between the standard position of the middle row seat portion 154 and the standard position of the rear row seat portion 155 (between the seat rows).
- The sensing setting unit 44 causes the occupant of a seat arranged between the seat rows to be observed by the occupant observation sensor 211 or 212, and causes the occupants of the other seats to be observed by the occupant observation sensor 161, 162, or 163. Then, the sensing setting unit 44 sets the sensing ranges of the occupant observation sensors 161 to 163, 211, and 212 based on the presence or absence of seating and the position of each of the seats 153A to 155A and 153B to 155B.
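- The assignment of each seat to one of the occupant observation sensors 161 to 163, 211, and 212 according to its detected position could be sketched as follows. The row coordinates, the tolerance, and the function name are illustrative assumptions.

```python
# Hypothetical sketch for the sixth embodiment: a seat near a row's standard position
# is observed by that row's sensor (161 to 163); a seat between two rows is observed
# by the intermediate sensor 211 or 212.
ROW_STANDARD_X = {"front": 1.0, "middle": 2.5, "rear": 4.0}     # front-rear coordinates
ROW_SENSOR = {"front": "161", "middle": "162", "rear": "163"}
BETWEEN_SENSOR = {("front", "middle"): "211", ("middle", "rear"): "212"}

def sensor_for_seat(seat_x: float, tolerance: float = 0.3) -> str:
    """Pick the occupant observation sensor in charge of a seat at position seat_x."""
    for row, x in ROW_STANDARD_X.items():
        if abs(seat_x - x) <= tolerance:
            return ROW_SENSOR[row]                      # near a standard row position
    if ROW_STANDARD_X["front"] < seat_x < ROW_STANDARD_X["middle"]:
        return BETWEEN_SENSOR[("front", "middle")]      # between front and middle rows
    if ROW_STANDARD_X["middle"] < seat_x < ROW_STANDARD_X["rear"]:
        return BETWEEN_SENSOR[("middle", "rear")]       # between middle and rear rows
    return ROW_SENSOR["rear"]                           # fallback behind the rear row

print(sensor_for_seat(1.0))   # seat at the front standard position -> '161'
print(sensor_for_seat(1.8))   # seat 153B slid rearward between rows -> '211'
print(sensor_for_seat(3.2))   # seat 154B slid rearward between rows -> '212'
```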
- the seats 153A to 155A are observed by the sensing ranges 181-1, 182-1 and 183-1 of the occupant observation sensors 161 to 163, respectively.
- The seats 153B and 154B are observed within the sensing ranges 231-1 and 232-1 of the occupant observation sensors 211 and 212, respectively.
- For any other seat (a seat in the non-seated state), no sensing range is set and observation by the occupant observation sensor is not performed.
- In the sixth embodiment, the number of occupant observation sensors is increased as compared with the fifth embodiment, but the occupant state detection unit 45 can detect the occupant state and the operation recognition unit 46 can recognize the operation with high accuracy without significantly increasing the processing amount.
- <Example 7> FIGS. 26 and 27 are diagrams illustrating the arrangement of the occupant observation sensors in the seventh embodiment of the sensing setting process.
- In FIGS. 26 and 27, the parts corresponding to those of FIG. 21 are designated by the same reference numerals, and the description thereof will be omitted.
- In the seventh embodiment, the occupant observation sensors 251, 252, and 253 of FIG. 27 are added to the configuration of the fifth embodiment.
- The occupant observation sensors 251, 252, and 253 are arranged at positions in front of the front row seat portion 153, the middle row seat portion 154, and the rear row seat portion 155, respectively, and observe the occupants seated in the front row seat portion 153, the middle row seat portion 154, and the rear row seat portion 155 from the front side in a direction close to the lateral (horizontal) direction.
- The occupant observation sensors 251, 252, and 253 have higher performance than the occupant observation sensors 161, 162, and 163, and their detection density and detection frequency can be set higher than those of the occupant observation sensors 161, 162, and 163.
- the occupant observation sensor 251 is installed at a position in front of the front row seat portion 153 near the center of the width, for example, a dashboard or a windshield peripheral portion.
- the maximum sensing range 271 of the occupant observation sensor 251 includes the range of the occupants seated in the seats 153A and 153B of the front row seat portion 153, and observes the occupants seated in the seats 153A and 153B from the front side thereof.
- The sensing setting unit 44 detects whether each of the seats 153A and 153B is in the seated state or the non-seated state by means of the seating sensors 191A and 191B arranged in the seats 153A and 153B, respectively.
- When only the seat 153A is in the seated state, the sensing range of the occupant observation sensor 251 is limited to the sensing range 271-1.
- When only the seat 153B is in the seated state, the sensing range of the occupant observation sensor 251 is limited to the sensing range 271-2.
- the occupant observation sensor 252 is installed, for example, on the back portion of the front row seat portion 153 near the center of the width.
- the maximum sensing range 272 of the occupant observation sensor 252 includes the range of the occupants seated in the seats 154A and 154B of the middle row seat portion 154, and observes the occupants seated in the seats 154A and 154B from the front side thereof.
- The sensing setting unit 44 detects whether each of the seats 154A and 154B is in the seated state or the non-seated state by means of the seating sensors 192A and 192B arranged in the seats 154A and 154B, respectively.
- When only the seat 154A is in the seated state, the sensing range of the occupant observation sensor 252 is limited to the sensing range 272-1.
- When only the seat 154B is in the seated state, the sensing range of the occupant observation sensor 252 is limited to the sensing range 272-2.
- the occupant observation sensor 253 is installed, for example, on the back portion of the middle row seat portion 154 near the center of the width.
- the maximum sensing range 273 of the occupant observation sensor 253 includes the range of the occupants seated in the seats 155A and 155B of the rear row seat portion 155, and observes the occupants seated in the seats 155A and 155B from the front side thereof.
- The sensing setting unit 44 detects whether each of the seats 155A and 155B is in the seated state or the non-seated state by means of the seating sensors 193A and 193B arranged in the seats 155A and 155B, respectively.
- When only the seat 155A is in the seated state, the sensing range of the occupant observation sensor 253 is limited to the sensing range 273-1.
- When only the seat 155B is in the seated state, the sensing range of the occupant observation sensor 253 is limited to the sensing range 273-2.
- the sensing setting unit 44 may allow the occupant observation sensors 251 to 253, which have higher performance than the occupant observation sensors 161 to 163, to observe the occupant.
- Note that the occupant observation sensors 161 to 163 may have higher performance than the occupant observation sensors 251 to 253, and the performance of the occupant observation sensors 161 to 163 and the performance of the occupant observation sensors 251 to 253 need not be the same.
- When a plurality of occupant observation sensors can observe the same occupant, the sensing setting unit 44 may use the sensor with the higher performance for observing the occupant, or may preferentially use an occupant observation sensor determined in advance.
- FIG. 28 is a diagram illustrating a case where the seat arrangement is changed with respect to FIGS. 26 and 27.
- the occupant observation sensor 252 of FIG. 27 cannot be used for occupant observation because it is shielded by the backrests of the seats 154A and 154B.
- Since the occupant observation sensor 253 of FIG. 27 is installed on the back surface of the middle row seat portion 154 (the back surfaces of the seats 154A and 154B, etc.), changing the orientation of the seats 154A and 154B causes its observation direction to face the back surface of the front row seat portion 153. Therefore, the occupant observation sensor 253 also cannot be used for observing the occupants.
- Therefore, in the seat arrangement of FIG. 28, the occupant observation sensor 251 is used for observing the occupants of the front row seat portion 153, and the occupant observation sensors 162 and 163 are used for observing the occupants of the middle row seat portion 154 and the rear row seat portion 155, respectively.
- The sensing setting unit 44 detects the seat arrangement from the sensor signals of the seat position sensors 194A to 196A and 194B to 196B installed in the seats 153A to 155A and 153B to 155B, respectively. Then, the sensing setting unit 44 grasps which occupant observation sensors can be used effectively based on the seat arrangement, and decides, from among the occupant observation sensors that can be used effectively, the occupant observation sensor to be used for observing the occupant of each seat. When a plurality of occupant observation sensors can be used for one seat, the sensing setting unit 44 may decide the occupant observation sensor to be used for occupant observation by giving priority to performance as described above, or may use a predetermined occupant observation sensor for observation.
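- The selection of an occupant observation sensor for each seat under a given seat arrangement might be sketched as follows. The candidate lists, the performance scores, and the function name are illustrative assumptions; only the sensor numbers are taken from the text.

```python
# Hypothetical sketch for the seventh embodiment: among the sensors that are usable
# under the current seat arrangement, pick one sensor per seat, giving priority to
# the higher-performance front-facing sensors 251 to 253.
PERFORMANCE = {"251": 3, "252": 3, "253": 3, "161": 1, "162": 1, "163": 1}

def choose_sensors(candidates_per_seat, usable):
    """candidates_per_seat: seat -> sensors that can observe it.
    usable: the sensors not blocked by the current seat arrangement."""
    chosen = {}
    for seat, candidates in candidates_per_seat.items():
        available = [s for s in candidates if s in usable]
        if available:
            # Prefer the highest-performance sensor among the available ones.
            chosen[seat] = max(available, key=PERFORMANCE.__getitem__)
    return chosen

# Seat arrangement of FIG. 28: sensors 252 and 253 are blocked by seat backs.
usable = {"251", "161", "162", "163"}
candidates = {
    "front row":  ["251", "161"],
    "middle row": ["252", "162"],
    "rear row":   ["253", "163"],
}
print(choose_sensors(candidates, usable))
# -> {'front row': '251', 'middle row': '162', 'rear row': '163'}
```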
- the occupant state detection unit 45 can detect the occupant state and the operation recognition unit 46 can recognize the operation with high accuracy without limiting the arrangement of the seats.
- FIG. 29 is a diagram illustrating the arrangement of the occupant observation sensor in the eighth embodiment of the sensing setting.
- a front row seat portion 303, a middle row seat portion 304, and a rear row seat portion 305 are arranged in the vehicle interior 302 of the automobile 301.
- Windows 311 to 313 are provided on the side surface of the automobile 301, and touch panels 331 to 333 are installed on the windows 311 to 313.
- a transparent image display may be provided on each of the windows 311 to 313, or image information or text information may be projected by a projector.
- touch panels 331 to 333 are a form of the occupant observation sensor 43 of FIG. 2 for observing the occupant performing the touch operation.
- FIG. 30 is a diagram illustrating a case where the seat arrangement is changed with respect to FIG. 29.
- In FIG. 30, the arrangement of the seats has been changed so that a meeting can be held.
- the seat 304A has been changed to a position and orientation with the window 312 behind.
- The sensing setting unit 44 detects the position of each seat from the sensor signal of the seat position sensor provided in each seat. Then, when a seat is arranged at a position and orientation such that the backrest (seat back) of the seat is close to any one of the touch panels 331 to 333, sensing by the touch panel to which the backrest is close is stopped.
- As a result, the processing load is reduced and erroneous operations are prevented.
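- A minimal sketch of this suppression of touch-panel sensing is shown below, assuming that seat positions, seat orientations, and panel positions are available as two-dimensional coordinates. The geometry test, the distance threshold, and the names are illustrative assumptions.

```python
import math

# Assumed positions of the touch panels 331 to 333 on the side windows.
PANEL_POSITIONS = {"331": (0.0, 1.0), "332": (0.0, 2.5), "333": (0.0, 4.0)}

def backrest_blocks_panel(seat_pos, seat_heading_deg, panel_pos, max_dist=0.5) -> bool:
    """True when the seat's backrest is near the panel and faces away from it."""
    dx, dy = panel_pos[0] - seat_pos[0], panel_pos[1] - seat_pos[1]
    dist = math.hypot(dx, dy)
    # The backrest is on the side opposite to the seat's heading direction.
    facing_away = (math.cos(math.radians(seat_heading_deg)) * dx +
                   math.sin(math.radians(seat_heading_deg)) * dy) < 0
    return dist <= max_dist and facing_away

def enabled_panels(seats):
    """seats: list of (position, heading in degrees). Returns the panels kept active."""
    active = []
    for panel, pos in PANEL_POSITIONS.items():
        if not any(backrest_blocks_panel(p, h, pos) for p, h in seats):
            active.append(panel)
    return active

# Seat 304A moved so that its backrest is against the window 312: panel 332 is stopped.
print(enabled_panels([((0.3, 2.5), 0.0)]))   # -> ['331', '333']
```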
- FIG. 31 is a diagram illustrating the arrangement of the occupant observation sensor in the ninth embodiment of the sensing setting.
- The driver's seat 352 of the automobile 351 is provided with a steering wheel (handle) 353, and the steering wheel 353 is provided with a steering wheel (handle) sensor 361.
- a touch panel 371 is provided on the window 362 on the side surface of the driver's seat 352.
- The occupant P6, who is the driver, is holding the steering wheel 353.
- the window 362 may be provided with a transparent image display, or the projector may project image information or text information.
- the touch panel 371 is a form of the occupant observation sensor 43 of FIG. 2 for observing the occupant performing the touch operation.
- The handle sensor 361 is a sensor that detects whether or not the occupant P6 seated in the driver's seat is holding the handle 353, and is one form of the sensors included in the seat-related sensor 42 of FIG. 2 that detect the state of the seat.
- the sensing setting unit 44 detects whether or not the handle 353 is gripped by the sensor signal from the handle sensor 361. As a result, when the handle 353 is gripped, the sensing of the touch panel 371 is invalidated.
- According to the ninth embodiment, it is possible to prevent the driver from being distracted by performing a touch operation on the touch panel while driving.
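- The gating of the touch panel 371 by the handle sensor 361 could be sketched as follows. The class and method names are assumptions; only the behavior (touch sensing is invalidated while the handle 353 is gripped) comes from the text.

```python
class TouchPanelGate:
    """Hypothetical sketch of the ninth embodiment for the driver's side window 362."""

    def __init__(self):
        self.touch_panel_enabled = True

    def on_handle_sensor(self, handle_gripped: bool) -> None:
        """Called with the latest reading of the handle sensor 361."""
        # While the occupant P6 grips the handle 353, sensing by the touch panel 371
        # is invalidated so that the driver is not distracted while driving.
        self.touch_panel_enabled = not handle_gripped

    def accept_touch(self, event) -> bool:
        """Forward a touch event only while sensing by the touch panel is enabled."""
        return self.touch_panel_enabled

gate = TouchPanelGate()
gate.on_handle_sensor(handle_gripped=True)
print(gate.accept_touch({"x": 10, "y": 20}))   # -> False (driver holds the wheel)
gate.on_handle_sensor(handle_gripped=False)
print(gate.accept_touch({"x": 10, "y": 20}))   # -> True
```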
- This technology can be applied not only to automobiles but to any facility that is equipped with predetermined equipment such as seats and in which the presence or absence and the position of the user can be estimated from the use of the equipment.
- the equipment used by the user is not limited to the seat, but may be a floor surface or stairs on which the user can stand.
- the present technology can also have the following configurations.
- (1) An information processing device having a processing unit that sets a sensing range of an observation sensor that observes a user who uses equipment, based on the state of the equipment installed in a vehicle.
- The information processing device according to (1), wherein the state of the equipment includes a state of whether or not the equipment is used.
- (6) The information processing device according to any one of (1) to (5), wherein the state of the equipment includes the position of the equipment.
- (7) The information processing device according to any one of (1) to (6), wherein the observation sensor includes a sensor that acquires spatial information of the user.
- (8) The information processing device according to any one of (1) to (7), wherein the observation sensor includes a sensor that acquires physical information of the user.
- (9) The information processing device according to any one of (1) to (8), wherein the observation sensor includes an image sensor or a depth sensor.
- (10) The information processing device according to any one of (1) to (9), wherein the processing unit sets a sensing range previously associated with the state of the equipment as the sensing range of the observation sensor.
- (11) The information processing device according to any one of (1) to (10), wherein the processing unit sets the detection density or detection frequency of the observation sensor according to the sensing range.
- (12) The information processing device according to (11), wherein the processing unit sets the detection density or the detection frequency based on the amount of information acquired from the sensing range of the observation sensor.
- (13) The information processing device according to (1), wherein the processing unit detects the state of the user based on the observation information from the observation sensor.
- (14) The information processing device according to (13), wherein the processing unit recognizes an operation corresponding to the state of the user.
- (15) An information processing method in which the processing unit of an information processing device having the processing unit sets the sensing range of an observation sensor that observes a user who uses equipment, based on the state of the equipment.
- (16) A program for causing a computer to function as a processing unit that sets the sensing range of an observation sensor that observes a user who uses equipment, based on the state of the equipment.
- 11 information processing system 41 processing unit, 42 seat-related sensor, 43 occupant observation sensor, 44 sensing setting unit, 45 occupant status detection unit, 46 operation recognition unit, 47 operation response unit
Abstract
The present technology relates to an information processing device, an information processing method, and a program with which it is possible to reduce a processing burden due to unnecessary information. On the basis of the state of equipment provided inside a vehicle, a sensing range of an observation sensor that observes a user using the equipment is set. The present technology is applicable to automobiles, trains, restaurants, theaters, etc.
Description
This technology relates to information processing devices, information processing methods, and programs, and in particular, to information processing devices, information processing methods, and programs designed to reduce the processing burden of unnecessary information.
Patent Document 1 discloses a technique of detecting the number of passengers based on an image of the interior of a vehicle taken by a camera and optimizing a method of editing an image of the interior of the vehicle according to the number of passengers. Patent Document 2 discloses a technique for presenting information according to the number of people in a vehicle interior and the like.
When analyzing the information acquired by the sensor, if the amount of information is large, the processing load will increase. It is desirable to reduce unnecessary information as much as possible.
This technology was made in view of such a situation, and makes it possible to reduce the processing burden of unnecessary information.
The information processing device or program of the present technology is an information processing device having a processing unit that sets a sensing range of an observation sensor that observes a user who uses the equipment based on the state of equipment provided in the vehicle. Or, it is a program for operating a computer as such an information processing device.
In the information processing method of the present technology, the processing unit of the information processing device having the processing unit sets the sensing range of the observation sensor for observing the user who uses the equipment based on the state of the equipment provided in the vehicle. This is the information processing method to be set.
In this technology, the sensing range of the observation sensor that observes the user who uses the equipment is set based on the state of the equipment installed in the vehicle.
Hereinafter, embodiments of the present technology will be described with reference to the drawings.
<< An embodiment of an information processing system to which this technology is applied >>
In FIG. 1, the information processing system 11 detects, for example, the state of an occupant in a vehicle, and performs processing or control related to various devices or equipment of the vehicle in response to the state of the occupant.
In FIG. 1, the information processing system 11 includes an information processing device 21, an in-vehicle system 22, and a sensor 23.
The information processing device 21 mainly detects the state of the occupant and executes processing (response processing) corresponding to the state of the occupant. The response process includes a process of instructing a part of the control unit of the vehicle-mounted system 22 mounted on the automobile to execute a predetermined process or control.
Note that the information processing device 21 of FIG. 1 is exemplified as a configuration including a function of executing a predetermined application such as moving image playback designated by an occupant and displaying the presentation information of the application on a display or the like in the vehicle. Therefore, the response processing performed by the information processing apparatus 21 includes the response processing of the application executed by the information processing apparatus 21.
The in-vehicle system 22 includes a control system, a body system, and an information system control unit (ECU: Electronic Control Unit) included in the automobile, and is a system in which each control unit is communicably connected via an in-vehicle network. The control system performs engine control, steering control, brake control, and the like. The body system performs seat control, door control, mirror control, air conditioner control, and the like. The information system is composed of an audio system, a navigation system, a back monitor, and the like. A part of the control unit of the in-vehicle system 22 has a function of executing a predetermined control according to an instruction from the information processing device 21.
The sensor 23 is a sensor installed in an automobile for the purpose of being used by the in-vehicle system 22. The sensor 23 represents an arbitrary sensor used in the in-vehicle system 22, and is not limited to a specific type and a specific number.
The information processing device 21 includes a CPU (Central Processing Unit) 31, a RAM (Random Access Memory) 32, a ROM (Read Only Memory) 33, an input unit 34, an output unit 35, a storage unit 36, a sensor 37, and a communication unit 38. The CPU 31, the RAM 32, and the ROM 33 are connected to each other by a bus 39.
An input / output interface 40 is further connected to the bus 39. An input unit 34, an output unit 35, a storage unit 36, a sensor 37, and a communication unit 38 are connected to the input / output interface 40.
The CPU 31 loads and executes, for example, the program stored in the ROM 33 or the storage unit 36 into the RAM 32 via the bus 39 or via the input / output interface 40 and the bus 39. By executing the program, the CPU 31 executes the sensing setting process, the occupant state detection process, the operation recognition process, the operation response process, and the like, which will be described later.
RAM 32 is a volatile memory and temporarily stores data such as programs processed by CPU 31.
The ROM 33 is a non-volatile memory, and stores data such as a program loaded in the RAM 32 and executed by the CPU 31.
The input unit 34 includes a switch, a microphone, and the like. The data input from the input unit 34 is supplied to the CPU 31 via the input / output interface 40 and the bus 39.
The output unit 35 includes a projector, a display, a speaker, and the like. The projector is installed in the vehicle and projects image information, text information, and the like on the ceiling, headrest, door glass, windshield, and the like. The output unit 35 may include one or more projectors that project information onto one or more parts.
The display may be an image display (liquid crystal display, organic EL (Electro-Luminescence), etc.) installed on a dashboard, ceiling, headrest, door glass, windshield, or the like.
Note that the information processing device 21 of FIG. 1 does not have to have a function of displaying information, and may not have an output unit 35. Further, the function of displaying information may be included in the in-vehicle system 22.
The storage unit 36 includes a hard disk, a non-volatile memory, and the like. The storage unit 36 stores an application program executed by the CPU 31, data referenced when an arbitrary program is executed by the CPU 31, and the like.
The sensor 37 is an occupant observation sensor (described later) that observes an occupant (equipment user) and a seat-related sensor (described later) that detects the state of each seat (equipment) of the automobile. The sensor 37 is representative of the sensor included in the information processing device 21, and is not limited to a specific type and a specific number. The data acquired by the sensor 37 is supplied to the CPU 31 via the input / output interface 40 and the bus 39.
The communication unit 38 includes a network interface and the like. The communication unit 38 is connected to the in-vehicle system 22 so as to be able to communicate by wire or wirelessly. The CPU 31 supplies an instruction signal or the like, instructing the execution of a predetermined process or control, to a control unit included in the in-vehicle system 22 via the communicably connected communication unit 38.
また、CPU31は、車載システム22において使用されるセンサ23により得られたデータを、通信接続された車載システム22と通信部38とを介して取得することができる。したがって、情報処理装置21は、車載システム22で使用されるセンサ23を使用することができる。
Further, the CPU 31 can acquire the data obtained by the sensor 23 used in the in-vehicle system 22 via the in-vehicle system 22 and the communication unit 38 which are connected by communication. Therefore, the information processing device 21 can use the sensor 23 used in the in-vehicle system 22.
なお、通信部38は、無線LAN(Local Area Network)、Bluetooth(登録商標)、移動体通信等の通信インターフェースを有し、スマートフォーン等の端末や、インターネット等に通信接続できるようになっていてもよい。
The communication unit 38 has a communication interface such as a wireless LAN (Local Area Network), Bluetooth (registered trademark), and mobile communication, and can be connected to a terminal such as a smart phone or the Internet. May be good.
FIG. 2 is a block diagram showing an example of the functional configuration of the information processing device 21 of FIG. 1.
The information processing device 21 of FIG. 2 has an output unit 35, a processing unit 41, a seat-related sensor 42, and an occupant observation sensor 43.
The processing unit 41 sets the range observed by the occupant observation sensor 43 (hereinafter referred to as the sensing range) and the like based on the detection information (hereinafter referred to as seat information) detected by the seat-related sensor 42. The processing unit 41 also detects the state of the occupant based on the detection information (hereinafter referred to as observation information) detected by the occupant observation sensor 43, and executes response processing corresponding to the state of the occupant. The processing performed by the processing unit 41 will be described later.
The seat-related sensor 42 is a sensor provided as the sensor 23 or the sensor 37 in FIG. 1 and detects the state of a seat. The state of a seat also includes the state of ancillary equipment related to the seat (a reclining mechanism, a seat belt, a steering wheel, and the like). Accordingly, the seat-related sensor 42 includes a seating sensor, a reclining sensor, a seat belt sensor, a seat position sensor, a steering wheel sensor, and the like provided for each seat. Except for sensors provided only for some seats, the seat-related sensors 42 are provided for every seat, and the processing unit 41 acquires seat information from the seat-related sensors 42 of all the seats.
Note that the seat-related sensors 42 are basically sensors that the automobile is equipped with regardless of the present technology, and these sensors are used to estimate the presence or absence, position, orientation, posture, and the like of occupants. However, the seat-related sensor 42 may be a sensor installed separately from the sensors the automobile is equipped with. For example, the seat-related sensor 42 may be a human presence sensor.
The occupant observation sensor 43 is a sensor provided as the sensor 23 or the sensor 37 in FIG. 1 and observes an occupant. The occupant observation sensor 43 is, for example, an image sensor (camera) or a depth sensor that acquires spatial information. The image sensor may be a visible light camera or an infrared camera. The depth sensor may be a sensor that acquires three-dimensional information (depth information) of the imaged space with a stereo camera, a ToF (time of flight) sensor, or the like.
The occupant observation sensors 43 are provided, for example, one per row of the seat arrangement in the vehicle (front row, middle row, rear row, and the like), so that one occupant observation sensor 43 observes the occupants of a plurality of seats. When the seats are arranged in two or more rows, a plurality of occupant observation sensors 43 are provided, and the processing unit 41 acquires the observation information supplied from all the occupant observation sensors 43.
Note that one occupant observation sensor 43 may observe one person or three or more people instead of two. The occupant observation sensors 43 are not limited to a specific arrangement and may be arranged for each seat. The occupant observation sensor 43 may also observe only some of the occupants in the vehicle.
The occupant observation sensor 43 may be a sensor that acquires physical information (including biological information) such as the occupant's body temperature, heart rate, sleep state, wakefulness, and emotions.
Since the in-vehicle system 22 and the output unit 35 are as described with reference to FIG. 1, their description is omitted here.
(Explanation of processing unit 41)
The processing unit 41 includes a sensing setting unit 44, an occupant state detection unit 45, an operation recognition unit 46, and an operation response unit 47.
The sensing setting unit 44 sets the sensing range, the detection density, and the detection frequency of the occupant observation sensor 43 based on the seat information acquired from the seat-related sensor 42.
The sensing range of the occupant observation sensor 43 indicates the observation range that is effectively observed by the occupant observation sensor 43, or the range of observation information that is effectively detected. The sensing range of the occupant observation sensor 43 is set, within the maximum observable range of the occupant observation sensor 43, so as to exclude as much as possible the ranges in which no occupant exists. For example, the sensing range becomes smaller as the number of occupants decreases.
The detection density of the occupant observation sensor 43 represents the density of the observation points (detection points) at which observation information is effectively detected by the occupant observation sensor 43. For example, when the occupant observation sensor 43 is an image sensor, the detection density represents the pixel density of the captured image serving as the observation information.
The detection frequency of the occupant observation sensor 43 represents the frequency of detection effectively performed by the occupant observation sensor 43. When obtaining the observation information of the entire sensing range with the occupant observation sensor 43 is counted as one detection, the detection frequency represents the number of detections per unit time.
The sensing setting unit 44 sets one or both of the detection density and the detection frequency of the occupant observation sensor 43 to a larger value as the sensing range of the occupant observation sensor 43 becomes smaller, that is, as the amount of information acquired from the occupant observation sensor 43 in one detection becomes smaller. This improves the accuracy with which the occupant state detection unit 45, described later, detects the occupant's state, without increasing the processing load on the processing unit 41 and the like.
Further, the sensing setting unit 44 configures the operation of each occupant observation sensor 43 based on the set sensing range, detection density, and detection frequency, and changes the processing operation of each occupant observation sensor 43 accordingly.
The details of the processing of the sensing setting unit 44 will be described further below.
The occupant state detection unit 45 acquires the observation information supplied from the occupant observation sensors 43 and also acquires information on the sensing ranges set by the sensing setting unit 44.
The occupant state detection unit 45 detects the state of each occupant based on the observation information from each occupant observation sensor 43 and supplies it to the operation recognition unit 46.
Specifically, the occupant state detection unit 45 extracts the physical feature points of each occupant from the continuously obtained observation information, detects the positions, orientations, and movements of the hands, arms, eyes, face, and the like, and thereby detects the occupant's actions (gestures and the like). The occupant state detection unit 45 may also detect the occupant's facial expression and other physical information as the occupant's state.
Further, even when the body of one occupant is observed divided among a plurality of occupant observation sensors 43, the occupant state detection unit 45 can grasp, from the sensing range information supplied by the sensing setting unit 44, into which sensing ranges of which occupant observation sensors 43 the observation information for that occupant is divided and acquired (described later).
The occupant state detection unit 45 integrates (concatenates) the occupant's observation information acquired divided among the sensing ranges of the plurality of occupant observation sensors 43, based on the sensing range information from the sensing setting unit 44. As a result, the occupant state detection unit 45 can appropriately detect the state of an occupant even when that occupant is observed divided among a plurality of occupant observation sensors 43.
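As a rough illustration of this integration, the following Python sketch assumes that each occupant observation sensor reports its observation information as a depth image together with the position of its sensing range in a shared vehicle-frame grid; the class and function names (SensorObservation, stitch_occupant_observation) and the grid itself are illustrative assumptions, not elements defined above.

```python
# Illustrative sketch (assumed names): merging one occupant's observation data
# that is split across the sensing ranges of several occupant observation sensors.
from dataclasses import dataclass
import numpy as np

@dataclass
class SensorObservation:
    sensor_id: int
    depth: np.ndarray   # depth image limited to this sensor's sensing range
    x0: int             # top-left corner of that sensing range
    y0: int             # in an assumed shared vehicle-frame pixel grid

def stitch_occupant_observation(parts, frame_h, frame_w):
    """Paste each sensor's partial observation into one vehicle-frame canvas."""
    canvas = np.full((frame_h, frame_w), np.nan, dtype=np.float32)
    for p in parts:
        h, w = p.depth.shape
        canvas[p.y0:p.y0 + h, p.x0:p.x0 + w] = p.depth
    return canvas

# Usage: two sensors each cover half of the same occupant.
left = SensorObservation(1, np.ones((40, 30), np.float32), x0=0, y0=0)
right = SensorObservation(2, np.ones((40, 30), np.float32), x0=30, y0=0)
merged = stitch_occupant_observation([left, right], frame_h=40, frame_w=60)
```

The merged image can then be handled by the feature-point extraction described above as if it had come from a single sensor.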
The operation recognition unit 46 recognizes the operation associated in advance with the occupant state detected by the occupant state detection unit 45.
For example, suppose the occupant state detection unit 45 detects the action (state) of an occupant holding a hand up between the driver's seat and the passenger seat and drawing a circle with a finger, and supplies information indicating this occupant state to the operation recognition unit 46. The operation recognition unit 46 recognizes, as the operation associated in advance with this occupant state, for example, an operation of raising the volume of the audio system included in the in-vehicle system 22.
Suppose also that information indicating the state of a driver with a drowsy facial expression is supplied from the occupant state detection unit 45 to the operation recognition unit 46. The operation recognition unit 46 recognizes, as the operation associated in advance with this occupant state, for example, an operation of generating a warning sound, an operation of decelerating, or an operation of pulling over to the road shoulder.
The operation recognition unit 46 supplies the operation recognized for the occupant state from the occupant state detection unit 45 to the operation response unit 47.
The operation response unit 47 executes response processing corresponding to the operation recognized by the operation recognition unit 46. For example, suppose the operation recognition unit 46 recognizes the operation of raising the volume of the audio system as described above and supplies information indicating this operation to the operation response unit 47. As the response processing corresponding to this operation, the operation response unit 47 instructs the control unit of the audio system included in the in-vehicle system 22 to raise the volume.
Suppose also that the operation recognition unit 46 recognizes the deceleration operation as described above and information indicating this operation is supplied to the operation response unit 47. As the response processing corresponding to this operation, the operation response unit 47 instructs the vehicle control unit included in the in-vehicle system 22 to decelerate.
Further, when the operation recognition unit 46 recognizes an operation designating the start of an application, the operation response unit 47 starts the designated application, executes the processing of the application, and supplies the information presented by the application to the output unit 35 of FIG. 1. When an operation on the processing of an application that has already been started is recognized by the operation recognition unit 46, the operation response unit 47 executes the processing of the application corresponding to that operation.
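The association between detected occupant states and operations, and the dispatch of the corresponding response processing, can be pictured with the following minimal sketch; the gesture labels, operation identifiers, and the in_vehicle_system interface are hypothetical placeholders rather than names used in this description.

```python
# Illustrative sketch (assumed names): pre-associated states/operations and
# the response dispatch. The in_vehicle_system interface is hypothetical.
from typing import Optional

GESTURE_TO_OPERATION = {
    "circle_with_finger": "audio_volume_up",
    "drowsy_expression": "issue_drowsiness_warning",
}

def recognize_operation(occupant_state: str) -> Optional[str]:
    """Operation recognition: look up the operation associated with a state."""
    return GESTURE_TO_OPERATION.get(occupant_state)

def respond(operation: str, in_vehicle_system) -> None:
    """Operation response: forward an instruction to the relevant control unit."""
    if operation == "audio_volume_up":
        in_vehicle_system.audio.volume_up()
    elif operation == "issue_drowsiness_warning":
        in_vehicle_system.vehicle_control.play_warning_sound()
```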
FIG. 3 is a flowchart showing an example of the processing executed by the processing unit 41 of FIG. 2.
In step S11, the sensing setting unit 44 of the processing unit 41 performs the sensing setting process based on the seat information from the seat-related sensor 42. Through the sensing setting process, the sensing setting unit 44 sets the sensing range, the detection density, and the detection frequency of the occupant observation sensor 43. The process proceeds from step S11 to step S12.
In step S12, the occupant state detection unit 45 of the processing unit 41 detects the occupant's state based on the observation information from the occupant observation sensor 43. The occupant state detection unit 45 supplies the detected occupant state to the operation recognition unit 46. The process proceeds from step S12 to step S13.
In step S13, the operation recognition unit 46 of the processing unit 41 recognizes the operation associated with the occupant state supplied from the occupant state detection unit 45 in step S12. The operation recognition unit 46 supplies the recognized operation to the operation response unit 47. The process proceeds from step S13 to step S14.
In step S14, the operation response unit 47 of the processing unit 41 executes the response processing corresponding to the operation supplied from the operation recognition unit 46 in step S13. The process then returns from step S14 to step S11, and step S11 and the subsequent steps are repeated.
According to the information processing device 21 shown in FIGS. 1 to 3 above, an operation determined in advance for the state of an occupant of the automobile is automatically recognized, and response processing corresponding to the recognized operation is performed. As a result, the occupant can operate, for example, the switches and screens of a navigation system or an audio system with hand and arm movements alone, without directly touching them.
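The overall loop of FIG. 3 (steps S11 to S14) can be summarized as the following Python sketch; the unit objects and their method names are assumptions introduced only for illustration.

```python
# Illustrative sketch (assumed objects and methods) of the repeating loop of FIG. 3.
def processing_loop(sensing_setting, occupant_state_detection,
                    operation_recognition, operation_response,
                    seat_sensors, occupant_sensors):
    while True:
        # S11: set sensing range, detection density, and detection frequency.
        sensing_setting.configure(seat_sensors.read())
        # S12: detect each occupant's state from the observation information.
        states = occupant_state_detection.detect(occupant_sensors.read())
        # S13: recognize the operations pre-associated with those states.
        operations = [operation_recognition.recognize(s) for s in states]
        # S14: execute the response processing for each recognized operation.
        for op in operations:
            if op is not None:
                operation_response.execute(op)
```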
<Details of sensing setting processing of occupant observation sensor 43>
(Setting of sensing range of occupant observation sensor 43)
The details of the sensing setting process of the occupant observation sensor 43 performed by the sensing setting unit 44 of the processing unit 41 of FIG. 2 will be described.
The sensing setting unit 44 sets the sensing range of the occupant observation sensor 43 based on the seat information from the seat-related sensor 42.
Specifically, the sensing setting unit 44 detects the state of each seat based on the seat information from the seat-related sensors 42 installed for each seat in the vehicle.
For example, the seat-related sensors 42 include a seating sensor, a reclining sensor, a seat belt sensor, a seat position sensor, and the like.
As the state of the seat, the seating sensor detects whether or not an occupant is seated (a seated state or a non-seated state). That is, whether or not each seat, which is equipment provided in the vehicle, is being used is detected as the presence or absence of seating in that seat. The state of the seat also includes the form of the seat, and the reclining sensor detects whether the seat is in a reclining state (tilted state) or a non-reclining state (upright state). The state of the seat further includes the state of the ancillary equipment related to the seat, and the seat belt sensor detects whether the seat belt is worn or not worn. In addition, as the state of the seat, the seat position sensor detects the position and orientation of a seat that can be moved forward and backward or the like.
The seat-related sensors 42 may also include, for the driver's seat for example, a steering wheel sensor that detects whether or not the steering wheel is being gripped.
The position of an occupant present in the vehicle (the spatial range of the occupant's body) is estimated from the state of each seat detected from the seat information of the seat-related sensors 42. For example, when the sensing setting unit 44 detects the state of each seat from the seat information of only the seating sensor, which is a seat-related sensor 42, it is estimated that an occupant is present in the range around the seating surface of the seat when the seated state is detected. Conversely, when the non-seated state is detected, it is estimated that no occupant is present in the range around the seating surface of the seat. Note that the sensing setting unit 44 may estimate the position of the occupant based on the state of equipment other than the seats provided in the vehicle.
The sensing setting unit 44 uses partial sensing ranges to set the sensing range of each occupant observation sensor 43 so that the existence range of each occupant, estimated in this way from the state of each seat, becomes the sensing range of the occupant observation sensor 43.
Here, a partial sensing range is the observation range used when the occupant seated in a given seat is individually observed by the occupant observation sensor 43, or the range of observation information used when the observation information of the occupant seated in that seat is individually detected by the occupant observation sensor 43.
The partial sensing ranges are determined in advance for each seat and for each seat state, and are stored in the storage unit 36 or the ROM 33 of FIG. 1. For example, each seat is stored in association with a partial sensing range for observing an occupant seated in that seat. Also, for example, different partial sensing ranges are stored in association with the reclining state and the non-reclining state of each seat. However, the position of a partial sensing range may be changed according to the seat position or the like, and the size of the sensing range may also be changed as appropriate.
For example, when the sensing setting unit 44 detects the state of each seat from the seat information of only the seating sensor, which is a seat-related sensor 42, it validates the partial sensing range associated with a seat detected to be in the seated state (a seated seat), and invalidates the partial sensing range associated with a seat in the non-seated state.
The sensing setting unit 44 then sets, as the sensing range of each occupant observation sensor 43, the validated partial sensing ranges within the maximum sensing range of that occupant observation sensor 43. The maximum sensing range of each occupant observation sensor 43 is the sensing range obtained when all the partial sensing ranges included in the observation range of that occupant observation sensor 43 are validated. The maximum observable range of each occupant observation sensor 43 may also be used as the maximum sensing range.
One partial sensing range may also be divided between the observation ranges of a plurality of occupant observation sensors 43. In that case, when that partial sensing range is used as a sensing range, the portion of the partial sensing range corresponding to the observation range of each of the plurality of occupant observation sensors 43 becomes the sensing range of that sensor.
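A minimal sketch of this range-setting logic is shown below; it assumes that each seat's partial sensing range is stored as one rectangle per occupant observation sensor and that the seating sensors simply report whether each seat is occupied. The table contents and all identifiers are illustrative.

```python
# Illustrative sketch (assumed table and identifiers): validate the partial
# sensing range of each seated seat and collect the validated ranges per sensor.
PARTIAL_RANGES = {
    "seat_54A": {"sensor_62": (0, 0, 320, 480)},    # rectangle (x0, y0, x1, y1)
    "seat_54B": {"sensor_62": (320, 0, 640, 480)},
}

def set_sensing_ranges(seated):
    """Return, per sensor, the list of validated partial sensing ranges."""
    ranges = {}
    for seat, per_sensor in PARTIAL_RANGES.items():
        if not seated.get(seat, False):
            continue  # non-seated seat: its partial sensing range stays invalid
        for sensor, rect in per_sensor.items():
            ranges.setdefault(sensor, []).append(rect)
    return ranges

# Usage: only seat 54A is occupied, so sensor 62 senses only the left rectangle.
print(set_sensing_ranges({"seat_54A": True, "seat_54B": False}))
```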
(Setting of detection density and detection frequency of occupant observation sensor 43)
The sensing setting unit 44 sets the detection density and the detection frequency of each occupant observation sensor 43 based on the sensing range set for that occupant observation sensor 43.
Here, the detection density of the occupant observation sensor 43 represents the density of the observation points at which observation information (observation data) is detected by the occupant observation sensor 43 within its sensing range.
The detection density of the occupant observation sensor 43 is, for example, the pixel density of the captured image acquired by the image sensor when the occupant observation sensor 43 is an image sensor, and the pixel density of the depth image acquired by the depth sensor when the occupant observation sensor 43 is a depth sensor.
The detection frequency of the occupant observation sensor 43 is the number of times the observation information of the entire sensing range of the occupant observation sensor 43 is acquired by the occupant observation sensor 43 per unit time.
The detection frequency of the occupant observation sensor 43 is, for example, the number of frames of captured images acquired per unit time by the image sensor (the frame rate) when the occupant observation sensor 43 is an image sensor, and the number of frames of depth images acquired per unit time by the depth sensor (the frame rate) when the occupant observation sensor 43 is a depth sensor.
The higher the detection density or the detection frequency of the occupant observation sensor 43, the higher the accuracy with which the occupant's state is detected based on the observation information. On the other hand, the higher the detection density or the detection frequency of the occupant observation sensor 43, the larger the amount of observation information acquired by the occupant observation sensor 43 per unit time, which increases the processing load on the information processing device 21 (the processing unit 41 (the CPU 31 and the like)).
Meanwhile, the processing capacity of a device has a certain limit, and this is not limited to the information processing device 21 mounted on an automobile.
Therefore, the sensing setting unit 44 sets the detection density and the detection frequency of the occupant observation sensors 43 so that, in the state in which the information processing device 21 is expected to require the greatest processing capacity, that is, the state in which the sensing ranges of all the occupant observation sensors 43 are set to their maximum sensing ranges, the processing amount does not exceed the limit of the processing capacity of the information processing device 21. The detection density and the detection frequency of each occupant observation sensor 43 at this time are referred to as the reference detection density and the reference detection frequency. The reference detection density and the reference detection frequency do not necessarily have to be the same between the occupant observation sensors 43.
When the number of occupants in the vehicle is less than the maximum number and the sensing range of any occupant observation sensor 43 is smaller than its maximum sensing range, the sensing setting unit 44 makes at least one of the detection density and the detection frequency of that occupant observation sensor 43 larger than the reference detection density or the reference detection frequency.
For example, suppose that the amount of information in the sensing range of the occupant observation sensor 43 becomes 1/C of the amount of information in the maximum sensing range, and that the detection density is set to D times the reference detection density and the detection frequency to E times the reference detection frequency. In this case, the sensing setting unit 44 determines D and E so as to satisfy the condition D × E = C, and sets the detection density and the detection frequency accordingly. For example, when E is set to 1, the sensing setting unit 44 sets the detection frequency to 1 times the reference detection frequency and the detection density to C times the reference detection density.
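The condition D × E = C can be illustrated with the following small sketch, which derives the two multipliers from the ratio between the maximum sensing range and the current sensing range; fixing E and scaling D is simply the example policy described above, and all names are illustrative.

```python
# Illustrative sketch: keep the information acquired per unit time within the
# reference budget. If the current sensing range carries 1/C of the maximum
# range's information, the multipliers only need to satisfy D * E = C.
def density_frequency_multipliers(max_range_area, current_range_area,
                                  frequency_multiplier=1.0):
    c = max_range_area / current_range_area   # information ratio C
    e = frequency_multiplier                   # chosen detection frequency factor E
    d = c / e                                  # detection density factor D from D * E = C
    return d, e

# Usage: one of two equally sized seats is occupied, so C = 2; with E = 1 the
# detection density is doubled (D = 2), matching the example in the text.
print(density_frequency_multipliers(max_range_area=2.0, current_range_area=1.0))
```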
According to the sensing setting process of the sensing setting unit 44 described above, the sensing range of the occupant observation sensor 43 is set to an appropriate range with little waste according to the positions of the occupants present in the vehicle. In addition, according to the sensing range of the occupant observation sensor 43, the detection density or the detection frequency of the occupant observation sensor 43 is set appropriately within a range that does not exceed the processing capacity of the information processing device 21.
<Example of sensing setting processing by the sensing setting unit 44>
FIG. 4 is a flowchart showing a processing example of the sensing setting (the process of step S11 in FIG. 3) performed by the sensing setting unit 44.
In step S31, the sensing setting unit 44 detects the state of each seat based on the seat information from the seat-related sensors 42. The process proceeds from step S31 to step S32.
In step S32, the sensing setting unit 44 sets a variable n representing a seat number to 1. The variable n represents the seat number when the seats in the vehicle are numbered 1 to N, and the seat whose seat number is n is referred to as seat n. The process proceeds from step S32 to step S33.
In step S33, the sensing setting unit 44 validates the partial sensing range corresponding to the state of seat n detected in step S31. Specifically, when the state of seat n is the seated state, the sensing setting unit 44 validates the partial sensing range corresponding to seat n. When the state of seat n is the non-seated state, the partial sensing range corresponding to seat n is invalidated. The process proceeds from step S33 to step S34.
In step S34, the sensing setting unit 44 determines whether or not the variable n (seat n) is N.
If it is determined in step S34 that the variable n is not N, the process proceeds to step S35, and the sensing setting unit 44 increments the value of the variable n. The process returns to step S33, and steps S33 and S34 are repeated.
If it is determined in step S34 that the variable n is N, the process proceeds from step S34 to step S36.
In step S36, the sensing setting unit 44 sets the sensing range of each occupant observation sensor 43 based on the partial sensing ranges validated in step S33 (the valid partial sensing ranges). The process proceeds from step S36 to step S37.
In step S37, the sensing setting unit 44 sets the detection density and the detection frequency of each occupant observation sensor 43 based on the sensing range of each occupant observation sensor 43 set in step S36.
When the processing of step S37 is completed, the processing of this flowchart ends.
According to the sensing setting process of the sensing setting unit 44 described above, the sensing range of the occupant observation sensor 43 is set to an appropriate range with few unnecessary portions according to the positions of the occupants present in the vehicle. In addition, according to the sensing range of the occupant observation sensor 43, the detection density or the detection frequency of the occupant observation sensor 43 is set appropriately within a range that does not exceed the processing capacity of the information processing device 21.
<<Examples of sensing setting processing by the sensing setting unit 44>>
<Example 1>
FIG. 5 is a diagram illustrating the arrangement of the seats and the occupant observation sensors in Example 1 of the sensing setting process.
In FIG. 5, a front row seat portion 53 and a rear row seat portion 54 are arranged in the vehicle interior 52 of an automobile 51. In the front row seat portion 53, a seat 53A, which is the driver's seat, and a seat 53B, which is the passenger seat, are arranged. Seats 54A and 54B are arranged in the rear row seat portion 54.
On the ceiling of the vehicle interior 52, an occupant observation sensor 61 and an occupant observation sensor 62, which are one form of the occupant observation sensor 43 of FIG. 2, are arranged.
The occupant observation sensor 61 is arranged near the center of the overall width of the front row seat portion 53, above a position near the front edge of the seating surface of the front row seat portion 53. The occupant observation sensor 61 observes the occupant (driver) seated in the seat 53A and the occupant seated in the seat 53B from diagonally above on the front side.
The occupant observation sensor 62 is arranged near the center of the overall width of the rear row seat portion 54, above a position near the front edge of the seating surface of the rear row seat portion 54. The occupant observation sensor 62 observes the occupants seated in the seats 54A and 54B from diagonally above on the front side.
Although not shown in FIG. 5, a seating sensor, which is one form of the seat-related sensor 42, is installed in each of the seats 53A, 53B, 54A, and 54B.
In the following description of Example 1, of the occupant observation sensor 61 and the occupant observation sensor 62, attention is paid only to the occupant observation sensor 62, which observes the occupants of the rear row seat portion 54.
FIG. 6 is a diagram for explaining the sensing range of the occupant observation sensor 62 of FIG. 5 in Example 1.
In FIG. 6, occupants P1 and P2 are seated in the seats 54A and 54B of the rear row seat portion 54 of FIG. 5, respectively.
In the seats 54A and 54B, a seating sensor 91 and a seating sensor 92 are installed, respectively, as one form of the seat-related sensor 42 of FIG. 2.
The observation range of the occupant observation sensor 62 includes a partial sensing range 101-1, which is the sensing range used when only the occupant P1 seated in the seat 54A is observed (the partial sensing range 101-1 corresponding to the seated seat 54A), and a partial sensing range 101-2, which is the sensing range used when only the occupant P2 seated in the seat 54B is observed (the partial sensing range 101-2 corresponding to the seated seat 54B).
The maximum sensing range 101, which encloses the whole of the partial sensing ranges 101-1 and 101-2, is the range combining both the partial sensing range 101-1 and the partial sensing range 101-2.
These partial sensing ranges 101-1 and 101-2 are predetermined ranges and are stored in the storage unit 36 or the like of FIG. 1 in association with the seated states of the seats 54A and 54B, respectively.
In this state, the sensing setting unit 44 acquires ON signals from the seating sensors 91 and 92. The sensing setting unit 44 thereby detects that the seats 54A and 54B are in the seated state.
Subsequently, the sensing setting unit 44 validates the partial sensing range 101-1 corresponding to the seated seat 54A. The sensing setting unit 44 also validates the partial sensing range 101-2 corresponding to the seated seat 54B.
The sensing setting unit 44 then sets the maximum sensing range 101, which combines the validated partial sensing ranges 101-1 and 101-2, as the sensing range of the occupant observation sensor 62.
The occupant state detection unit 45 thereby acquires the observation information of the maximum sensing range 101 from the occupant observation sensor 43 and detects the states of the occupants P1 and P2. The detection density and the detection frequency of the occupant observation sensor 62 at this time are set to the predetermined reference detection density and reference detection frequency.
FIG. 7 is a diagram for explaining the sensing range of the occupant observation sensor 62 of FIG. 5 in Example 1.
In FIG. 7, parts corresponding to those in FIG. 6 are denoted by the same reference numerals, and their description is omitted.
FIG. 7 differs from FIG. 6 in that no occupant is seated in the seat 54B of the rear row seat portion 54.
In the state of FIG. 7, the sensing setting unit 44 acquires an ON signal from the seating sensor 91 and an OFF signal from the seating sensor 92. The sensing setting unit 44 thereby detects that the seat 54A is in the seated state and the seat 54B is in the non-seated state.
Subsequently, the sensing setting unit 44 validates the partial sensing range 101-1 corresponding to the seated seat 54A. On the other hand, the sensing setting unit 44 invalidates the partial sensing range 101-2 that would correspond to the seat 54B, which is in the non-seated state, if it were seated.
The sensing setting unit 44 then sets the validated partial sensing range 101-1 as the sensing range of the occupant observation sensor 62. The sensing range of the occupant observation sensor 62 in this case has about half the amount of information (area) of the maximum sensing range 101, which is the sensing range of the occupant observation sensor 62 when two occupants are present as in FIG. 6.
As a result, the sensing range of the occupant observation sensor 62 is limited to an appropriate range according to the occupant's existence range, and the burden of processing unnecessary information is reduced.
Furthermore, after limiting the sensing range, the sensing setting unit 44 can, for example, set the detection density of the occupant observation sensor 62 to twice the reference detection density. The reference detection density is, as described above, the detection density used when the sensing range of the occupant observation sensor 62 is the maximum sensing range. As a result, the density of the observation information that the occupant state detection unit 45 acquires from the occupant observation sensor 43 is doubled.
FIG. 8 is a diagram illustrating an example of the depth image (observation information) in the case of FIG. 6, in which the occupant observation sensor 62 is a depth sensor and its detection density is set to the reference detection density.
In FIG. 8, the occupant images P1A and P2A are depth images of the occupants P1 and P2 of FIG. 6, respectively.
FIG. 9 is a diagram illustrating an example of the depth image (observation information) in the case of FIG. 7, in which the occupant observation sensor 62 is a depth sensor and its detection density is set to twice the reference detection density.
In FIG. 9, the occupant image P1B is a depth image of the occupant P1 of FIG. 7.
Comparing FIG. 8 and FIG. 9, the occupant image P1B of FIG. 9 has a higher resolution than the occupant image P1A of FIG. 8. Therefore, by setting the detection density of the occupant observation sensor 62 to twice the reference detection density, the detection accuracy of the occupant state detection unit 45, which detects the occupant's state based on the observation information from the occupant observation sensor 62, is improved.
FIG. 10 is a flowchart showing an example of the processing performed by the sensing setting unit 44 in Example 1 of FIGS. 5 to 9.
In step S51, the sensing setting unit 44 detects the states of the seats 54A and 54B based on the seat information from the seating sensors 91 and 92, which are seat-related sensors. The process proceeds from step S51 to step S52.
In step S52, the sensing setting unit 44 sets the variable n representing the seat number to 1. The seat number of the seat 54A is 1, and the seat number of the seat 54B is 2. The process proceeds from step S52 to step S53.
In step S53, the sensing setting unit 44 determines whether or not seat n is in the seated state. Seat n represents the seat 54A when n is 1 and the seat 54B when n is 2.
If it is determined in step S53 that seat n is in the seated state, the process proceeds from step S53 to step S54, and the sensing setting unit 44 validates the partial sensing range corresponding to the seated state of seat n. The partial sensing range corresponding to seat n in the seated state when the variable n is 1 is the partial sensing range 101-1 shown in FIGS. 6 and 7. The partial sensing range corresponding to seat n in the seated state when the variable n is 2 is the partial sensing range 101-2 shown in FIGS. 6 and 7. The process proceeds from step S54 to step S56.
If it is determined in step S53 that seat n is not in the seated state, the process proceeds from step S53 to step S55, and the sensing setting unit 44 invalidates the partial sensing range corresponding to the seated state of seat n. The process proceeds from step S55 to step S56.
In step S56, the sensing setting unit 44 determines whether or not the variable n is 2.
If it is determined in step S56 that the variable n is not 2, the process proceeds to step S57, and the sensing setting unit 44 increments the value of the variable n. The process returns from step S57 to step S53.
If it is determined in step S56 that the variable n is 2, the process proceeds from step S56 to step S58.
In step S58, the sensing setting unit 44 sets the partial sensing ranges validated in step S54 as the sensing range of the occupant observation sensor 62. The process proceeds from step S58 to step S59.
In step S59, the sensing setting unit 44 determines whether or not both of the partial sensing ranges 101-1 and 101-2 corresponding to the two seats 54A and 54B are valid.
If it is determined in step S59 that both of the partial sensing ranges 101-1 and 101-2 corresponding to the two seats 54A and 54B are valid, the process proceeds from step S59 to step S60.
In step S60, the sensing setting unit 44 sets the detection density of the occupant observation sensor 62 to 1 times the reference detection density. Since no change in the detection frequency is assumed in the processing of this flowchart, the sensing setting unit 44 sets the detection frequency of the occupant observation sensor 62 to 1 times the reference detection frequency. When the processing of step S60 is completed, the processing of this flowchart ends.
If it is determined in step S59 that both of the partial sensing ranges 101-1 and 101-2 corresponding to the two seats 54A and 54B are not valid, that is, if at least one of the partial sensing ranges 101-1 and 101-2 is invalid, the process proceeds from step S59 to step S61.
In step S61, the sensing setting unit 44 determines whether or not the partial sensing range 101-1 or 101-2 corresponding to one of the seats 54A and 54B is valid.
If it is determined in step S61 that the partial sensing range 101-1 or 101-2 corresponding to one of the seats 54A and 54B is valid, the process proceeds from step S61 to step S62.
In step S62, the sensing setting unit 44 sets the detection density of the occupant observation sensor 62 to twice the reference detection density and sets the detection frequency of the occupant observation sensor 62 to 1 times the reference detection frequency. When the processing of step S62 is completed, the processing of this flowchart ends.
If the sensing setting unit 44 determines in step S61 that neither the partial sensing range 101-1 nor 101-2 corresponding to one of the seats 54A and 54B is valid, both partial sensing ranges 101-1 and 101-2 are invalid, and the occupant observation sensor 62 does not perform detection of observation information at all. In that case, the sensing setting unit 44 does not set the detection density or the detection frequency of the occupant observation sensor 62, and the processing of this flowchart ends.
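The decisions of FIG. 10 for the occupant observation sensor 62 (two rear seats, with the density doubled when only one seat is occupied) can be condensed into the following sketch; the helper name and range labels are illustrative.

```python
# Illustrative sketch (assumed helper name) of the FIG. 10 decisions for the
# occupant observation sensor 62.
def configure_sensor_62(seat_54a_occupied, seat_54b_occupied):
    valid_ranges = []
    if seat_54a_occupied:
        valid_ranges.append("partial_range_101_1")
    if seat_54b_occupied:
        valid_ranges.append("partial_range_101_2")

    if len(valid_ranges) == 2:        # step S60: both seats occupied
        density, frequency = 1.0, 1.0
    elif len(valid_ranges) == 1:      # step S62: only one seat occupied
        density, frequency = 2.0, 1.0
    else:                             # step S61 "No": nothing to sense
        density = frequency = None
    return valid_ranges, density, frequency

# FIG. 6 state (both seats occupied) vs. FIG. 7 state (only seat 54A occupied).
print(configure_sensor_62(True, True))    # two valid ranges, 1x density
print(configure_sensor_62(True, False))   # one valid range, 2x density
```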
According to Example 1 described above, the sensing range of the occupant observation sensor 62 is limited to an appropriate range according to the existence range of the occupants, and the burden of processing unnecessary information is reduced. In addition, since the detection density of the occupant observation sensor 62 is set to twice the reference detection density when there is only one occupant, the detection accuracy of the occupant state detection unit 45, which detects the occupant's state, and the recognition accuracy of the operation recognition unit 46, which recognizes operations, are improved.
Note that, in Example 1, the sensing setting unit 44 may change the detection frequency of the occupant observation sensor 62 according to the sensing range of the occupant observation sensor 62.
FIG. 11 is a flowchart showing an example of the processing performed by the sensing setting unit 44 in Example 1 of FIGS. 5 to 9, for the case where the detection frequency is changed according to the sensing range.
In the flowchart of FIG. 11, steps S81 to S89 are common to steps S51 to S59 of FIG. 10, and their description is therefore omitted.
If it is determined in step S89 that both of the partial sensing ranges 101-1 and 101-2 corresponding to the two seats 54A and 54B are valid, the process proceeds from step S89 to step S90.
In step S90, the sensing setting unit 44 sets the detection frequency of the occupant observation sensor 62 to 1 times the reference detection frequency. Since no change in the detection density is assumed in the processing of this flowchart, the sensing setting unit 44 sets the detection density of the occupant observation sensor 62 to 1 times the reference detection density. When the processing of step S90 is completed, the processing of this flowchart ends.
If it is determined in step S89 that both of the partial sensing ranges 101-1 and 101-2 corresponding to the two seats 54A and 54B are not valid, that is, if at least one of the partial sensing ranges 101-1 and 101-2 is invalid, the process proceeds from step S89 to step S91.
In step S91, the sensing setting unit 44 determines whether or not the partial sensing range 101-1 or 101-2 corresponding to one of the seats 54A and 54B is valid.
If it is determined in step S91 that the partial sensing range 101-1 or 101-2 corresponding to one of the seats 54A and 54B is valid, the process proceeds from step S91 to step S92.
In step S92, the sensing setting unit 44 sets the detection frequency of the occupant observation sensor 62 to twice the reference detection frequency and sets the detection density of the occupant observation sensor 62 to 1 times the reference detection density. When the processing of step S92 is completed, the processing of this flowchart ends.
If the sensing setting unit 44 determines in step S91 that neither the partial sensing range 101-1 nor 101-2 corresponding to one of the seats 54A and 54B is valid, both partial sensing ranges 101-1 and 101-2 are invalid, and the occupant observation sensor 62 does not perform detection of observation information at all. In that case, the sensing setting unit 44 does not set the detection density or the detection frequency of the occupant observation sensor 62, and the processing of this flowchart ends.
According to the first embodiment described above, the sensing range of the occupant observation sensor 62 is limited to an appropriate range corresponding to the range in which occupants are present, and the processing burden of unnecessary information is reduced. Further, since the detection frequency of the occupant observation sensor 62 is set to twice the reference detection frequency when there is only one occupant, the detection accuracy of the occupant state detection unit 45, which detects the state of the occupant, and the recognition accuracy of the operation recognition unit 46, which recognizes operations, are improved.
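For reference, the branching of FIG. 11 can be condensed into a short sketch. The following Python snippet is only an illustrative reading of the flowchart, not the patented implementation; the names SensorConfig and set_frequency_by_range, and the treatment of the multipliers as plain floats, are assumptions introduced here.

from dataclasses import dataclass
from typing import Optional

@dataclass
class SensorConfig:
    # Multipliers relative to the basic detection density / frequency; None means "not set".
    density: Optional[float] = None
    frequency: Optional[float] = None

def set_frequency_by_range(range_54a_valid: bool, range_54b_valid: bool) -> SensorConfig:
    """Illustrative reading of steps S89 to S92 of FIG. 11."""
    if range_54a_valid and range_54b_valid:
        # Both partial ranges valid: keep the basic frequency and density (step S90).
        return SensorConfig(density=1.0, frequency=1.0)
    if range_54a_valid or range_54b_valid:
        # Exactly one partial range valid: the range is reduced, so the frequency is doubled (step S92).
        return SensorConfig(density=1.0, frequency=2.0)
    # No partial range valid: nothing is observed, so neither value is set.
    return SensorConfig()

# Example: only seat 54A is occupied.
print(set_frequency_by_range(True, False))   # SensorConfig(density=1.0, frequency=2.0)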
Note that, in the first embodiment described above, the sensing setting unit 44 may change both the detection density and the detection frequency of the occupant observation sensor 62 according to the sensing range of the occupant observation sensor 62.
<Example 2>
A second embodiment of the sensing setting process performed by the sensing setting unit 44 will be described.
In the second embodiment, the arrangement of the seats and the occupant observation sensors in the automobile, and the setting of the sensing range of the occupant observation sensor when two occupants (adults) are seated in the rear row seat portion 54, are the same as in FIGS. 5 and 6 of the first embodiment. However, the seating sensors 91 and 92 of FIG. 6 differ from those of the first embodiment in that, in the second embodiment, they detect not only whether an occupant is seated but also the occupant's weight. The remaining parts are the same as in the first embodiment, so their description is omitted.
FIG. 12 is a diagram for explaining the sensing range of the occupant observation sensor 62 of FIG. 5 in the second embodiment.
Note that, in FIG. 12, parts corresponding to those in FIG. 6 are designated by the same reference numerals, and their description is omitted.
FIG. 12 differs from FIG. 6 in that no occupant is seated in the seat 54B of the rear row seat portion 54 and in that the occupant seated in the seat 54A is a lightweight occupant P3 whose weight is equal to or less than a predetermined weight g0 and who is therefore presumed to be a child. The occupants P1 and P2 of FIG. 6 are assumed to be standard occupants heavier than the predetermined weight g0.
In the state of FIG. 12, the sensing setting unit 44 acquires from the seating sensor 91 the information that the seating sensor 91 is on and that the detected weight is equal to or less than the weight g0. The sensing setting unit 44 also acquires from the seating sensor 92 the information that the seating sensor 92 is off.
As a result, the sensing setting unit 44 detects that the seat 54A is in the seated state and that the seat 54B is in the non-seated state. Further, since the weight of the occupant P3 seated in the seat 54A is equal to or less than the weight g0, the sensing setting unit 44 detects that the occupant P3 is a lightweight occupant. In this case, the sensing setting unit 44 detects the state of the seat 54A as the seated state of a lightweight occupant.
Subsequently, the sensing setting unit 44 enables the partial sensing range 101-3 corresponding to the case where the seat 54A is in the seated state of a lightweight occupant. On the other hand, the sensing setting unit 44 invalidates the partial sensing ranges 101-2 and 101-4, which correspond to the case where the non-seated seat 54B is in a seated state.
Here, the partial sensing range 101-3 is the sensing range of the occupant observation sensor 62 when only the lightweight occupant seated in the seat 54A is to be observed. Comparing the partial sensing range 101-3 with the partial sensing range 101-1, which is the sensing range of the occupant observation sensor 62 when a standard occupant seated in the seat 54A is to be observed, the vertical width of the partial sensing range 101-3 is particularly small because a lightweight occupant is presumed to be shorter than a standard occupant.
The partial sensing range 101-4 is the sensing range of the occupant observation sensor 62 when only a lightweight occupant seated in the seat 54B is to be observed. As with the partial sensing range 101-3, the vertical width of the partial sensing range 101-4 is particularly small compared with the partial sensing range 101-2, which is the sensing range of the occupant observation sensor 62 when a standard occupant seated in the seat 54B is to be observed.
These partial sensing ranges 101-3 and 101-4 are predetermined ranges, and are stored in the storage unit 36 and the like of FIG. 1 in association with the seated state of a lightweight occupant in the seat 54A and the seated state of a lightweight occupant in the seat 54B, respectively.
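A minimal sketch of how such an association could be held and looked up, assuming the stored partial ranges are simple rectangles keyed by seat and occupant class. All names, coordinates, and the threshold value used for g0 below are illustrative assumptions, not values taken from the embodiment.

# Hypothetical stand-in for the association held in the storage unit 36.
# Rectangles are (x_min, y_min, x_max, y_max) in an arbitrary sensor coordinate frame.
PARTIAL_RANGES = {
    ("54A", "standard"):    (0.0, 0.0, 0.5, 1.0),  # corresponds to 101-1
    ("54B", "standard"):    (0.5, 0.0, 1.0, 1.0),  # corresponds to 101-2
    ("54A", "lightweight"): (0.0, 0.0, 0.5, 0.8),  # corresponds to 101-3 (reduced vertical width)
    ("54B", "lightweight"): (0.5, 0.0, 1.0, 0.8),  # corresponds to 101-4
}

def lookup_partial_range(seat: str, weight: float, g0: float = 30.0):
    """Classify the seated occupant by weight and return the stored partial range."""
    occupant_class = "lightweight" if weight <= g0 else "standard"
    return PARTIAL_RANGES[(seat, occupant_class)]

print(lookup_partial_range("54A", weight=20.0))   # the lightweight range for seat 54A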
When the sensing setting unit 44 enables the partial sensing range 101-3 associated with the seated state of the lightweight occupant in the seat 54A, it sets the enabled partial sensing range 101-3 as the sensing range of the occupant observation sensor 62. As a result, the sensing range becomes smaller than when the partial sensing range 101-1 is set as the sensing range of the occupant observation sensor 62.
In this case, the sensing range of the occupant observation sensor 62 corresponds to, for example, an information amount (area) of about two-fifths of the maximum sensing range 101.
As a result, the sensing range of the occupant observation sensor 62 is limited to an appropriate range that also takes the occupant's height and the like into consideration, and the processing burden of unnecessary information is reduced.
Further, after limiting the sensing range, the sensing setting unit 44 sets, for example, the detection density of the occupant observation sensor 62 to 2.5 times the reference detection density. As a result, the density of the observation information that the occupant state detection unit 45 acquires from the occupant observation sensor 43 becomes 2.5 times higher.
FIG. 13 is a diagram illustrating a depth image (observation information) obtained in the state of FIG. 12 when the occupant observation sensor 62 is a depth sensor and its detection density is set to 2.5 times the reference detection density.
In FIG. 13, the lightweight occupant image P3A is a depth image of the lightweight occupant P3 of FIG. 12.
Comparing the lightweight occupant image P3A of FIG. 13 with the occupant image P1B of FIG. 9, which was obtained when the detection density of the occupant observation sensor 62 was set to twice the reference detection density, the lightweight occupant image P3A of FIG. 13 has the higher resolution. Therefore, by setting the detection density of the occupant observation sensor 62 to 2.5 times the reference detection density, the detection accuracy of the occupant state in the occupant state detection unit 45 and the recognition accuracy of operations in the operation recognition unit 46 are improved.
FIG. 14 is a flowchart showing an example of processing performed by the sensing setting unit 44 in the second embodiment of FIGS. 12 and 13.
In step S111, the sensing setting unit 44 detects the state of each of the seats 54A and 54B based on the seat information from the seating sensors 91 and 92, which are seat-related sensors. The process proceeds from step S111 to step S112.
In step S112, the sensing setting unit 44 sets a variable n representing the seat number to 1. The seat number of the seat 54A is 1, and the seat number of the seat 54B is 2. The process proceeds from step S112 to step S113.
In step S113, the sensing setting unit 44 detects whether or not the seat n is in the seated state. The seat n represents the seat 54A when n is 1 and the seat 54B when n is 2.
If it is determined in step S113 that the seat n is in the seated state, the process proceeds from step S113 to step S114, and the sensing setting unit 44 determines whether or not the seat n is in the seated state of a lightweight occupant.
If it is determined in step S114 that the seat n is not in the seated state of a lightweight occupant, the process proceeds from step S114 to step S115, and the sensing setting unit 44 enables the partial sensing range corresponding to the seated state of a non-lightweight occupant (standard occupant) in the seat n. When the variable n is 1, this partial sensing range is the partial sensing range 101-1 shown in FIG. 12; when the variable n is 2, it is the partial sensing range 101-2 shown in FIG. 12. The process proceeds from step S115 to step S118.
If it is determined in step S114 that the seat n is in the seated state of a lightweight occupant, the process proceeds from step S114 to step S116, and the sensing setting unit 44 enables the partial sensing range corresponding to the seated state of a lightweight occupant in the seat n. When the variable n is 1, this partial sensing range is the partial sensing range 101-3 shown in FIG. 12; when the variable n is 2, it is the partial sensing range 101-4 shown in FIG. 12. The process proceeds from step S116 to step S118.
If it is determined in step S113 that the seat n is not in the seated state, the process proceeds from step S113 to step S117, and the sensing setting unit 44 invalidates the partial sensing ranges corresponding to all states of the seat n. The process proceeds from step S117 to step S118.
In step S118, the sensing setting unit 44 determines whether or not the variable n is 2.
If it is determined in step S118 that the variable n is not 2, the process proceeds to step S119, and the sensing setting unit 44 increments the value of the variable n. The process returns from step S119 to step S113.
If it is determined in step S118 that the variable n is 2, the process proceeds from step S118 to step S120.
In step S120, the sensing setting unit 44 sets the partial sensing ranges enabled in steps S115 and S116 as the sensing range of the occupant observation sensor 62. The process proceeds from step S120 to step S121.
In step S121, the sensing setting unit 44 determines whether or not the partial sensing ranges corresponding to the two seats 54A and 54B are both valid.
If it is determined in step S121 that the partial sensing ranges corresponding to the two seats 54A and 54B are both valid, the process proceeds from step S121 to step S122, and the sensing setting unit 44 determines whether or not both of the partial sensing ranges corresponding to the two seats 54A and 54B are partial sensing ranges corresponding to the seated state of a lightweight occupant.
If it is determined in step S122 that the partial sensing ranges corresponding to the two seats 54A and 54B are not both partial sensing ranges corresponding to the seated state of a lightweight occupant, the process proceeds from step S122 to step S123.
In step S123, the sensing setting unit 44 determines whether or not one of the partial sensing ranges corresponding to the two seats 54A and 54B is a partial sensing range corresponding to a lightweight occupant.
If it is determined in step S123 that neither of the partial sensing ranges corresponding to the two seats 54A and 54B is a partial sensing range corresponding to a lightweight occupant (that is, both seats 54A and 54B are in the seated state of non-lightweight occupants), the process proceeds from step S123 to step S124.
In step S124, the sensing setting unit 44 sets the detection density of the occupant observation sensor 62 to 1 times the reference detection density. Then, the processing of this flowchart ends.
If it is determined in step S123 that one of the partial sensing ranges corresponding to the two seats 54A and 54B is a partial sensing range corresponding to a lightweight occupant (that is, one seat is in the seated state of a lightweight occupant and the other seat is in the seated state of a non-lightweight occupant), the process proceeds from step S123 to step S125.
In step S125, the sensing setting unit 44 sets the detection density of the occupant observation sensor 62 to 1 times (more precisely, 10/9 times) the reference detection density. Then, the processing of this flowchart ends.
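The factor of 10/9 admits a simple reading, offered here only as an illustrative interpretation and not stated explicitly in the embodiment: if a standard partial sensing range is taken to cover about one half of the maximum sensing range 101 and a lightweight partial sensing range about two fifths of it (the figure mentioned above), then one standard range plus one lightweight range together cover 1/2 + 2/5 = 9/10 of the maximum range, and a detection density of 1 / (9/10) = 10/9 times the reference keeps the total amount of observation information roughly equal to that of the full range at the reference density. The same reading yields 1 / (2/5 + 2/5) = 1.25 (close to the 1.3 times of step S126 below), 1 / (1/2) = 2 times (step S129), and 1 / (2/5) = 2.5 times (step S130).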
If it is determined in step S122 that both of the partial sensing ranges corresponding to the two seats 54A and 54B are partial sensing ranges corresponding to the seated state of a lightweight occupant, the process proceeds from step S122 to step S126.
In step S126, the sensing setting unit 44 sets the detection density of the occupant observation sensor 62 to 1.3 times the reference detection density. Then, the processing of this flowchart ends.
If it is determined in step S121 that the partial sensing ranges corresponding to the two seats 54A and 54B are not both valid, the process proceeds from step S121 to step S127.
In step S127, the sensing setting unit 44 determines whether or not the partial sensing range corresponding to one seat 54A or 54B is valid.
If it is determined in step S127 that the partial sensing range corresponding to one seat 54A or 54B is valid, the process proceeds from step S127 to step S128.
In step S128, the sensing setting unit 44 determines whether or not the partial sensing range corresponding to the one seat 54A or 54B is a partial sensing range corresponding to a lightweight occupant.
If it is determined in step S128 that the partial sensing range corresponding to the one seat 54A or 54B is not a partial sensing range corresponding to a lightweight occupant (that is, one seat is in the seated state of a non-lightweight occupant and the other seat is in the non-seated state), the process proceeds from step S128 to step S129.
In step S129, the sensing setting unit 44 sets the detection density of the occupant observation sensor 62 to twice the reference detection density. Then, the processing of this flowchart ends.
If it is determined in step S128 that the partial sensing range corresponding to the one seat 54A or 54B is a partial sensing range corresponding to a lightweight occupant, the process proceeds from step S128 to step S130.
In step S130, the sensing setting unit 44 sets the detection density of the occupant observation sensor 62 to 2.5 times the reference detection density. Then, the processing of this flowchart ends.
If it is determined in step S127 that no partial sensing range corresponding to one seat 54A or 54B is valid, none of the partial sensing ranges is valid and the occupant observation sensor 62 does not detect any observation information. Therefore, the sensing setting unit 44 does not set the detection density or the detection frequency of the occupant observation sensor 62. Then, the processing of this flowchart ends.
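As with FIG. 11, the density-setting branches of FIG. 14 can be condensed into a short sketch. The snippet below is only an illustrative reading of the flowchart under assumed names (SeatState, choose_density, and so on), not the patented implementation; the multipliers are those stated in steps S124 to S130.

from dataclasses import dataclass
from typing import Optional

@dataclass
class SeatState:
    seated: bool
    lightweight: bool = False   # True when the detected weight is at or below g0

def choose_density(seat_a: SeatState, seat_b: SeatState) -> Optional[float]:
    """Illustrative reading of steps S121 to S130 of FIG. 14.

    Returns the detection-density multiplier relative to the reference density,
    or None when no partial sensing range is valid (nothing is observed).
    """
    valid = [s for s in (seat_a, seat_b) if s.seated]
    if len(valid) == 2:
        if all(s.lightweight for s in valid):
            return 1.3          # both ranges are lightweight ranges (step S126)
        if any(s.lightweight for s in valid):
            return 10 / 9       # one lightweight and one standard range (step S125)
        return 1.0              # two standard ranges (step S124)
    if len(valid) == 1:
        return 2.5 if valid[0].lightweight else 2.0   # steps S130 / S129
    return None                 # no valid range: the density is not set

# Example: seat 54A holds a lightweight occupant, seat 54B is empty.
print(choose_density(SeatState(seated=True, lightweight=True), SeatState(seated=False)))  # 2.5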
According to the second embodiment of the sensing setting described above, the sensing range of the occupant observation sensor 62 is appropriately set according to the positions of the occupants present in the vehicle and their heights. Further, the detection density of the occupant observation sensor 62 is appropriately set according to the sensing range of the occupant observation sensor 62, within a range that does not exceed the processing capacity of the information processing device 21. As in the first embodiment, in the second embodiment the sensing setting unit 44 may change the detection frequency instead of the detection density of the occupant observation sensor 62 according to the sensing range, or may change both the detection density and the detection frequency.
Further, in the second embodiment, whether or not an occupant is a lightweight occupant is determined by weight; however, a sensor that detects the height of a seated occupant may be provided as the seat-related sensor 42, and whether or not the occupant is a child may be determined based on the detected height, or the vertical width of the sensing range (the partial sensing ranges 101-3 and 101-4) may be determined based on the detected height.
<Example 3>
A third embodiment of the sensing setting process performed by the sensing setting unit 44 will be described.
In the third embodiment, the arrangement of the seats and the occupant observation sensors in the automobile is the same as in FIG. 5 of the first embodiment. However, the third embodiment differs from the first embodiment in that, in addition to the seating sensor, a reclining sensor and a seatbelt sensor are used as forms of the seat-related sensor 42 of FIG. 2.
FIG. 15 is a diagram for explaining the sensing range of the occupant observation sensor 62 of FIG. 5 in the third embodiment.
In FIG. 15, the seat 54B is the seat on the left side of the rear row seat portion 54 in FIG. 5. An occupant P4 is seated in the seat 54B, and the seat 54B is in a reclining state with the backrest tilted backward. It is also assumed that the seatbelt is not worn (seatbelt-unfastened state).
A seating sensor 92 is arranged on the seat 54B as in the first embodiment. A reclining sensor 93 is also arranged on the seat 54B. Further, a seatbelt sensor 94 is arranged at the seatbelt attachment/detachment portion of the seat 54B.
In the state of FIG. 15, the sensing setting unit 44 acquires sensor signals from each of the seating sensor 92, the reclining sensor 93, and the seatbelt sensor 94 as seat information from the seat-related sensor 42. As a result, the sensing setting unit 44 detects that the seat 54B is in the seated state, that the seat 54B is in the reclining state, and that the seatbelt is not worn. Note that the seat 54A of FIG. 5, which is not shown in FIG. 15, is assumed to be in the non-seated state.
Subsequently, the sensing setting unit 44 enables the partial sensing range 101-5 associated with the detected state of the seat 54B. On the other hand, it invalidates all partial sensing ranges associated with the case where the non-seated seat 54A is in a seated state.
Then, the sensing setting unit 44 sets the enabled partial sensing range 101-5 as the sensing range of the occupant observation sensor 62. In this case, the sensing range of the occupant observation sensor 62 is about half of the maximum sensing range.
The sensing setting unit 44 then sets, for example, the detection density of the occupant observation sensor 62 to twice the reference detection density.
FIG. 16 is a diagram for explaining the sensing range of the occupant observation sensor 62 in a state where the seatbelt is worn, in contrast to the state of FIG. 15.
Note that, in FIG. 16, parts corresponding to those in FIG. 15 are designated by the same reference numerals, and their description is omitted.
In FIG. 16, it is assumed that the occupant P4 is wearing the seatbelt. In this case, the occupant P4 is considered unable to raise the upper body or the whole body, so the sensing range can be limited to a range of heights near the seating surface as compared with the case of FIG. 15.
In FIG. 16, the sensing setting unit 44 acquires sensor signals from each of the seating sensor 92, the reclining sensor 93, and the seatbelt sensor 94 as seat information from the seat-related sensor 42. As a result, the sensing setting unit 44 detects that the seat 54B is in the seated state, that the seat 54B is in the reclining state, and that the seatbelt is in the fastened state.
The sensing setting unit 44 enables the partial sensing range 101-6 associated with the detected state of FIG. 16. Here, the partial sensing range 101-6 is smaller than the partial sensing range 101-5 for the seatbelt-unfastened state.
Therefore, when the seatbelt is fastened, the detection density or the detection frequency of the occupant observation sensor 62 can be made higher than when the seatbelt is not fastened, so that the detection accuracy of the occupant state in the occupant state detection unit 45 and the recognition accuracy of operations in the operation recognition unit 46 can be improved.
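A compact sketch of how the three seat signals could select a stored partial range, in the spirit of FIGS. 15 and 16. The state keys and the density multiplier used for the seatbelt-fastened case are illustrative assumptions (the embodiment only states that the density or frequency can be made higher); the range identifiers follow the reference numerals above.

# Hypothetical mapping from the detected state of seat 54B to a stored partial range
# and a detection-density multiplier.  Only the two states of FIGS. 15 and 16 are populated.
RANGE_BY_STATE = {
    # (seated, reclined, belt_fastened): (partial range id, density multiplier)
    (True, True, False): ("101-5", 2.0),   # FIG. 15: reclined, seatbelt not fastened
    (True, True, True):  ("101-6", 2.5),   # FIG. 16: reclined, seatbelt fastened -> smaller range (assumed multiplier)
}

def select_range(seated: bool, reclined: bool, belt_fastened: bool):
    if not seated:
        return None            # no occupant: no partial range is enabled
    return RANGE_BY_STATE.get((seated, reclined, belt_fastened))

print(select_range(True, True, True))   # ('101-6', 2.5)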
<Example 4>
FIG. 17 is a diagram for explaining the sensing ranges of the occupant observation sensors 61 and 62 in the fourth embodiment of the sensing setting process.
In FIG. 17, the seat 53B is the seat on the left side of the front row seat portion 53 in FIG. 5. An occupant P5 is seated in the seat 53B, and the seat 53B is in a reclining state with the backrest tilted backward. It is also assumed that the seatbelt is not worn (seatbelt-unfastened state).
A seating sensor 95 is arranged on the seat 53B as in the first embodiment. A reclining sensor 96 is also arranged on the seat 53B. Further, a seatbelt sensor 97 is arranged at the seatbelt attachment/detachment portion of the seat 53B.
In the state of FIG. 17, the sensing setting unit 44 acquires sensor signals from each of the seating sensor 95, the reclining sensor 96, and the seatbelt sensor 97 as seat information from the seat-related sensor 42. As a result, the sensing setting unit 44 detects that the seat 53B is in the seated state, that the seat 53B is in the reclining state, and that the seatbelt is not worn. Note that the seat 53A of FIG. 5, which is not shown in FIG. 17, is assumed to be in the non-seated state.
Subsequently, the sensing setting unit 44 enables the partial sensing range 103 corresponding to the detected state of the seat 53B. On the other hand, it invalidates all partial sensing ranges associated with the case where the non-seated seat 53A is in a seated state.
Here, when the seat 53B is in the reclining state, the occupant P5 extends over both the observation range of the occupant observation sensor 61 and the observation range of the occupant observation sensor 62. In this case, the occupant P5 cannot be observed by only one of the occupant observation sensor 61 and the occupant observation sensor 62.
Therefore, the partial sensing range 103 corresponding to such a state of the seat 53B is set as a range included in the observation range of the occupant observation sensor 61 and the observation range of the occupant observation sensor 62. Within the partial sensing range 103, the portion included in the observation range of the occupant observation sensor 61 is separated as a partial sensing range 103A, and the portion included in the observation range of the occupant observation sensor 62 is separated as a partial sensing range 103B.
The sensing setting unit 44 sets one of the separated portions of the enabled partial sensing range 103, namely the partial sensing range 103A, as the sensing range of the occupant observation sensor 61. The sensing setting unit 44 also sets the other separated portion, the partial sensing range 103B, as the sensing range of the occupant observation sensor 62.
Then, when the occupant state detection unit 45 acquires the observation information of the sensing ranges of the occupant observation sensors 61 and 62, it combines (integrates) the observation information of the partial sensing range 103A from the occupant observation sensor 61 with the observation information of the partial sensing range 103B from the occupant observation sensor 62, thereby acquiring the observation information of the partial sensing range 103.
FIG. 18 is a diagram illustrating the observation information (depth images) acquired from each of the occupant observation sensors 61 and 62 in the state of FIG. 17 when the occupant observation sensors 61 and 62 are depth sensors.
In FIG. 18, the occupant image P5A is a depth image extracted from the partial sensing range 103A within the sensing range of the occupant observation sensor 61.
The occupant image P5B is a depth image acquired from the partial sensing range 103B within the sensing range of the occupant observation sensor 62.
The occupant state detection unit 45 acquires the occupant image P5A by extracting, from the depth image acquired from the occupant observation sensor 61, the depth image of the range corresponding to the partial sensing range 103A. Similarly, the occupant state detection unit 45 acquires the occupant image P5B by extracting, from the depth image acquired from the occupant observation sensor 62, the depth image of the range corresponding to the partial sensing range 103B. Then, the occupant state detection unit 45 integrates (connects) the extracted occupant image P5A and occupant image P5B, thereby acquiring a depth image of the occupant P5 over the partial sensing range 103 of FIG. 17.
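The extraction-and-connection step described here can be pictured with a small NumPy sketch. It assumes, purely for illustration, that each sensor returns a depth image as a 2D array and that the partial sensing ranges 103A and 103B map to known row and column slices of those images; the crop boundaries below are invented values.

import numpy as np

def crop(depth: np.ndarray, rows: slice, cols: slice) -> np.ndarray:
    """Extract the part of a depth image that corresponds to a partial sensing range."""
    return depth[rows, cols]

# Hypothetical depth images from the two sensors (values in metres).
depth_61 = np.random.rand(120, 160)   # occupant observation sensor 61
depth_62 = np.random.rand(120, 160)   # occupant observation sensor 62

# Invented slices standing in for the ranges 103A and 103B.
p5a = crop(depth_61, slice(40, 120), slice(90, 160))   # occupant image P5A
p5b = crop(depth_62, slice(40, 120), slice(0, 70))     # occupant image P5B

# Connect the two crops along the width direction to obtain the occupant P5
# over the whole partial sensing range 103 (row counts must match for concatenation).
p5 = np.concatenate([p5a, p5b], axis=1)
print(p5.shape)   # (80, 140)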
According to the fourth embodiment, when the seat 53B is in a reclining state or the like, even if one occupant extends over the observation ranges of the plural occupant observation sensors 61 and 62, a wide range of observation information about that occupant can be acquired by integrating the observation information obtained from each of the occupant observation sensors 61 and 62. Therefore, the occupant state detection unit 45 appropriately detects the state of the occupant.
Note that, in the seatbelt-fastened state in which the seatbelt is worn in FIG. 18, the sensing range can be made even smaller, as in the third embodiment.
FIG. 19 is a flowchart showing an example of processing performed by the sensing setting unit 44 in the fourth embodiment of FIGS. 17 and 18.
In step S151, the sensing setting unit 44 detects the state of each seat based on the seat information from the seat-related sensors 42 (the seating sensor 95, the reclining sensor 96, and the seatbelt sensor 97). The process proceeds from step S151 to step S152.
In step S152, the sensing setting unit 44 sets a variable n representing the seat number to 1. The seat number of the seat 53A is 1, and the seat number of the seat 53B is 2. The process proceeds from step S152 to step S153.
In step S153, the sensing setting unit 44 detects whether or not the seat n is in the seated state. The seat n represents the seat 53A when n is 1 and the seat 53B when n is 2.
If it is determined in step S153 that the seat n is in the seated state, the process proceeds from step S153 to step S154, and the sensing setting unit 44 determines whether or not the seat n is in the reclining state.
If it is determined in step S154 that the seat n is not in the reclining state, the process proceeds from step S154 to step S155, and the sensing setting unit 44 enables the partial sensing range corresponding to the seated and non-reclining state of the seat n. The process proceeds from step S155 to step S158.
If it is determined in step S154 that the seat n is in the reclining state, the process proceeds from step S154 to step S156, and the sensing setting unit 44 enables the partial sensing range corresponding to the seated and reclining state of the seat n. The process proceeds from step S156 to step S158.
If it is determined in step S153 that the seat n is not in the seated state, the process proceeds from step S153 to step S157, and the sensing setting unit 44 invalidates the partial sensing ranges corresponding to all states of the seat n. The process proceeds from step S157 to step S158.
In step S158, the sensing setting unit 44 determines whether or not the variable n is 2.
If it is determined in step S158 that the variable n is not 2, the process proceeds to step S159, and the sensing setting unit 44 increments the value of the variable n. The process returns from step S159 to step S153.
If it is determined in step S158 that the variable n is 2, the process proceeds from step S158 to step S160.
In step S160, the sensing setting unit 44 sets the variable n to 1. The process proceeds from step S160 to step S161.
In step S161, the sensing setting unit 44 determines whether or not the partial sensing range enabled for the seat n is the partial sensing range corresponding to the reclining state.
If it is determined in step S161 that the partial sensing range enabled for the seat n is not the partial sensing range corresponding to the reclining state, the process proceeds from step S161 to step S162.
In step S162, the sensing setting unit 44 sets the enabled partial sensing range as the sensing range of the occupant observation sensor 61 that observes the occupant of the seat n (the occupant observation sensor 61 of the seat n). The process proceeds from step S162 to step S165.
If it is determined in step S161 that the partial sensing range enabled for the seat n is the partial sensing range corresponding to the reclining state, the process proceeds from step S161 to step S163.
In step S163, the sensing setting unit 44 sets, within the enabled partial sensing range, the portion for the occupant observation sensor 61 of the seat n (the occupant observation sensor 61 that observes the occupant of the seat n) as the sensing range of the occupant observation sensor 61 of the seat n. The process proceeds from step S163 to step S164.
In step S164, the sensing setting unit 44 sets, within the enabled partial sensing range, the portion for the occupant observation sensor 62 of the seat behind the seat n (the seat 54A when n = 1, and the seat 54B when n = 2) as the sensing range of the occupant observation sensor 62 of the seat behind the seat n. The process proceeds from step S164 to step S165.
In step S165, the sensing setting unit 44 determines whether or not the variable n is 2.
If it is determined in step S165 that the variable n is not 2, the process proceeds from step S165 to step S166, the sensing setting unit 44 increments the value of the variable n, and the process returns from step S166 to step S161.
If it is determined in step S165 that the variable n is 2, the processing of this flowchart ends.
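The assignment logic of FIG. 19 can be outlined as follows. The snippet is only an illustrative reading under assumed names; in particular, the way a reclining-state range is split between the sensor of the seat n and the sensor of the seat behind it (steps S163 and S164) is expressed here as a simple dictionary of per-sensor sub-ranges.

def assign_sensing_ranges(seat_states, partial_ranges):
    """Illustrative reading of FIG. 19 for the two front seats (n = 1, 2).

    seat_states[n]    : dict with keys "seated" and "reclined"
    partial_ranges[n] : for a non-reclined seat, a single range object;
                        for a reclined seat, a dict {"front_sensor": ..., "rear_sensor": ...}
    Returns a mapping from sensor name to the sensing range set for it.
    """
    assignments = {}
    for n in (1, 2):
        state = seat_states[n]
        if not state["seated"]:
            continue                         # step S157: all ranges for this seat are invalid
        enabled = partial_ranges[n]
        if not state["reclined"]:
            # Step S162: the whole range goes to the sensor observing seat n.
            assignments[f"sensor_61_seat{n}"] = enabled
        else:
            # Steps S163/S164: split between the sensor of seat n and the sensor of the seat behind it.
            assignments[f"sensor_61_seat{n}"] = enabled["front_sensor"]
            assignments[f"sensor_62_behind_seat{n}"] = enabled["rear_sensor"]
    return assignments

# Example: seat 2 (53B) is seated and reclined, seat 1 (53A) is empty.
states = {1: {"seated": False, "reclined": False}, 2: {"seated": True, "reclined": True}}
ranges = {1: None, 2: {"front_sensor": "103A", "rear_sensor": "103B"}}
print(assign_sensing_ranges(states, ranges))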
As described above, according to the fourth embodiment, when the seat 53B is in a reclining state or the like, even if one occupant extends over the observation ranges of the plural occupant observation sensors 61 and 62, a wide range of observation information about that occupant can be acquired by integrating the observation information obtained from each of the occupant observation sensors 61 and 62. Therefore, the occupant state detection unit 45 appropriately detects the state of the occupant.
<Example 5>
FIG. 20 is a diagram for explaining the arrangement of the occupant observation sensors in the fifth embodiment of the sensing setting process.
In FIG. 20, three rows of seats, namely a front row seat portion 153, a middle row seat portion 154, and a rear row seat portion 155, are provided in the interior 152 of an automobile 151. A seat 153A, which is the driver's seat, and a seat 153B, which is the passenger seat, are arranged in the front row seat portion 153. Seats 154A and 154B are arranged in the middle row seat portion 154. Seats 155A and 155B are arranged in the rear row seat portion 155.
Each of the seats 153A to 155A and 153B to 155B is provided with one of the seating sensors 191A to 193A and 191B to 193B, which detect the presence or absence of seating. The seating sensors 191A to 193A and 191B to 193B are one form of the seat-related sensor 42 of FIG. 2.
Further, the seats 153A to 155A and 153B to 155B are provided with seat position sensors 194A to 196A and 194B to 196B, respectively, which detect the position of each seat with respect to sliding movement in the front-rear or left-right direction and with respect to its orientation. The seat position sensors 194A to 196A and 194B to 196B are one form of the seat-related sensor 42 of FIG. 2.
The occupant observation sensors 161 to 163, which are forms of the occupant observation sensor 43 of FIG. 2, are arranged on the ceiling inside the vehicle. The occupant observation sensors 161 to 163 are arranged near the center of the ceiling in the width direction, at positions slightly forward of the points directly above the front row seat portion 153, the middle row seat portion 154, and the rear row seat portion 155, respectively.
The occupant observation sensor 161 observes the occupants of the two seats 153A and 153B of the front row seat portion 153. The occupant observation sensor 162 observes the occupants of the two seats 154A and 154B of the middle row seat portion 154. The occupant observation sensor 163 observes the occupants of the seats 155A and 155B of the rear row seat portion 155.
FIG. 21 is a diagram showing the sensing ranges of the occupant observation sensors 161 to 163 of FIG. 20.
In FIG. 21, the sensing ranges 181, 182, and 183 indicate the maximum sensing ranges of the occupant observation sensors 161, 162, and 163, respectively.
The partial sensing ranges 181-1, 182-1, and 183-1 within the maximum sensing ranges 181, 182, and 183 are partial sensing ranges that are enabled when the seated state is detected by the seating sensors 191A to 193A of the right-side seats 153A to 155A.
Similarly, the partial sensing ranges 181-2, 182-2, and 183-2 within the maximum sensing ranges 181, 182, and 183 are partial sensing ranges that are enabled when the seated state is detected by the seating sensors 191B to 193B of the left-side seats 153B to 155B.
The sensing setting unit 44 detects the presence or absence of seating (the seated state or the non-seated state) of each of the seats 153A to 155A and 153B to 155B by means of the seating sensors 191A to 193A and 191B to 193B of the seats 153A to 155A and 153B to 155B.
The sensing setting unit 44 also detects the positions of the seats 153A to 155A and 153B to 155B by means of the seat position sensors 194A to 196A and 194B to 196B of the seats 153A to 155A and 153B to 155B.
Based on the detected states (the presence or absence of seating and the seat positions), the sensing setting unit 44 sets the partial sensing ranges to be enabled so that the occupants present in the vehicle can be observed and so that ranges in which no occupant is present are excluded from the sensing range as much as possible. Then, the sensing setting unit 44 sets the enabled partial sensing ranges as the sensing ranges of the occupant observation sensors 161, 162, and 163.
Note that, in the interior 152 of the automobile 151, the seats 153A to 155A and 153B to 155B may be configured to be movable. In this case, the partial sensing range associated with the position of a seated seat is moved according to the position of that seat.
That is, as shown in FIG. 21, when the seats 153A to 155A and 153B to 155B are arranged at their standard positions, each occupant can be observed within the observation range of one of the occupant observation sensors 161, 162, and 163.
On the other hand, when the seats 153A to 155A and 153B to 155B are largely displaced from their standard positions, an occupant may not be observable within the observation range of a single occupant observation sensor. In such a case, it may be necessary to acquire the observation information of one occupant using a plurality of occupant observation sensors.
FIG. 22 is a diagram showing the sensing ranges in a case where the seats of FIG. 21 are largely displaced.
In FIG. 22, the seat 153B and the seat 154B are largely displaced rearward from their standard positions. The seat 153B is outside the maximum sensing range 181 of the occupant observation sensor 161, and the seat 154B is partially outside the maximum sensing range 182 of the occupant observation sensor 162.
In this case, since the occupant observation sensor 161 cannot observe the occupant of the seat 153B, the occupant observation sensor 162 observes the occupant of the seat 153B.
Further, since the occupant observation sensor 162 can observe the occupant of the seat 154B only partially, the occupant observation sensor 162 and the occupant observation sensor 163 each acquire observation information of the occupant of the seat 154B, and the acquired observation information is integrated.
The sensing setting unit 44 detects the presence or absence of seating (the seated state or the non-seated state) of each of the seats 153A to 155A and 153B to 155B by means of the seating sensors 191A to 193A and 191B to 193B of the seats 153A to 155A and 153B to 155B.
The sensing setting unit 44 also detects the positions of the seats 153A to 155A and 153B to 155B by means of the seat position sensors 194A to 196A and 194B to 196B of the seats 153A to 155A and 153B to 155B.
Based on the detected states (the presence or absence of seating and the seat positions), the sensing setting unit 44 sets the partial sensing ranges to be enabled so that the occupants present in the vehicle can be observed and so that ranges in which no occupant is present are excluded from the sensing range as much as possible.
When setting the partial sensing range to be enabled for the seat 153B, the sensing setting unit 44 may generate a partial sensing range corresponding to the position of the seat 153B based on that position, or may read the corresponding partial sensing range from data stored in advance in the storage unit 36 in association with each position to which the seat 153B can be displaced. In this case, a partial sensing range is also set for the seat 154B in the same manner as for the seat 153B. In FIG. 22, the partial sensing range 186 is the partial sensing range for the seat 154B.
Then, the sensing setting unit 44 sets each of the enabled partial sensing ranges as the sensing range of the corresponding occupant observation sensor 161, 162, or 163. At this time, the partial sensing range 186 for the seat 154B is included in both the observation range of the occupant observation sensor 162 and the observation range of the occupant observation sensor 163. The sensing setting unit 44 therefore sets the range 186A of the partial sensing range 186 included in the observation range of the occupant observation sensor 162 as the sensing range of the occupant observation sensor 162, and sets the range 186B of the partial sensing range 186 included in the observation range of the occupant observation sensor 163 as the sensing range of the occupant observation sensor 163.
When the occupant state detection unit 45 acquires observation information from the occupant observation sensors 161, 162, and 163, it obtains the observation information of the partial sensing range 186 from both the observation information from the occupant observation sensor 162 and the observation information from the occupant observation sensor 163, and integrates them.
Further, the sensing setting unit 44 sets the detection density and the detection frequency of each of the occupant observation sensors 161, 162, and 163 based on the area and the like of the sensing range of each sensor. In FIG. 22, the sensing range of the occupant observation sensor 161 can be limited to the range of the occupant of the seat 153A. Therefore, the detection density of the occupant observation sensor 161 can be made higher than those of the other occupant observation sensors 162 and 163, and the state of that single occupant can be detected with high accuracy.
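The per-sensor assignment and density setting described for FIG. 22 can be sketched as follows. The geometry helpers, the sensor coverage rectangles, and the rule "density multiplier = coverage area divided by the assigned area" are assumptions made for illustration; the embodiment only states that the density is set according to the sensing-range area within the processing capacity of the device.

from dataclasses import dataclass

@dataclass(frozen=True)
class Rect:
    x0: float
    y0: float
    x1: float
    y1: float

    def area(self) -> float:
        return max(0.0, self.x1 - self.x0) * max(0.0, self.y1 - self.y0)

    def intersect(self, other: "Rect") -> "Rect":
        return Rect(max(self.x0, other.x0), max(self.y0, other.y0),
                    min(self.x1, other.x1), min(self.y1, other.y1))

def assign_and_scale(seat_ranges, sensor_coverage):
    """Assign each seat's partial range to every sensor whose coverage overlaps it, then
    derive a per-sensor density multiplier from the fraction of coverage actually used.
    (In practice the multiplier would also be capped by the processing capacity of the device.)"""
    assigned = {name: [] for name in sensor_coverage}
    for seat, rng in seat_ranges.items():
        for name, cov in sensor_coverage.items():
            part = rng.intersect(cov)
            if part.area() > 0:
                assigned[name].append((seat, part))
    density = {}
    for name, parts in assigned.items():
        used = sum(p.area() for _, p in parts)
        density[name] = sensor_coverage[name].area() / used if used else None
    return assigned, density

# Invented coverage and seat ranges loosely following FIG. 22
# (the range 186 of the seat 154B spans the coverage of the sensors 162 and 163).
coverage = {"162": Rect(0, 0, 10, 10), "163": Rect(8, 0, 18, 10)}
seat_ranges = {"154B": Rect(6, 2, 12, 8)}
assignments, densities = assign_and_scale(seat_ranges, coverage)
print(assignments)
print(densities)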
According to the fifth embodiment, even when one occupant extends over the observation ranges of a plurality of occupant observation sensors due to the movement of a seat, a wide range of observation information about that occupant can be acquired by integrating the observation information obtained from each of those occupant observation sensors. Therefore, the occupant state detection unit 45 appropriately detects the state of the occupant.
<Example 6>
FIG. 23 is a diagram for explaining the arrangement of the occupant observation sensors in the sixth embodiment of the sensing setting process.
Note that, in FIG. 23, parts corresponding to those in FIG. 21 are designated by the same reference numerals, and their description is omitted.
The automobile 151 of FIG. 23 has the front row seat portion 153, the middle row seat portion 154, the rear row seat portion 155, the seats 153A to 155A and 153B to 155B, the occupant observation sensors 161 to 163, 211, and 212, the seating sensors 191A to 193A and 191B to 193B, and the seat position sensors 194A to 196A and 194B to 196B. The automobile 151 of FIG. 23 is therefore the same as in FIG. 21 in that it has the front row seat portion 153, the middle row seat portion 154, the rear row seat portion 155, the seats 153A to 155A and 153B to 155B, the occupant observation sensors 161 to 163, the seating sensors 191A to 193A and 191B to 193B, and the seat position sensors 194A to 196A and 194B to 196B, but differs from FIG. 21 in that the occupant observation sensors 211 and 212 are newly provided.
The occupant observation sensors 211 and 212 in FIG. 23 are instances of the occupant observation sensor 43 of FIG. 2 and are arranged near the widthwise center of the vehicle ceiling. The occupant observation sensor 211 is arranged between the occupant observation sensor 161 and the occupant observation sensor 162, above the space between the front row seat portion 153 and the middle row seat portion 154.
The occupant observation sensor 212 is arranged between the occupant observation sensor 162 and the occupant observation sensor 163, above the space between the middle row seat portion 154 and the rear row seat portion 155.
FIG. 24 is a diagram showing the sensing ranges of the occupant observation sensors 161 to 163, 211, and 212 of FIG. 23.
In FIG. 24, the sensing ranges 181 to 183 represent the sensing ranges (maximum sensing ranges) of the occupant observation sensors 161 to 163, respectively. The sensing ranges 231 and 232 represent the maximum sensing ranges of the occupant observation sensors 211 and 212, respectively.
When the seats 153A to 155A and 153B to 155B are in the standard positions shown in FIG. 24, an occupant seated in any of the seats 153A to 155A and 153B to 155B is observed within the sensing range of one of the occupant observation sensors 161 to 163. In this case, the occupant observation sensors 211 and 212 need not be used.
On the other hand, when a seat among the seats 153A to 155A and 153B to 155B is displaced from its standard position and placed between the front row seat portion 153 and the middle row seat portion 154, or between the middle row seat portion 154 and the rear row seat portion 155, the occupant of that seat is observed by the occupant observation sensor 211 or 212.
FIG. 25 is a diagram showing the sensing ranges when seats are largely displaced from the standard positions of FIG. 24.
In FIG. 25, the seat 153B and the seat 154B are displaced largely rearward from their standard positions. Specifically, the seat 153B is located between the standard position of the front row seat portion 153 and the standard position of the middle row seat portion 154, and the seat 154B is located between the standard position of the middle row seat portion 154 and the standard position of the rear row seat portion 155.
In this case, the sensing setting unit 44 changes the occupant observation sensor that observes the occupant of each of the seats 153A to 155A and 153B to 155B by the following processing.
The sensing setting unit 44 detects the positions of the seats 153A to 155A and 153B to 155B from the seat information supplied by the seat position sensors 194A to 196A, 194B, and 196B of the seats 153A to 155A and 153B to 155B.
Based on the detected positions of the seats 153A to 155A and 153B to 155B, the sensing setting unit 44 detects whether any seat is placed between the standard position of the front row seat portion 153 and the standard position of the middle row seat portion 154 (between rows), or between the standard position of the middle row seat portion 154 and the standard position of the rear row seat portion 155 (between rows).
As a result, the sensing setting unit 44 has seats placed between rows observed by the occupant observation sensor 211 or 212, and the remaining seats observed by the occupant observation sensor 161, 162, or 163. The sensing setting unit 44 then sets the sensing ranges of the occupant observation sensors 161 to 163, 211, and 212 based on whether each of the seats 153A to 155A and 153B to 155B is occupied and on its position.
Through the above processing of the sensing setting unit 44, in FIG. 25 the seats 153A to 155A are observed in the sensing ranges 181-1, 182-1, and 183-1 of the occupant observation sensors 161 to 163, respectively, and the seats 153B and 154B are observed in the sensing ranges 231-1 and 232-1 of the occupant observation sensors 211 and 212, respectively. For a seat in the non-seated state (for example, the seat 155B), no sensing range is set and no observation by an occupant observation sensor is performed.
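A minimal sketch of this assignment rule, assuming a one-dimensional seat position along the cabin's longitudinal axis (the row coordinates, tolerance, and sensor labels below are illustrative only and do not come from the disclosure):

```python
# Hypothetical longitudinal coordinates (metres) of the standard row positions;
# real values would come from the vehicle geometry, not from this sketch.
ROW_STANDARD_X = {"front": 1.0, "middle": 2.5, "rear": 4.0}
IN_ROW_TOLERANCE_M = 0.3

def assign_sensor(seat_x, occupied):
    """Pick the occupant observation sensor for one seat from its position.

    Returns None for an unoccupied seat (no sensing range is set), one of the
    row sensors 161/162/163 when the seat sits near a standard row position,
    and one of the between-row ceiling sensors 211/212 otherwise.
    """
    if not occupied:
        return None
    front, middle, rear = (ROW_STANDARD_X[k] for k in ("front", "middle", "rear"))
    if abs(seat_x - front) <= IN_ROW_TOLERANCE_M:
        return "sensor_161"
    if abs(seat_x - middle) <= IN_ROW_TOLERANCE_M:
        return "sensor_162"
    if abs(seat_x - rear) <= IN_ROW_TOLERANCE_M:
        return "sensor_163"
    return "sensor_211" if seat_x < middle else "sensor_212"
```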
According to the sixth embodiment, the number of occupant observation sensors increases compared with the fifth embodiment, but the occupant state detection unit 45 can detect the occupant state and the operation recognition unit 46 can recognize operations with high accuracy without a large increase in the amount of processing.
<Example 7>
FIGS. 26 and 27 are diagrams illustrating the arrangement of the occupant observation sensors in the seventh embodiment of the sensing setting process.
In FIGS. 26 and 27, parts corresponding to those in FIG. 21 are designated by the same reference numerals and their description is omitted.
In FIG. 26, the arrangement of the seats 153A to 155A and 153B to 155B in the vehicle interior 152 of the automobile 151 and the arrangement of the occupant observation sensors 161 to 163 in the seventh embodiment are the same as in FIG. 21.
In the seventh embodiment, the occupant observation sensors 251, 252, and 253 of FIG. 27 are added to the configuration of the fifth embodiment.
The occupant observation sensors 251, 252, and 253 are arranged in front of the front row seat portion 153, the middle row seat portion 154, and the rear row seat portion 155, respectively, and observe the occupants seated in the front row seat portion 153, the middle row seat portion 154, and the rear row seat portion 155 from the front, in a direction close to the lateral (horizontal) direction.
The occupant observation sensors 251, 252, and 253 also have higher performance than the occupant observation sensors 161, 162, and 163, and their detection density and detection frequency can be set higher than those of the occupant observation sensors 161, 162, and 163.
The occupant observation sensor 251 is installed in front of the front row seat portion 153 near its widthwise center, for example on the dashboard or around the windshield. The maximum sensing range 271 of the occupant observation sensor 251 covers the occupants seated in the seats 153A and 153B of the front row seat portion 153, and those occupants are observed from the front.
Further, the sensing setting unit 44 detects, from the seating sensors 191A and 191B arranged in the seats 153A and 153B, whether each of the seats 153A and 153B is in the seated state or the non-seated state. As a result, when only the seat 153A is occupied, the sensing range of the occupant observation sensor 251 is limited to the sensing range 271-1, and when only the seat 153B is occupied, it is limited to the sensing range 271-2.
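A compact way to express this mapping from the two front-row seating sensors to the sensing range of the occupant observation sensor 251, using the range labels of FIG. 27 (the handling of the "both occupied" and "none occupied" cases is an assumption, since the text only describes the single-occupant cases):

```python
def front_sensor_range(seat_153a_occupied, seat_153b_occupied):
    """Map the two front-row seating sensors to the sensing range of sensor 251."""
    if seat_153a_occupied and seat_153b_occupied:
        return "271"    # keep the maximum sensing range
    if seat_153a_occupied:
        return "271-1"  # limit to the seat 153A side
    if seat_153b_occupied:
        return "271-2"  # limit to the seat 153B side
    return None         # no occupant: no sensing range needed
```

The same pattern would apply to the sensors 252 and 253 with the ranges 272-1/272-2 and 273-1/273-2 described below.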
The occupant observation sensor 252 is installed, for example, on the back of the front row seat portion 153 near its widthwise center. The maximum sensing range 272 of the occupant observation sensor 252 covers the occupants seated in the seats 154A and 154B of the middle row seat portion 154, and those occupants are observed from the front.
Similarly, the sensing setting unit 44 detects, from the seating sensors 192A and 192B arranged in the seats 154A and 154B, whether each of the seats 154A and 154B is in the seated state or the non-seated state. When only the seat 154A is occupied, the sensing range of the occupant observation sensor 252 is limited to the sensing range 272-1, and when only the seat 154B is occupied, it is limited to the sensing range 272-2.
The occupant observation sensor 253 is installed, for example, on the back of the middle row seat portion 154 near its widthwise center. The maximum sensing range 273 of the occupant observation sensor 253 covers the occupants seated in the seats 155A and 155B of the rear row seat portion 155, and those occupants are observed from the front.
Similarly, the sensing setting unit 44 detects, from the seating sensors 193A and 193B arranged in the seats 155A and 155B, whether each of the seats 155A and 155B is in the seated state or the non-seated state. When only the seat 155A is occupied, the sensing range of the occupant observation sensor 253 is limited to the sensing range 273-1, and when only the seat 155B is occupied, it is limited to the sensing range 273-2.
In the seat arrangement of FIGS. 26 and 27, the occupant observation sensors 161 to 163 and the occupant observation sensors 251 to 253 redundantly observe the occupant of each seat, so the sensing setting unit 44 may have the occupants observed by the occupant observation sensors 251 to 253, which have higher performance than the occupant observation sensors 161 to 163. Note that the occupant observation sensors 161 to 163 may instead be the higher-performance ones, and the performances of the occupant observation sensors 161 to 163 and of the occupant observation sensors 251 to 253 need not each be the same. When a plurality of occupant observation sensors can be used to observe one seat, the sensing setting unit 44 may use the higher-performance sensor for observing the occupant, or may preferentially use a predetermined sensor.
On the other hand, seat movement may make it impossible to observe an occupant.
FIG. 28 is a diagram illustrating a case where the seat arrangement of FIGS. 26 and 27 is changed.
In FIG. 28, the orientation of the seats 154A and 154B of the middle row seat portion 154 has been rotated by 180 degrees.
In this case, the occupant observation sensor 252 of FIG. 27 is blocked by the backrests of the seats 154A and 154B and can no longer be used for occupant observation.
Further, because the occupant observation sensor 253 of FIG. 27 is installed on the back of the middle row seat portion 154 (the back of the seats 154A and 154B), the change in the orientation of the seats 154A and 154B leaves it facing the back of the front row seat portion 153. The occupant observation sensor 253 therefore also cannot be used for occupant observation.
In such a case, the occupant observation sensor 251 is used to observe the occupants of the front row seat portion 153, and the occupant observation sensors 162 and 163 are used to observe the occupants of the middle row seat portion 154 and the rear row seat portion 155.
The sensing setting unit 44 detects the seat arrangement from the sensor signals of the seat position sensors 194A to 196A and 194B to 196B installed in the seats 153A to 155A and 153B to 155B. Based on the seat arrangement, the sensing setting unit 44 determines which occupant observation sensors can be used effectively and, from among them, selects the occupant observation sensor to be used for observing the occupant of each seat. When a plurality of occupant observation sensors can be used for one seat, the sensing setting unit 44 may determine the sensor to be used by giving priority to performance as described above, or may use a predetermined occupant observation sensor.
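One possible shape for this per-seat selection, assuming that the candidate list, a usability flag derived from the seat arrangement (for example, a sensor blocked or turned away by a reversed seat back), and a performance score are all computed elsewhere (everything here is a sketch, not the disclosed implementation):

```python
def select_sensor_for_seat(candidate_sensors, usable, performance):
    """Pick the observation sensor for one seat.

    candidate_sensors: sensors whose ranges can cover the seat in its
        current position and orientation.
    usable: sensor -> bool, False when the current seat arrangement blocks
        the sensor or points it away from the occupant.
    performance: sensor -> comparable score (higher is better).
    """
    usable_candidates = [s for s in candidate_sensors if usable.get(s, False)]
    if not usable_candidates:
        return None  # the seat cannot be observed with the current layout
    # Prefer the highest-performance sensor; a fixed, predetermined priority
    # order could be used here instead, as the text allows.
    return max(usable_candidates, key=lambda s: performance.get(s, 0))
```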
According to the seventh embodiment, the occupant state detection unit 45 can detect the occupant state and the operation recognition unit 46 can recognize operations with high accuracy without restricting the arrangement of the seats.
<Example 8>
FIG. 29 is a diagram illustrating the arrangement of the occupant observation sensors in the eighth embodiment of the sensing setting.
In FIG. 29, a front row seat portion 303, a middle row seat portion 304, and a rear row seat portion 305 are arranged in the vehicle interior 302 of an automobile 301.
Windows 311 to 313 are provided on the side of the automobile 301, and touch panels 331 to 333 are installed on the windows 311 to 313. Each of the windows 311 to 313 may be provided with a transparent image display, or image information and text information may be projected onto it by a projector.
The touch panels 331 to 333 are one form of the occupant observation sensor 43 of FIG. 2 that observes an occupant performing a touch operation.
FIG. 30 is a diagram illustrating a case where the seat arrangement of FIG. 29 is changed.
In FIG. 30, the seat arrangement has been changed to a layout in which a meeting can be held, and the seat 304A has been moved to a position and orientation in which the window 312 is behind it.
In this case, the touch panel 332 of the window 312 is unlikely to be used, so sensing by the touch panel 332 is automatically stopped (disabled).
The sensing setting unit 44 detects the position of each seat from the sensor signal of the seat position sensor provided in that seat. When a seat is positioned and oriented so that its backrest (seat back) is close to any of the touch panels 331 to 333, sensing by the touch panel that the backrest approaches is stopped.
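A minimal sketch of this check, assuming each seat's backrest and each window can be approximated by floor-plan coordinates (the data layout, proximity threshold, and flag name are assumptions for illustration):

```python
import math

def update_touch_panels(seats, touch_panels, proximity_m=0.25):
    """Disable a window touch panel while any seat backrest is close to its window.

    `seats` are dicts such as {"id": "304A", "backrest_pos": (x, y)} and
    `touch_panels` dicts such as {"id": 332, "window_pos": (x, y)}.
    """
    for panel in touch_panels:
        blocked = any(
            math.hypot(seat["backrest_pos"][0] - panel["window_pos"][0],
                       seat["backrest_pos"][1] - panel["window_pos"][1]) < proximity_m
            for seat in seats
        )
        panel["sensing_enabled"] = not blocked  # stopped (disabled) when a backrest is adjacent
```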
According to the eighth embodiment, disabling a touch panel that is not used reduces the processing load and prevents erroneous operations.
<Example 9>
FIG. 31 is a diagram illustrating the arrangement of the occupant observation sensor in the ninth embodiment of the sensing setting.
In FIG. 31, a steering wheel 353 is provided at the driver's seat 352 of an automobile 351, and a steering wheel sensor 361 is provided on the steering wheel 353.
A touch panel 371 is provided on the window 362 at the side of the driver's seat 352.
The occupant P6, who is the driver, is gripping the steering wheel 353.
The window 362 may be provided with a transparent image display, or image information and text information may be projected onto it by a projector.
The touch panel 371 is one form of the occupant observation sensor 43 of FIG. 2 that observes an occupant performing a touch operation. The steering wheel sensor 361 is a sensor that detects whether the occupant P6 seated in the driver's seat is gripping the steering wheel 353, and is one form of the seat-related sensor 42 of FIG. 2 that detects the state of a seat.
In this case, while the occupant P6 in the driver's seat is gripping the steering wheel 353, the vehicle is presumed to be being driven, so sensing by the touch panel 371 of the window 362 is stopped. When this example is applied in combination with another embodiment, such as the first embodiment of FIG. 21, operations other than touch operations on the touch panel 371, such as gestures, may remain enabled.
The sensing setting unit 44 detects whether the steering wheel 353 is gripped from the sensor signal of the steering wheel sensor 361. When the steering wheel 353 is gripped, sensing by the touch panel 371 is disabled.
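Expressed as a sketch (flag and parameter names are assumptions; gesture handling follows the note above about combining embodiments):

```python
def update_driver_touch_panel(steering_wheel_gripped, panel):
    """Disable touch sensing on the driver's window panel while the wheel is gripped.

    `panel` is a dict such as {"id": 371, "touch_sensing_enabled": True}.
    Gesture-based operations handled by other occupant observation sensors
    can stay enabled independently of this flag.
    """
    panel["touch_sensing_enabled"] = not steering_wheel_gripped
    return panel
```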
According to the ninth embodiment, the driver can be prevented from being distracted by performing touch operations on the touch panel while driving.
<Application example>
Besides automobiles, the present technology can be applied to any facility that is provided with predetermined equipment such as seats and in which the presence and position of a user can be estimated from the user's use of that equipment.
For example, the technology can be applied to predetermined indoor spaces such as the interiors of railway vehicles and buses provided with seats, conference rooms, classrooms, restaurants, theaters, stadiums, and facilities that provide VR (Virtual Reality) or AR (Augmented Reality) spaces. In such cases, the equipment used by the user is not limited to seats and may be, for example, a floor surface or stairs on which the user can stand.
The present technology can also have the following configurations.
(1) An information processing device having a processing unit that sets a sensing range of an observation sensor that observes a user who uses the equipment based on the state of the equipment installed in the vehicle.
(2) The information processing apparatus according to (1), wherein the state of the equipment includes a state of whether or not the equipment is used.
(3) The information processing device according to (1) or (2), wherein the equipment includes a seat.
(4) The information processing apparatus according to any one of (1) to (3), wherein the state of the equipment includes a changeable form of the equipment.
(5) The information processing apparatus according to any one of (1) to (4), wherein the state of the equipment includes the state of ancillary equipment related to the equipment.
(6) The information processing apparatus according to any one of (1) to (5), wherein the state of the equipment includes the position of the equipment.
(7) The information processing device according to any one of (1) to (6), wherein the observation sensor includes a sensor that acquires spatial information of the user.
(8) The information processing device according to any one of (1) to (7), wherein the observation sensor includes a sensor that acquires physical information of the user.
(9) The information processing device according to any one of (1) to (8), wherein the observation sensor includes an image sensor or a depth sensor.
(10) The information processing apparatus according to any one of (1) to (9), wherein the processing unit sets a sensing range previously associated with the state of the equipment as the sensing range of the observation sensor.
(11) The information processing apparatus according to any one of (1) to (10), wherein the processing unit sets the detection density or detection frequency of the observation sensor according to the sensing range.
(12) The information processing apparatus according to (11), wherein the processing unit sets the detection density or the detection frequency based on the amount of information acquired from the sensing range of the observation sensor.
(13) The information processing device according to (1), wherein the processing unit detects the state of the user based on the observation information from the observation sensor.
(14) The information processing apparatus according to (13), wherein the processing unit recognizes an operation corresponding to the state of the user.
(15) An information processing method in which the processing unit of an information processing device having a processing unit sets the sensing range of an observation sensor that observes a user who uses equipment, based on the state of the equipment.
(16) A program for causing a computer to function as a processing unit that sets the sensing range of an observation sensor that observes a user who uses equipment, based on the state of the equipment.
11 Information processing system, 41 Processing unit, 42 Seat-related sensor, 43 Occupant observation sensor, 44 Sensing setting unit, 45 Occupant state detection unit, 46 Operation recognition unit, 47 Operation response unit
Claims (16)
- An information processing device having a processing unit that sets a sensing range of an observation sensor that observes a user who uses equipment, based on the state of the equipment provided in a vehicle.
- The information processing device according to claim 1, wherein the state of the equipment includes a state of whether or not the equipment is used.
- The information processing device according to claim 1, wherein the equipment includes a seat.
- The information processing device according to claim 1, wherein the state of the equipment includes a changeable form of the equipment.
- The information processing device according to claim 1, wherein the state of the equipment includes a state of ancillary equipment related to the equipment.
- The information processing device according to claim 1, wherein the state of the equipment includes the position of the equipment.
- The information processing device according to claim 1, wherein the observation sensor includes a sensor that acquires spatial information of the user.
- The information processing device according to claim 1, wherein the observation sensor includes a sensor that acquires physical information of the user.
- The information processing device according to claim 1, wherein the observation sensor includes an image sensor or a depth sensor.
- The information processing device according to claim 1, wherein the processing unit sets a sensing range associated in advance with the state of the equipment as the sensing range of the observation sensor.
- The information processing device according to claim 1, wherein the processing unit sets a detection density or a detection frequency of the observation sensor according to the sensing range.
- The information processing device according to claim 11, wherein the processing unit sets the detection density or the detection frequency based on the amount of information acquired from the sensing range of the observation sensor.
- The information processing device according to claim 1, wherein the processing unit detects a state of the user based on observation information from the observation sensor.
- The information processing device according to claim 13, wherein the processing unit recognizes an operation corresponding to the state of the user.
- An information processing method in which a processing unit of an information processing device having the processing unit sets a sensing range of an observation sensor that observes a user who uses equipment, based on the state of the equipment provided in a vehicle.
- A program for causing a computer to function as a processing unit that sets a sensing range of an observation sensor that observes a user who uses equipment, based on the state of the equipment provided in a vehicle.
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
JP2020-034199 | 2020-02-28 | | |
JP2020034199 | 2020-02-28 | | |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2021172038A1 (en) | 2021-09-02 |
Family
ID=77491486
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/JP2021/005166 WO2021172038A1 (en) | 2020-02-28 | 2021-02-12 | Information processing device, information processing method, and program |
Country Status (1)
Country | Link |
---|---|
WO (1) | WO2021172038A1 (en) |
Citations (6)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2001331790A (en) * | 2000-05-19 | 2001-11-30 | Tokai Rika Co Ltd | Crew detecting system |
JP2005202786A (en) * | 2004-01-16 | 2005-07-28 | Denso Corp | Face authentication system for vehicle |
JP2009079996A (en) * | 2007-09-26 | 2009-04-16 | Denso Corp | Route search device |
JP2017220094A (en) * | 2016-06-09 | 2017-12-14 | 株式会社デンソー | Vehicle-mounted device |
JP2018185157A (en) * | 2017-04-24 | 2018-11-22 | パナソニックIpマネジメント株式会社 | Thermal image acquisition device and warm/cold feeling estimation device |
JP2019006331A (en) * | 2017-06-28 | 2019-01-17 | 矢崎総業株式会社 | Vehicle roof mounting system |
- 2021-02-12 WO PCT/JP2021/005166 patent/WO2021172038A1/en active Application Filing
Similar Documents
Publication | Publication Date | Title |
---|---|---|
US11945343B2 (en) | Vehicle seat | |
US11124118B2 (en) | Vehicular display system with user input display | |
US9942522B2 (en) | In-vehicle camera system | |
DE102013010932B4 (en) | Method for operating a user interface, user interface and motor vehicle with a user interface | |
EP3608174B1 (en) | Vehicle display system for vehicle | |
JP2014167438A (en) | Information notification device | |
ES2902698T3 (en) | Determining the position of a non-vehicle object in a vehicle | |
WO2019124158A1 (en) | Information processing device, information processing method, program, display system, and moving body | |
JP7259324B2 (en) | indoor monitoring device | |
US20170274825A1 (en) | Vehicle communication system | |
CN112455462B (en) | Passenger cabin system | |
CN110733433A (en) | Vehicle-mounted projection display system and vehicle | |
US20220073089A1 (en) | Operating system with portable interface unit, and motor vehicle having the operating system | |
WO2021172038A1 (en) | Information processing device, information processing method, and program | |
CN113246852A (en) | Vehicle interior lighting system, method of operating the same, and storage medium | |
US20220314796A1 (en) | Vehicle display device | |
JP2017095008A (en) | Control device, control method and program | |
JP2020189618A (en) | Experience apparatus for vehicle | |
EP3862225B1 (en) | Vehicle cabin lighting system | |
WO2024212672A1 (en) | Cockpit adjustment method and device, chip, and vehicle | |
US20230222814A1 (en) | Methods and Systems for Determining a State Indicating Whether a Seat Belt of a Vehicle is Used | |
JP7563293B2 (en) | Vehicle display device | |
KR102141644B1 (en) | Vital information obtaining apparatus, vehicle and method of controlling thereof | |
US20240087334A1 (en) | Information process system | |
JP7302533B2 (en) | Operation method of server device, information processing system, control device, passenger vehicle, and information processing system |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
| 121 | Ep: the epo has been informed by wipo that ep was designated in this application | Ref document number: 21760899; Country of ref document: EP; Kind code of ref document: A1 |
| NENP | Non-entry into the national phase | Ref country code: DE |
| 122 | Ep: pct application non-entry in european phase | Ref document number: 21760899; Country of ref document: EP; Kind code of ref document: A1 |
| NENP | Non-entry into the national phase | Ref country code: JP |