CN117068141A - Vehicle path determination

Vehicle path determination

Info

Publication number
CN117068141A
CN117068141A (application CN202210505047.8A)
Authority
CN
China
Prior art keywords
vehicle
parking area
computer
determining
travel path
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202210505047.8A
Other languages
Chinese (zh)
Inventor
Hamid M. Golgiri
Chenhao Ma
Tony Tae-Jin Pak
Arnav Sharma
Ahmed Benmimoun
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Ford Global Technologies LLC
Original Assignee
Ford Global Technologies LLC
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Ford Global Technologies LLC filed Critical Ford Global Technologies LLC
Priority to CN202210505047.8A
Publication of CN117068141A
Legal status: Pending

Classifications

    • B: PERFORMING OPERATIONS; TRANSPORTING
    • B60: VEHICLES IN GENERAL
    • B60W: CONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
    • B60W30/06: Automatic manoeuvring for parking (under B60W30/00, purposes of road vehicle drive control systems not related to the control of a particular sub-unit)
    • B60W10/04: Conjoint control of vehicle sub-units of different type or different function, including control of propulsion units
    • B60W10/12: Conjoint control including control of differentials
    • B60W10/18: Conjoint control including control of braking systems
    • B60W10/20: Conjoint control including control of steering systems
    • B60W40/02: Estimation or calculation of non-directly measurable driving parameters, related to ambient conditions
    • B60W2520/10: Input parameters relating to overall vehicle dynamics, longitudinal speed
    • B60W2554/80: Input parameters relating to objects, spatial relation or speed relative to objects
    • B60W2556/45: Input parameters relating to data, external transmission of data to or from the vehicle

Landscapes

  • Engineering & Computer Science (AREA)
  • Transportation (AREA)
  • Mechanical Engineering (AREA)
  • Chemical & Material Sciences (AREA)
  • Combustion & Propulsion (AREA)
  • Automation & Control Theory (AREA)
  • Physics & Mathematics (AREA)
  • Mathematical Physics (AREA)
  • Traffic Control Systems (AREA)

Abstract

The present disclosure provides "vehicle path determination." A vehicle is determined to be in a parking area based on vehicle sensor data. A mapping mode of operation is activated based on determining that the vehicle is in the parking area. A travel path of the vehicle through the parking area is recorded based on receiving a user input selecting the mapping mode of operation. A map of the parking area is updated based on the recorded travel path.

Description

Vehicle path determination
Technical Field
The present disclosure relates to vehicle path determination.
Background
Vehicles may be equipped with electronic and electromechanical components, e.g., computing devices, networks, sensors, controllers, etc. A vehicle computer may acquire data regarding the vehicle's environment and may operate the vehicle, or at least some of its components, based on the acquired data. Vehicle sensors may provide data about a route to be traveled and about objects to be avoided in the vehicle's environment. Vehicle operation may rely on acquiring accurate and timely data about such objects while the vehicle is operating.
Disclosure of Invention
A system includes a computer including a processor and a memory storing instructions executable by the processor to determine that a vehicle is in a parking area based on vehicle sensor data. The instructions further include instructions to activate a mapping mode of operation based on determining that the vehicle is in the parking area. The instructions further include instructions to then record a travel path of the vehicle through the parking area based on receiving a user input selecting the mapping mode of operation. The instructions further include instructions to update a map of the parking area based on the recorded travel path.
The instructions may further include instructions to determine that the vehicle is in the parking area additionally based on map data.
The instructions may further include instructions to input the sensor data into a machine learning program that outputs an identification of the parking area.
The instructions may further include instructions to deactivate the mapping mode of operation based on determining that the vehicle is in a road area.
The instructions may further include instructions to, upon determining that a portion of the recorded travel path is outside the parking area, remove the portion of the travel path from the map.
The instructions may further include instructions to operate the vehicle in the parking area based on the updated map.
The instructions may further include instructions to determine a confidence estimate based on identifying an object in the sensor data, and to determine that the vehicle is in the parking area additionally based on the confidence estimate.
The instructions may further include instructions to determine that the vehicle is in the parking area based on the confidence estimate being greater than a first threshold, and to determine that the vehicle is in a road area based on the confidence estimate being less than a second threshold, the second threshold being less than the first threshold.
The instructions may further include instructions to, upon determining that the confidence estimate is between the first threshold and the second threshold, determine that the vehicle is in the parking area or in the road area based on second sensor data.
The instructions may further include instructions to stop recording the travel path based on receiving a second user input deselecting the mapping mode of operation.
The instructions may further include instructions to stop recording the travel path based on determining that the vehicle has been outside the parking area for a predetermined time.
The instructions may further include instructions to stop recording the travel path based on determining that the vehicle is outside the parking area and that a distance between the vehicle and the parking area is greater than a threshold.
The instructions may further include instructions to, upon determining from the sensor data that the vehicle is in an unknown area, determine that the vehicle is in the parking area based on second sensor data obtained after the vehicle moves a threshold distance.
The instructions may further include instructions to deactivate the mapping mode of operation based on determining from the second sensor data that the vehicle is in the unknown area.
A method includes determining that a vehicle is in a parking area based on vehicle sensor data. The method further includes activating a mapping mode of operation based on determining that the vehicle is in the parking area. The method further includes then recording a travel path of the vehicle through the parking area based on receiving a user input selecting the mapping mode of operation. The method further includes updating a map of the parking area based on the recorded travel path.
The method may further include determining that the vehicle is in the parking area additionally based on map data.
The method may further include inputting the sensor data into a machine learning program that outputs an identification of the parking area.
The method may further include deactivating the mapping mode of operation based on determining that the vehicle is in a road area.
The method may further include, upon determining that a portion of the recorded travel path is outside the parking area, removing the portion of the travel path from the map.
The method may further include operating the vehicle in the parking area based on the updated map.
Also disclosed herein is a computing device programmed to perform any of the above method steps. Also disclosed herein is a computer program product comprising a computer readable medium storing instructions executable by a computer processor to perform any of the above-described method steps.
The vehicle computer may operate the vehicle along a stored travel path in a parking area that includes a plurality of sub-areas, i.e., parking spaces. When operating along the stored travel path, the vehicle computer may search, e.g., via image data, for an available sub-area, such as an unoccupied parking space. The vehicle computer may determine the stored travel path by recording a travel path followed while a user previously operated the vehicle, and may initiate recording of the travel path based on receiving a user input. However, such a user input could also be received while the vehicle is operating in a road area, where recording a travel path is undesirable. Advantageously, upon determining from sensor data that the vehicle is in a parking area, the vehicle computer can activate a mapping mode of operation that allows the vehicle computer to initiate recording of the travel path based on receiving the user input. Activating the mapping mode of operation only when the vehicle is in the parking area can prevent undesired recording of a travel path in a road area, thereby improving vehicle operation in the parking area.
Drawings
FIG. 1 is a block diagram illustrating an exemplary vehicle control system for a vehicle.
Fig. 2A is a diagram illustrating determination of an exemplary travel path according to the system of fig. 1.
FIG. 2B is a diagram illustrating an exemplary travel path determined to include an invalid portion according to the system of FIG. 1.
Fig. 3 is a diagram illustrating a determination that a vehicle is in an unknown area.
Fig. 4 is an exemplary illustration of a deep neural network.
FIG. 5A is a first portion of a flowchart of an exemplary process for operating a vehicle.
Fig. 5B is a second portion of the flow chart of fig. 5A.
Detailed Description
Referring to Figs. 1-2, an exemplary vehicle control system 100 includes a vehicle 105. A vehicle computer 110 in the vehicle 105 receives data from sensors 115. The vehicle computer 110 is programmed to determine that the vehicle 105 is in a parking area 200 based on vehicle sensor 115 data. The vehicle computer 110 is further programmed to activate a mapping mode of operation based on determining that the vehicle 105 is in the parking area 200. The vehicle computer 110 is further programmed to record a travel path P of the vehicle 105 through the parking area 200 based on receiving a user input selecting the mapping mode of operation. The vehicle computer 110 is further programmed to update a map of the parking area 200 based on the recorded travel path P.
Turning now to FIG. 1, a vehicle 105 includes a vehicle computer 110, sensors 115, actuators 120 for actuating various vehicle components 125, and a vehicle communication module 130. The communication module 130 allows the vehicle computer 110 to communicate with remote server computers 140, user devices 145, and/or other vehicles, for example, via messaging or broadcast protocols such as Dedicated Short Range Communications (DSRC), cellular, and/or other protocols that may support vehicle-to-vehicle, vehicle-to-infrastructure, vehicle-to-cloud communications, etc., and/or via the packet network 135.
The vehicle computer 110 includes a processor and a memory such as are known. The memory includes one or more forms of computer-readable media and stores instructions executable by the vehicle computer 110 for performing various operations, including as disclosed herein. The vehicle computer 110 may further include two or more computing devices operating in concert to carry out vehicle 105 operations, including as described herein. Further, the vehicle computer 110 may be a general-purpose computer with a processor and memory as described above, and/or may include dedicated electronic circuitry, such as an ASIC manufactured for a particular operation, e.g., an ASIC for processing sensor data and/or communicating sensor data. In another example, the vehicle computer 110 may include an FPGA (field-programmable gate array), which is an integrated circuit manufactured to be configurable by a user. Typically, a hardware description language such as VHDL (very high speed integrated circuit hardware description language) is used in electronic design automation to describe digital and mixed-signal systems such as FPGAs and ASICs. For example, an ASIC is manufactured based on VHDL programming provided before manufacturing, whereas logic components inside an FPGA may be configured based on VHDL programming stored, e.g., in a memory electrically connected to the FPGA circuit. In some examples, a combination of one or more processors, one or more ASICs, and/or FPGA circuits may be included in the vehicle computer 110.
The vehicle computer 110 may operate and/or monitor the vehicle 105 in an autonomous mode, a semi-autonomous mode, or a non-autonomous (or manual) mode, i.e., may control and/or monitor operation of the vehicle 105, including controlling and/or monitoring components 125. For purposes of this disclosure, an autonomous mode is defined as one in which each of vehicle 105 propulsion, braking, and steering is controlled by the vehicle computer 110; in a semi-autonomous mode, the vehicle computer 110 controls one or two of vehicle 105 propulsion, braking, and steering; in a non-autonomous mode, a human operator controls each of vehicle 105 propulsion, braking, and steering.
The vehicle computer 110 may include programming to operate one or more of vehicle 105 braking, propulsion (e.g., control of acceleration of the vehicle 105 by controlling one or more of an internal combustion engine, an electric motor, a hybrid engine, etc.), steering, transmission, climate control, interior and/or exterior lights, horn, doors, etc., as well as to determine whether and when the vehicle computer 110, as opposed to a human operator, is to control such operations.
The vehicle computer 110 may include or be communicatively coupled to, e.g., via a vehicle communication network such as a communication bus as described further below, more than one processor, e.g., included in electronic control units (ECUs) or the like included in the vehicle 105 for monitoring and/or controlling various vehicle components 125, e.g., a transmission controller, a brake controller, a steering controller, etc. The vehicle computer 110 is generally arranged for communications on a vehicle communication network that can include a bus in the vehicle 105, such as a controller area network (CAN) or the like, and/or other wired and/or wireless mechanisms.
Via the vehicle 105 network, the vehicle computer 110 may transmit and/or receive messages (e.g., CAN messages) to and/or from various devices (e.g., sensors 115, actuators 120, ECU, etc.) in the vehicle 105. Alternatively or additionally, where the vehicle computer 110 actually includes a plurality of devices, a vehicle communication network may be used for communication between the devices represented in this disclosure as the vehicle computer 110. Further, as mentioned below, various controllers and/or sensors 115 may provide data to the vehicle computer 110 via the vehicle communication network.
The vehicle 105 sensors 115 may include a variety of devices such as are known to provide data to the vehicle computer 110. For example, the sensors 115 may include one or more light detection and ranging (lidar) sensors 115, etc., disposed on a top of the vehicle 105, behind a vehicle 105 front windshield, around the vehicle 105, etc., that provide relative locations, sizes, and shapes of objects around the vehicle 105. As another example, one or more radar sensors 115 fixed to vehicle 105 bumpers may provide data to provide locations of objects, second vehicles, etc., relative to the location of the vehicle 105. The sensors 115 may alternatively or additionally include, e.g., one or more camera sensors 115, e.g., front view, side view, etc., providing images of an area around the vehicle 105. In the context of this disclosure, an object is a physical, i.e., material, item that has mass and that can be represented by physical phenomena, e.g., light or other electromagnetic waves, or sound, etc., detectable by the sensors 115. Thus, the vehicle 105, as well as other items including as discussed below, fall within the definition of "object" herein.
The vehicle computer 110 is programmed to receive data from the one or more sensors 115 substantially continuously, periodically, and/or upon direction from the remote server computer 140, etc. The data may include, for example, a location of the vehicle 105. The location data specifies one or more points on the ground and may be of known form, such as geographic coordinates, such as latitude and longitude coordinates, obtained via a navigation system using a Global Positioning System (GPS) as is known. Additionally or alternatively, the data may include a location of an object (e.g., a vehicle, a sign, a tree, etc.) relative to the vehicle 105. As one example, the data may be image data of an environment surrounding the vehicle 105. In such examples, the image data may include one or more objects and/or markers, such as lane markers, on or along the road. Image data herein means digital image data that may be acquired by the camera sensor 115, for example, including pixels having intensity values and color values. The sensor 115 may be mounted to any suitable location in or on the vehicle 105, such as on a bumper of the vehicle 105, on a roof of the vehicle 105, etc., to collect an image of the environment surrounding the vehicle 105.
The vehicle 105 actuators 120 are implemented via circuits, chips, or other electronic and/or mechanical components that may actuate various vehicle subsystems according to appropriate control signals as known. The actuators 120 may be used to control components 125 including braking, acceleration, and steering of the vehicle 105.
In the context of the present disclosure, a vehicle component 125 is one or more hardware components adapted to perform a mechanical or electromechanical function or operation, such as moving the vehicle 105, slowing or stopping the vehicle 105, steering the vehicle 105, etc. Non-limiting examples of components 125 include a propulsion component (that includes, e.g., an internal combustion engine and/or an electric motor, etc.), a transmission component, a steering component (e.g., that may include one or more of a steering wheel, a steering rack, etc.), a suspension component 125 (e.g., that may include one or more of a damper, e.g., a shock absorber or a strut, a bushing, a spring, a control arm, a ball joint, a linkage, etc.), a brake component, a park assist component, an adaptive cruise control component, an adaptive steering component, one or more passive restraint systems (e.g., airbags), a movable seat, etc.
The vehicle 105 also includes a human-machine interface (HMI) 118. The HMI 118 includes user input devices such as knobs, buttons, switches, pedals, joysticks, touchscreens, and/or microphones, etc. The input devices may include sensors 115 to detect user inputs and provide user input data to the vehicle computer 110. That is, the vehicle computer 110 may be programmed to receive user input from the HMI 118. The user may provide each user input via the HMI 118, e.g., by selecting a virtual button on a touchscreen display, by providing voice commands, etc. For example, a touchscreen display included in the HMI 118 may include sensors 115 to detect that a user selected a virtual button on the touchscreen display to, e.g., select or deselect an operating mode, which input can be received in the vehicle computer 110 and used to determine the selection of the user input.
HMI 118 also typically includes output devices such as a display (including a touch screen display), speakers, and/or lights that output signals or data to a user. HMI 118 is coupled to the vehicle communication network and may send and/or receive messages to/from vehicle computer 110 and other vehicle subsystems.
Further, the vehicle computer 110 may be configured to communicate with devices external to the vehicle 105 via the vehicle-to-vehicle communication module 130 or interface, such as by vehicle-to-vehicle (V2V) or vehicle-to-infrastructure (V2X) wireless communication (cellular and/or DSRC, etc.) with another vehicle and/or remote server computer 140 (typically via direct radio frequency communication). The communication module 130 may include one or more mechanisms that a computer of the vehicle may utilize to communicate, such as a transceiver, including any desired combination of wireless (e.g., cellular, wireless, satellite, microwave, and radio frequency) communication mechanisms and any desired network topology (or topologies when multiple communication mechanisms are utilized). Exemplary communications provided via the communications module 130 include cellular, bluetooth, IEEE 802.11, dedicated Short Range Communications (DSRC), and/or Wide Area Networks (WAN) including the Internet, which provide data communications services.
Network 135 represents one or more mechanisms by which the vehicle computer 110 may communicate with a remote computing device (e.g., the remote server computer 140, another vehicle computer, etc.). Thus, the network 135 may be one or more of a variety of wired or wireless communication mechanisms, including any desired combination of wired (e.g., cable and fiber) and/or wireless (e.g., cellular, wireless, satellite, microwave, and radio frequency) communication mechanisms, as well as any desired network topology (or topologies when multiple communication mechanisms are utilized). Exemplary communication networks include wireless communication networks (e.g., using Bluetooth®, Bluetooth® Low Energy (BLE), IEEE 802.11, vehicle-to-vehicle (V2V) such as Dedicated Short Range Communications (DSRC), etc.), local area networks (LAN), and/or wide area networks (WAN), including the Internet, providing data communication services.
The remote server computer 140 may be a conventional computing device programmed to provide operations such as those disclosed herein, i.e., including one or more processors and one or more memories. Further, the remote server computer 140 may be accessed via a network 135 (e.g., the Internet, a cellular network, and/or some other wide area network).
Figs. 2A and 2B are diagrams illustrating operation of the vehicle 105 in an exemplary parking area 200 that includes marked sub-areas 210 (e.g., parking spaces) for vehicles. The vehicle computer 110 is programmed to determine whether the vehicle 105 is in a road area 205 or the parking area 200. A road area 205 is a ground area that includes any paved or otherwise improved surface provided for land vehicle travel. A parking area 200 is a designated ground area for parking a vehicle. The parking area 200 may be on a street or road, e.g., an area alongside a curb or an edge of the street, a parking lot or parking structure or a portion thereof, etc. A sub-area 210 may be, e.g., a parking space indicated by conventional markings, e.g., painted lines on the ground, and the vehicle computer 110 can employ conventional image-recognition techniques to identify the sub-area 210.
The vehicle computer 110 may be programmed to determine that the vehicle 105 is within the parking area 200 or the road area 205 based on the sensor 115 data. For example, the vehicle computer 110 may be programmed to determine that the vehicle 105 is within the parking area 200 or the road area 205 by, for example, GPS-based geofencing. Geofences herein have the conventional meaning of the boundaries of an area defined by a set of geographic coordinates. In such an example, one geofence specifies the perimeter of parking area 200 and another geofence specifies the perimeter of road area 205. The vehicle computer 110 may then determine that the vehicle 105 is within the parking area 200 or the road area 205 based on the position data of the vehicle 105 indicating that the vehicle 105 is within the corresponding geofence. As another example, the vehicle computer 110 may determine whether the vehicle 105 is in the road area 205 or the parking area 200 based on data (e.g., map data) received from the remote server computer 140. For example, the vehicle computer 110 may receive the location of the vehicle 105, e.g., from the sensor 115, a navigation system, a remote server computer 140, etc. The vehicle computer 110 may compare the location of the vehicle 105 with the map data, for example, to determine whether the vehicle 105 is in the road region 205 or the parking region 200 specified in the map data.
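By way of illustration only (the patent provides no code), the following minimal Python sketch shows one way the geofence comparison described above could be performed; the function name, the ray-casting approach, and the coordinates are all assumptions for this example.

# Hypothetical sketch of a GPS geofence check; names, method, and
# coordinates are illustrative and not taken from the patent.

def point_in_geofence(point, fence):
    """Ray-casting test: is the (lat, lon) point inside the polygon fence?

    fence is a list of (lat, lon) vertices; a parking-lot-scale area is
    treated as planar, which keeps the geometry simple.
    """
    lat, lon = point
    inside = False
    j = len(fence) - 1
    for i in range(len(fence)):
        lat_i, lon_i = fence[i]
        lat_j, lon_j = fence[j]
        # An edge counts when the point's latitude lies between the edge
        # endpoints' latitudes and the point is west of the crossing.
        crosses = (lat_i > lat) != (lat_j > lat)
        if crosses and lon < (lon_j - lon_i) * (lat - lat_i) / (lat_j - lat_i) + lon_i:
            inside = not inside
        j = i
    return inside

# Example: a rectangular parking-area geofence and one vehicle position.
parking_fence = [(42.30, -83.21), (42.30, -83.20), (42.29, -83.20), (42.29, -83.21)]
print(point_in_geofence((42.295, -83.205), parking_fence))  # True: in parking area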
Additionally or alternatively, the vehicle computer 110 may be programmed to determine that the vehicle 105 is in the parking area 200 or the road area 205 based on identifying objects 215 and/or markers 220, e.g., painted lines on the ground, in the environment around the vehicle 105. The vehicle computer 110 may receive sensor 115 data, e.g., image data, of the environment around the vehicle 105. The sensor 115 data, e.g., image data, may include one or more markers 220 around the vehicle 105. The markers 220 may, e.g., specify a boundary of a lane or a parking space, specify a direction of travel, specify types of vehicles authorized on the surface, etc. The vehicle computer 110 can identify the markers 220, e.g., by using suitable image-recognition techniques.
In addition, the sensor 115 data, e.g., image data, may include one or more objects 215 around the vehicle 105. For example, the vehicle computer 110 may be programmed to classify and/or identify one or more objects 215 based on the sensor 115 data. For example, object classification or identification techniques can be used, e.g., in the vehicle computer 110 based on lidar sensor 115 data, camera sensor 115 data, etc., to identify a type of object 215, e.g., vehicle, pedestrian, drone, etc., as well as physical features of the object. Non-limiting examples of objects 215 include a pedestrian, another vehicle, a bicycle, a shopping cart, a pole, etc.
The sensor 115 data may be interpreted and/or the object 215 classified based on the sensor 115 data using various techniques such as are known. For example, camera and/or lidar image data may be provided to a classifier that includes programming for utilizing one or more conventional image classification techniques. For example, the classifier may use a machine learning technique in which data known to represent various objects is provided to a machine learning program for training the classifier. Once trained, the classifier may accept the vehicle sensor 115 data (e.g., images) as input and then provide as output, for each of one or more respective regions of interest in the images, an identification and/or classification (i.e., movable or non-movable) of one or more objects 215 or an indication that no objects 215 are present in the respective region of interest. Further, a coordinate system (e.g., polar or Cartesian) applied to an area proximate to the vehicle 105 may be applied to specify the location and/or area of the object identified from the sensor 115 data (e.g., converted to global latitude and longitude geographic coordinates, etc., according to the vehicle 105 coordinate system). Further, the vehicle computer 110 may employ various techniques to fuse (i.e., incorporate into a common coordinate system or frame of reference) data from different sensors 115 and/or multiple types of sensors 115, such as lidar, radar, and/or optical camera data.
Additionally or alternatively, the vehicle computer 110 may be programmed to determine that the vehicle 105 is in the parking area 200 or the road area 205 based on the speed of the vehicle 105 and/or the speed of the various objects 215. The vehicle computer 110 may determine the speed of the vehicle 105 based on the sensor 115 data. For example, the vehicle speed sensor may output a vehicle speed, i.e., a rate of movement of the vehicle relative to a ground surface, such as a road, in a generally forward direction. For example, one or more conventional wheel speed sensors may be provided to detect the rotational rate of the vehicle wheels, from which the speed of the vehicle may be determined. As another example, a vehicle speed sensor may detect a rotational rate of a crankshaft from which a vehicle speed may be determined.
The vehicle computer may determine a speed of an object 215, e.g., a second vehicle, in proximity to the vehicle 105, i.e., within the fields of view of the vehicle 105 sensors 115. For example, the vehicle computer 110 may receive the speed of the object 215, e.g., via V2V or V2X communications. As another example, the vehicle computer 110 may determine the speed of the object 215 based on sensor 115 data. In such an example, the vehicle computer 110 may determine the speed of the object 215 relative to the vehicle 105 by determining a change over time in the distance between the object 215 and the vehicle 105. For example, the vehicle computer 110 may determine the speed of the object 215 relative to the vehicle 105 using the formula ΔD/ΔT, where ΔD is the difference between a pair of distances from the vehicle 105 to the object 215 determined at different times, and ΔT is the amount of time between the times at which the pair of distances were determined. The difference ΔD may be determined by subtracting the distance determined at the earlier time from the distance determined at the later time. In such an example, a positive value indicates that the object 215 is traveling faster than the vehicle 105, and a negative value indicates that the object 215 is traveling slower than the vehicle 105. The vehicle computer 110 may then add the speed of the object 215 relative to the vehicle 105 to the speed of the vehicle 105 to determine the speed of the object 215.
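As a non-authoritative illustration of the ΔD/ΔT computation described above, the following Python sketch estimates an object's speed from two range measurements; the function name and all numbers are hypothetical.

# Illustrative sketch of the delta-D / delta-T relative-speed estimate;
# values are invented for the example.

def object_speed(d_earlier_m, d_later_m, dt_s, vehicle_speed_mps):
    """Estimate an object's speed from two distance measurements.

    Relative speed = (later distance - earlier distance) / elapsed time,
    so a positive value means the object is pulling away (faster than
    the vehicle) and a negative value means the gap is closing (slower).
    """
    relative_speed = (d_later_m - d_earlier_m) / dt_s
    return vehicle_speed_mps + relative_speed

# Example: the gap to a lead vehicle shrinks from 30 m to 28 m over 1 s
# while the host vehicle travels at 10 m/s; the object's speed is 8 m/s.
print(object_speed(30.0, 28.0, 1.0, 10.0))  # 8.0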
After detecting the object 215, the marker 220, and/or the speed from the sensor 115 data, the vehicle computer 110 may then determine that the vehicle 105 is in the parking area 200 or the road area 205 based on the confidence estimate. Confidence estimates, as that term is used herein, are numerical values that represent the likelihood that the vehicle 105 is in the parking area 200. As explained further below, the confidence estimates may be integers or real numbers.
The confidence score, as that term is used herein, is a numerical value, e.g., an integer, a percentage, etc., representing the likelihood that a detected object 215, marker 220, and/or speed indicates that the vehicle 105 is in the parking area 200, based on detecting from the vehicle 105 sensor data that at least one criterion is satisfied. Examples of confidence scores and criteria by which the corresponding confidence scores may be determined are provided in Table 1 below. Confidence scores are typically provided on a specified scale, and a confidence estimate determined from one or more confidence scores will typically fall within that scale as well.
The vehicle computer 110 may determine the confidence estimate based on one or more confidence scores corresponding to each detected object 215, marker 220, and/or speed. For example, the confidence estimate may be determined to be a single confidence score when only one confidence score is available and/or when that confidence score is above a threshold, i.e., deemed high enough to be relied upon. For example, when a criterion of identifying a sign reading "parking area" is satisfied, the highest possible confidence score may be assigned and used as the confidence estimate. In another example, the confidence estimate may be an average of the determined confidence scores.
The confidence score may be, for example, an integer on a scale from -3 to 3, inclusive. In this example, a confidence score of 3 represents a high confidence that the object 215, marker 220, and/or speed indicates the vehicle 105 is in the parking area 200, and a confidence score of -3 represents a high confidence that the object 215, marker 220, and/or speed indicates the vehicle 105 is in the road area 205. Additionally, a confidence score of 0 may represent that the object 215, marker 220, and/or speed indicates the vehicle 105 is in an unknown area 300. That is, a confidence score below 0 may correspond to the object 215, marker 220, and/or speed indicating the road area 205, a confidence score above 0 may correspond to the object 215, marker 220, and/or speed indicating the parking area 200, and a confidence score of 0 may correspond to the object 215, marker 220, and/or speed being unknown or indicating the unknown area 300.
To determine the confidence estimate, the vehicle computer 110 can access a lookup table or the like, e.g., stored in a memory of the vehicle computer 110, that associates various objects 215, markers 220, and/or speeds with corresponding confidence scores. An exemplary lookup table is set forth in Table 1 below. The vehicle computer 110 can determine the confidence score for each detected object 215, marker 220, and/or speed based on the detected object 215, marker 220, and/or speed matching an object 215, marker 220, and/or speed stored in the lookup table. Upon determining the confidence scores for one or more detected objects 215, markers 220, and/or operating attributes, e.g., speed, of the vehicle 105, the vehicle computer 110 can determine the confidence estimate by, e.g., summing or averaging the confidence scores for the respective detected objects 215, markers 220, and/or speeds.
TABLE 1
After determining the confidence estimate, the vehicle computer 110 may compare the confidence estimate to a first threshold. The first threshold value specifies a confidence estimate above which the vehicle computer 110 determines that the vehicle 105 is in the parking area 200. The first threshold may be predetermined to correspond to a confidence score (e.g., 3) that represents a high confidence of the vehicle 105 in the parking area 200. The first threshold value may be stored in, for example, a memory of the vehicle computer 110. The first threshold may be specified by the vehicle and/or the component manufacturer. When the confidence estimate is greater than or equal to the first threshold, the vehicle computer 110 determines that the vehicle 105 is in the parking area 200.
Additionally, the vehicle computer 110 may compare the confidence estimate to a second threshold. The second threshold is a confidence estimate below which the vehicle computer 110 determines that the vehicle 105 is in the road region 205. The second threshold may be predetermined to correspond to a confidence score (e.g., -3) that represents a high confidence of the vehicle 105 in the road region 205. The second threshold is less than the first threshold. The second threshold may be stored in, for example, a memory of the vehicle computer 110. The second threshold may be specified by the vehicle and/or the component manufacturer. When the confidence estimate is less than or equal to the second threshold, the vehicle computer 110 determines that the vehicle 105 is in the road region 205.
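The following Python sketch illustrates, under stated assumptions only, how per-detection confidence scores might be combined into a confidence estimate and compared against the first and second thresholds; the score dictionary stands in for Table 1, whose contents are not reproduced above, and every value here is invented.

# Hypothetical confidence-estimate logic; scores stand in for Table 1
# and the thresholds are illustrative, not from the patent.

CONFIDENCE_SCORES = {
    "parking_area_sign": 3,       # strong parking-area evidence
    "painted_parking_stalls": 2,
    "shopping_cart": 2,
    "lane_markings": -2,          # strong road-area evidence
    "speed_above_40_kph": -3,
}

FIRST_THRESHOLD = 2    # at or above: vehicle is in a parking area
SECOND_THRESHOLD = -2  # at or below: vehicle is in a road area

def classify_area(detections):
    """Average the per-detection scores and compare against thresholds.

    Returns "parking", "road", or "unknown" when the estimate falls
    between the two thresholds.
    """
    scores = [CONFIDENCE_SCORES[d] for d in detections if d in CONFIDENCE_SCORES]
    if not scores:
        return "unknown"
    estimate = sum(scores) / len(scores)
    if estimate >= FIRST_THRESHOLD:
        return "parking"
    if estimate <= SECOND_THRESHOLD:
        return "road"
    return "unknown"

print(classify_area(["parking_area_sign", "shopping_cart"]))   # parking (2.5)
print(classify_area(["lane_markings", "speed_above_40_kph"]))  # road (-2.5)
print(classify_area(["shopping_cart", "lane_markings"]))       # unknown (0.0)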
Additionally or alternatively, the vehicle computer 110 may input image data of the environment surrounding the vehicle 105 into a neural network, such as a Deep Neural Network (DNN), which may be trained to accept the image data as input and generate an output that determines whether the vehicle 105 is in the parking area 200 or in the road area 205.
Turning now to fig. 3, the vehicle computer 110 may not be able to determine whether the vehicle 105 is in the parking area 200 or the road area 205. For example, the resolution limit of the map data may prevent the vehicle computer 110 from determining whether the vehicle 105 is in the parking area 200 or the road area 205. As another example, when the confidence estimate is between the first threshold and the second threshold, the vehicle computer 110 may not be able to determine whether the vehicle 105 is in the parking area 200 or the road area 205. If the vehicle computer 110 cannot determine whether the vehicle 105 is in the parking area 200 or the road area 205, the vehicle computer 110 may determine that the vehicle 105 is in an unknown area 300, e.g., an area including a boundary between the road area 205 and the parking area 200 (see FIG. 3).
Upon determining that the vehicle 105 is in the unknown area 300, the vehicle computer 110 may output a message to the user, e.g., via the HMI 118, indicating that the vehicle 105 is in an unknown area. Additionally, the message may instruct the user to operate the vehicle 105 for a threshold distance D, e.g., 10 feet, 10 meters, etc. Alternatively, upon determining that the vehicle 105 is in the unknown area 300, the vehicle computer 110 may operate the vehicle 105 for the threshold distance D. For example, the vehicle computer 110 may actuate one or more vehicle components 125 to move the vehicle 105 the threshold distance D. The threshold distance D may be determined empirically, e.g., based on determining a minimum distance the vehicle must move from a location before the vehicle computer can, based on newly obtained sensor data, determine an updated vehicle location that differs from that location. The threshold distance D may be stored, e.g., in a memory of the vehicle computer 110.
The vehicle computer 110 may obtain second sensor 115 data while the vehicle 105 moves the threshold distance D. In this context, second sensor 115 data is data obtained from the sensors 115 after the vehicle computer 110 determines that the vehicle 105 is in the unknown area 300. The vehicle computer 110 may determine that the vehicle 105 has moved the threshold distance D based on sensor 115 data. For example, the vehicle computer 110 may count a number of wheel rotations detected by a wheel sensor 115 and may then determine the distance traveled based on, e.g., a diameter of the wheel stored in a memory of the vehicle computer 110. The vehicle computer 110 may then determine whether the vehicle 105 is in the road area 205 or the parking area 200 based on the second sensor 115 data, e.g., in substantially the same manner as discussed above for determining whether the vehicle 105 is in the road area 205 or the parking area 200 based on the sensor 115 data.
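A minimal sketch of the wheel-rotation distance estimate described above, assuming an illustrative wheel diameter and threshold distance D; the patent specifies neither.

# Hypothetical distance-from-wheel-rotations estimate; the wheel
# diameter and threshold are illustrative values.

import math

def distance_traveled_m(wheel_rotations, wheel_diameter_m=0.65):
    """Each full wheel rotation advances the vehicle one circumference."""
    return wheel_rotations * math.pi * wheel_diameter_m

THRESHOLD_DISTANCE_D_M = 10.0  # assumed threshold distance D

# Example: five rotations of a 0.65 m wheel cover about 10.2 m.
if distance_traveled_m(wheel_rotations=5) >= THRESHOLD_DISTANCE_D_M:
    print("Vehicle moved threshold distance D; re-evaluate area from second sensor data")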
Returning to Figs. 2A and 2B, the vehicle computer 110 is programmed to transition the mapping mode of operation between a deactivated state and an activated state based on the location of the vehicle 105. For example, upon determining that the vehicle 105 has moved into the parking area 200, the vehicle computer 110 transitions the mapping mode of operation from the deactivated state to the activated state. As another example, upon determining that the vehicle 105 has moved into the road area 205, the vehicle computer 110 transitions the mapping mode of operation from the activated state to the deactivated state. That is, the mapping mode of operation is activated when the vehicle 105 is in the parking area 200 and deactivated when the vehicle 105 is in the road area 205.
Additionally, the vehicle computer 110 may deactivate the mapping mode of operation based on, e.g., determining from the sensor 115 data or the second sensor 115 data that the vehicle 105 is in the unknown area 300. For example, the vehicle computer 110 may prevent the mapping mode of operation from transitioning to the activated state based on determining that the vehicle 105 is in the unknown area 300. As another example, upon determining that the vehicle 105 has moved into the unknown area 300, the vehicle computer 110 may deactivate the mapping mode of operation.
In the mapping mode of operation, the vehicle computer 110 records a travel path P of the vehicle 105 through the parking area 200 and stores the travel path, e.g., in a memory of the vehicle computer 110. Specifically, in the mapping mode of operation, the user operates the vehicle 105 along a travel path P through the parking area 200, and the vehicle computer 110 is programmed to actuate one or more sensors 115 to record the travel path P of the vehicle 105. As discussed further below, the vehicle computer 110 can then operate the vehicle 105 in the parking area 200 based on the stored travel path P.
When the mapping mode of operation is in the activated state, the vehicle computer 110 enables user selection of the mapping mode of operation. For example, the vehicle computer 110 may actuate the HMI 118 to detect a first user input selecting the mapping mode of operation. For example, the HMI 118 may be programmed to display a virtual button on the touchscreen display that the user can select to select the mapping mode of operation. As another example, the HMI 118 may be programmed to provide a virtual button or the like that is not selectable when the mapping mode of operation is in the deactivated state but is selectable via the touchscreen display when the mapping mode of operation is in the activated state. In other words, the HMI 118 may activate sensors 115 that can detect the user selecting the virtual button to select the mapping mode of operation. Upon detecting the first user input, the HMI 118 can then provide the first user input to the vehicle computer 110, and the vehicle computer 110 can select the mapping mode of operation based on the first user input.
When the mapping mode of operation is in the deactivated state, the vehicle computer 110 may actuate the HMI 118 to deactivate detection of the first user input. In other words, the vehicle computer 110 prevents the user from selecting the mapping mode of operation, e.g., when the vehicle 105 is in the road area 205. For example, the HMI 118 may be programmed to remove the virtual button from the touchscreen display. As another example, the HMI 118 may be programmed to make the virtual button non-selectable. In other words, the HMI 118 may deactivate the sensors 115 that can detect the user selecting the virtual button to select the mapping mode of operation.
When the mapping mode of operation is selected, the vehicle computer 110 may output a message, e.g., via the HMI 118, instructing the user to operate the vehicle 105 along a travel path P through the parking area 200. As the user operates the vehicle 105, the vehicle computer 110 records the travel path P. For example, the vehicle computer 110 may receive location data of the vehicle 105 at predetermined time intervals, e.g., every 500 milliseconds, every 1 second, every 5 seconds, etc. The vehicle computer 110 can generate the travel path P based on these successive vehicle 105 locations. For example, the vehicle computer 110 can determine the geographic coordinates specified by the location data of the vehicle 105 at each time interval. The vehicle computer 110 can then store the determined geographic coordinates, e.g., in a memory of the vehicle computer 110. Additionally, the vehicle computer 110 can store the time interval, i.e., a timestamp, associated with the determined geographic coordinates, e.g., in the memory.
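The following is a hypothetical Python sketch of recording the travel path P as timestamped geographic coordinates at a predetermined interval; get_vehicle_position() and stop_requested() are stand-ins for the vehicle's location source and stop condition, not APIs named in the patent.

# Illustrative travel-path recorder; the callables are stand-ins.

import time

def record_travel_path(get_vehicle_position, stop_requested, interval_s=1.0):
    """Append (timestamp, lat, lon) fixes until recording is stopped."""
    path = []
    while not stop_requested():
        lat, lon = get_vehicle_position()
        path.append((time.time(), lat, lon))
        time.sleep(interval_s)  # predetermined sampling interval
    return path

# Example with simulated inputs: three fixes, then stop.
fixes = iter([(42.2951, -83.2051), (42.2952, -83.2052), (42.2953, -83.2053)])
remaining = [3]
def fake_position():
    return next(fixes)
def fake_stop():
    remaining[0] -= 1
    return remaining[0] < 0
print(record_travel_path(fake_position, fake_stop, interval_s=0.0))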
While a user operates the vehicle 105 in the mapping mode of operation, the vehicle computer 110 can monitor the location of the vehicle 105. For example, the vehicle computer 110 can determine a distance between the vehicle 105 and the geofence of the parking area 200, e.g., based on map data, such as by determining a distance from the GPS coordinates of the vehicle 105 location to the closest point on the geofence of the parking area 200.
When the vehicle 105 is within the parking area 200, the vehicle computer 110 may compare the distance to a first distance threshold. The first distance threshold specifies a minimum distance between the vehicle 105 and the geofence (i.e., perimeter) of the parking area 200 within which the vehicle computer 110 outputs a message to the user. The first distance threshold may be determined empirically, for example, based on determining a distance from a geofence of the parking area within which the vehicle cannot turn around and remain within the parking area. The first distance threshold may be stored in, for example, a memory of the vehicle computer 110. If the distance is greater than the first distance threshold, the vehicle computer 110 may maintain the mapping mode of operation in an active state. If the distance is less than or equal to the first distance threshold, the vehicle computer 110 may output a message, e.g., via the HMI 118, warning the user that the vehicle 105 is exiting the parking area 200.
The vehicle computer 110 may be programmed to stop recording the travel path P after the second user input is detected. For example, the vehicle computer 110 may actuate the HMI 118 to detect a second user input deselecting the mapping mode of operation. For example, the HMI 118 may be programmed to display a virtual button on a touch screen display that the user may select to deselect the mapping mode of operation. As another example, HMI 118 may be programmed to provide a virtual button or the like that is not selectable when the mapping mode of operation is in a deactivated state, but is selectable via a touch screen display when the mapping mode of operation is in an activated state. In other words, HMI 118 may activate sensor 115, which may detect that the user selects a virtual button to deselect a mapping mode of operation. After detecting the second user input, the HMI 118 may then provide the second user input to the vehicle computer 110, and the vehicle computer 110 may deselect the mapping operation mode and stop recording the travel path P based on the second user input.
Additionally or alternatively, the vehicle computer 110 may be programmed to stop recording the travel path P after detecting that the vehicle 105 is outside the parking area 200. As discussed above, the vehicle computer 110 may compare the location data of the vehicle 105 to the geofence of the parking area 200. As one example, after determining that the vehicle 105 is outside the geofence of the parking area 200, the vehicle computer 110 may stop recording the travel path P. In addition, the vehicle computer 110 may output a message to the user via the HMI 118 specifying that the vehicle 105 exits the parking area 200 and that the vehicle computer 110 stops recording the travel path P, for example.
As another example, upon determining that the vehicle 105 is outside the geofence of the parking area 200, the vehicle computer 110 may determine an amount of time that the vehicle 105 remains outside the parking area 200. For example, upon determining that the vehicle 105 has exited the parking area 200, the vehicle computer 110 may start a timer for a predetermined period of time. The predetermined period of time may be determined empirically, e.g., based on determining an amount of time within which a vehicle operating in a road area adjacent to a parking area typically returns to the parking area. If the vehicle 105 does not return to the parking area 200 before the timer expires, the vehicle computer 110 may stop recording the travel path P and deactivate the mapping mode of operation. If the vehicle 105 returns to the parking area 200 before the timer expires, the vehicle computer 110 maintains the mapping mode of operation in the activated state and resets the timer.
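A minimal sketch of the exit-timer logic just described, assuming an invented predetermined period; the class and method names are illustrative.

# Hypothetical exit timer; the period is a made-up value.

import time

PREDETERMINED_PERIOD_S = 30.0  # assumed time allowed outside the parking area

class ExitTimer:
    """Tracks how long the vehicle has remained outside the parking area."""

    def __init__(self):
        self.exited_at = None

    def should_stop_recording(self, inside_parking_area):
        """Returns True once the vehicle has been outside for too long."""
        if inside_parking_area:
            self.exited_at = None  # back inside: reset the timer
            return False
        if self.exited_at is None:
            self.exited_at = time.monotonic()  # just exited: start timing
        return time.monotonic() - self.exited_at > PREDETERMINED_PERIOD_S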
As another example, upon determining that the vehicle 105 is outside the geofence of the parking area 200, the vehicle computer 110 may determine a distance between the vehicle 105 and the parking area 200, e.g., based on map data, such as by determining a difference between the GPS coordinates of the vehicle 105 location and a point on the geofence where the travel path P crosses the geofence of the parking area 200. Upon determining the distance, the vehicle computer 110 may compare the distance to a second distance threshold. The second distance threshold specifies a maximum distance outside of the geofence within which the vehicle computer 110 remains operable in the mapping mode of operation. The second distance threshold may be determined empirically, e.g., based on determining a distance outside the parking area from within which the vehicle can turn around and return into the parking area. The second distance threshold may be stored, e.g., in a memory of the vehicle computer 110. If the distance is less than or equal to the second distance threshold, the vehicle computer 110 can maintain the mapping mode of operation in the activated state. If the distance is greater than the second distance threshold, the vehicle computer 110 can stop recording the travel path P and deactivate the mapping mode of operation.
Upon determining that the vehicle 105 has operated outside the parking area 200, the vehicle computer 110 can identify the portion of the travel path P outside the parking area 200, i.e., from the geofence of the parking area 200 to the location of the vehicle 105, as invalid (shown as a solid line in Fig. 2B). That is, the vehicle computer 110 can determine that the valid portions of the travel path P are those portions within the parking area 200 (shown as dashed lines in Fig. 2B). Additionally or alternatively, the vehicle computer 110 can identify the portion of the travel path P ending at a point, e.g., at the geofence, i.e., the perimeter, of the parking area 200, as invalid. That is, the vehicle computer 110 can determine that the valid portions of the travel path P are those portions that extend entirely within the parking area 200. When the mapping mode of operation is deactivated or deselected, the vehicle computer 110 can remove the invalid portion(s) of the travel path P.
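The following Python sketch illustrates one way the invalid portion of a recorded path could be removed; for brevity the geofence is approximated as a latitude/longitude bounding box, an assumption not made in the patent.

# Hypothetical removal of the invalid (outside-the-geofence) portion of
# a recorded path; the bounding-box geofence is a simplification.

def valid_path_portion(path, lat_min, lat_max, lon_min, lon_max):
    """Keep only the fixes recorded inside the parking-area geofence.

    path is a list of (timestamp, lat, lon) tuples; fixes outside the
    box form the invalid portion and are dropped from the map update.
    """
    return [(t, lat, lon) for (t, lat, lon) in path
            if lat_min <= lat <= lat_max and lon_min <= lon <= lon_max]

recorded = [(0.0, 42.2951, -83.2051),  # inside the parking area
            (1.0, 42.2990, -83.2049),  # inside the parking area
            (2.0, 42.3050, -83.2049)]  # outside: invalid portion
print(valid_path_portion(recorded, 42.29, 42.30, -83.21, -83.20))  # first two fixes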
Upon determining the travel path P, including the valid portion(s) thereof, the vehicle computer 110 can update the map to specify the recorded travel path P. The vehicle computer 110 can then operate the vehicle 105 through the parking area 200 based on the updated map. For example, the vehicle computer 110 can actuate one or more vehicle components 125 to move the vehicle 105 along the recorded travel path P in the parking area 200, e.g., to search for an available sub-area 210, to pick up a user at a specified location, etc. That is, the vehicle computer 110 can operate the vehicle 105 to follow the recorded travel path P, i.e., the valid portion(s) thereof, specified by the user during operation of the vehicle 105 in the mapping mode of operation.
Fig. 4 is an illustration of an exemplary Deep Neural Network (DNN) 400 that may be trained to determine whether a vehicle 105 is in a parking area 200 or a road area 205 based on image data of an environment surrounding the vehicle 105. For example, DNN 400 may be a software program loadable into a memory and executed by a processor included in a computer. In an exemplary implementation, DNN 400 may include, but is not limited to, convolutional Neural Networks (CNNs), R-CNNs (region-based CNNs), fast R-CNNs, and faster R-CNNs. The DNN comprises a plurality of nodes and the nodes are arranged such that the DNN 400 comprises an input layer, one or more hidden layers and an output layer. Each layer of DNN 400 may include a plurality of nodes 405. Although three hidden layers are shown in fig. 4, it should be understood that DNN 400 may include more or fewer hidden layers. The input and output layers may also include more than one node 405.
The nodes 405 are sometimes referred to as artificial neurons 405 because they are designed to emulate biological, e.g., human, neurons. A set of inputs to each neuron 405 (represented by the arrows) are each multiplied by respective weights. The weighted inputs can then be summed in an input function to provide, possibly adjusted by a bias, a net input. The net input can then be provided to an activation function, which in turn provides an output for the connected neurons 405. The activation function can be any of a variety of suitable functions, typically selected based on empirical analysis. As illustrated by the arrows in Fig. 4, the outputs of the neurons 405 can then be provided for inclusion in a set of inputs to one or more neurons 405 in a next layer.
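As an illustration of the neuron computation just described, i.e., weighted inputs summed with a bias and passed through an activation function, the following minimal Python sketch uses a sigmoid activation; the function name and values are assumptions for the example.

# Illustrative single-neuron forward pass; values are invented.

import math

def neuron_output(inputs, weights, bias):
    """Net input = sum(w_i * x_i) + bias, passed through a sigmoid."""
    net = sum(w * x for w, x in zip(weights, inputs)) + bias
    return 1.0 / (1.0 + math.exp(-net))

# Example: three inputs feeding one hidden-layer neuron.
print(neuron_output([0.5, -1.2, 0.3], [0.8, 0.1, -0.4], bias=0.05))  # ~0.552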
As one example, DNN 400 may be trained with ground truth data (i.e., data regarding real-world conditions or states). For example, DNN 400 may be trained with ground truth data and/or updated with additional data by a processor of the remote server computer 140. For example, the weights may be initialized by using a Gaussian distribution, and the bias of each node 405 may be set to zero. Training DNN 400 may include updating weights and biases via suitable techniques, such as backpropagation and optimization. Ground truth data for training may include, but is not limited to, data specifying objects (e.g., vehicles, signs, pedestrians, debris, etc.) within images. For example, ground truth data may be data representing objects and object labels. In another example, the ground truth data may be data representing an object (e.g., vehicle 105) and a relative angle and/or speed of the object with respect to another object (e.g., a pedestrian, another vehicle, a sign, etc.).
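A hedged sketch of such training, using PyTorch as an illustrative framework; the layer sizes, the standard deviation of the Gaussian initialization, and the cross-entropy loss are assumptions not specified in the disclosure.

```python
import torch
import torch.nn as nn

class AreaClassifier(nn.Module):
    """Toy stand-in for DNN 400: an input layer, three hidden layers,
    and a two-class output (parking area vs. road area)."""
    def __init__(self, n_inputs=128, n_hidden=64):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(n_inputs, n_hidden), nn.ReLU(),
            nn.Linear(n_hidden, n_hidden), nn.ReLU(),
            nn.Linear(n_hidden, n_hidden), nn.ReLU(),
            nn.Linear(n_hidden, 2),
        )
        # Gaussian weight initialization and zero biases, per the text.
        for m in self.net:
            if isinstance(m, nn.Linear):
                nn.init.normal_(m.weight, mean=0.0, std=0.01)
                nn.init.zeros_(m.bias)

    def forward(self, x):
        return self.net(x)

def train_step(model, optimizer, features, labels):
    """One backpropagation/optimization update on a batch of ground
    truth data: features of shape (N, n_inputs), labels of shape (N,)."""
    optimizer.zero_grad()
    loss = nn.functional.cross_entropy(model(features), labels)
    loss.backward()
    optimizer.step()
    return loss.item()
```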
During operation, the vehicle computer 110 obtains sensor 115 data of the environment surrounding the vehicle 105 and provides the sensor 115 data, such as image data, vehicle 105 speed, speed of another vehicle, etc., to the DNN 400. DNN 400 generates an output based on the received input. The output is a determination of whether the vehicle 105 is in the parking area 200 or the road area 205 and a confidence estimate (as discussed above).
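Continuing the illustrative PyTorch sketch above, inference might look like the following; the class ordering (index 0 for the parking area) is an assumption for the example.

```python
def classify_area(model, features):
    """Return a (label, confidence) pair from the softmax of the output
    layer, mirroring the determination-plus-confidence output of DNN 400."""
    model.eval()
    with torch.no_grad():
        probs = torch.softmax(model(features), dim=-1)
    confidence, index = probs.max(dim=-1)
    return ("parking" if index.item() == 0 else "road"), confidence.item()
```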
Fig. 5A is the first portion of a flow chart of an exemplary process 500 for operating the vehicle 105 (the second portion is shown in fig. 5B because the entire flow chart does not fit on a single drawing sheet). Process 500 begins in block 505. The process 500 may be implemented by the vehicle computer 110 included in the vehicle 105 executing program instructions stored in its memory.
In block 505, the vehicle computer 110 receives data from one or more sensors 115, e.g., via the vehicle network, from a remote server computer 140, e.g., via the network 135, and/or from a computer in another vehicle, e.g., via V2V communication. For example, the vehicle computer 110 may receive position data, such as geographic coordinates, of the vehicle 105, such as from sensors 115, navigation systems, and the like. In addition, the vehicle computer 110 may receive image data, for example, from one or more image sensors 115. The image data may include data regarding the environment surrounding the vehicle 105, such as the parking area 200, the sub-area 210, one or more objects 215, and/or the markers 220, among others. Process 500 continues at block 510.
In block 510, the vehicle computer 110 determines whether the vehicle 105 is in the parking area 200 or the road area 205 based on the received data (e.g., image data and/or location data). For example, as discussed above, the vehicle computer 110 may compare the location of the vehicle 105 with the location of the parking area 200 to determine whether the vehicle 105 is within the geofence of the parking area 200. As another example, as discussed above, the vehicle computer 110 may input image data to the DNN 400 that outputs a determination of whether the vehicle 105 is in the road area 205 or the parking area 200 and a confidence estimate. In examples where the vehicle computer 110 cannot determine whether the vehicle 105 is in the parking area 200 or the road area 205, the vehicle computer 110 may determine that the vehicle 105 is in the unknown area 300, as discussed above. If the vehicle computer 110 determines that the vehicle 105 is in the parking area 200, the process 500 continues at block 530. If the vehicle computer 110 determines that the vehicle 105 is not in the parking area 200, i.e., in the road area 205, then the process 500 continues at block 525. If the vehicle computer 110 determines that the vehicle 105 is in the unknown region 300, the process 500 continues at block 515.
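The three-way branch of block 510 might be sketched as follows; the numeric threshold values are hypothetical, since the disclosure leaves the first and second confidence thresholds to empirical determination.

```python
# Hypothetical values; the disclosure only requires that the second
# (lower) threshold be less than the first (upper) threshold.
FIRST_THRESHOLD = 0.80
SECOND_THRESHOLD = 0.50

def resolve_region(confidence):
    """Map the DNN 400 confidence estimate to the block 510 branches."""
    if confidence > FIRST_THRESHOLD:
        return "parking_area"   # proceed to block 530
    if confidence < SECOND_THRESHOLD:
        return "road_area"      # proceed to block 525
    return "unknown_area"       # proceed to block 515
```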
In block 515, the vehicle computer 110 determines whether the vehicle 105 has moved a threshold distance D. For example, as discussed above, after determining that the vehicle 105 is in the unknown region 300, the vehicle computer 110 may actuate the HMI 118 to output a message to the user. The message may instruct the user to operate the vehicle 105 to move the threshold distance D. The vehicle computer 110 may obtain the second sensor 115 data when the vehicle 105 is moving a threshold distance D. As discussed above, the vehicle computer 110 may determine that the vehicle 105 has moved a threshold distance D based on the second sensor 115 data. If the vehicle 105 has moved a threshold distance D, the process 500 continues at block 520. Otherwise, the process 500 remains in block 515.
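A sketch of the threshold-distance check of block 515, reusing the illustrative gps_distance_m helper defined earlier; the function and parameter names are assumptions.

```python
def has_moved_threshold_distance(position_log, threshold_d_m):
    """Accumulate distance over temporally successive position fixes
    and compare the total against the threshold distance D."""
    travelled = sum(
        gps_distance_m(*a, *b)
        for a, b in zip(position_log, position_log[1:])
    )
    return travelled >= threshold_d_m
```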
In block 520, the vehicle computer 110 determines whether the vehicle 105 is in the parking area 200 or the road area 205 based on the second sensor 115 data (e.g., the second image data and/or the second position data). The vehicle computer 110 may determine whether the vehicle 105 is in the parking area 200 or the road area 205 in substantially the same manner as discussed above with respect to block 510. If the vehicle computer 110 determines that the vehicle 105 is in the parking area 200, the process 500 continues at block 530. If the vehicle computer 110 determines that the vehicle 105 is in the road region 205 or in the unknown region 300, the process 500 continues at block 525.
In block 525, the vehicle computer 110 maintains the mapping mode of operation in a deactivated state. For example, after determining that the vehicle 105 remains in the road region 205, the vehicle computer 110 may maintain the mapping mode of operation in a deactivated state. As another example, after determining that the vehicle 105 is in the unknown region 300, the vehicle computer 110 may maintain the mapping mode of operation in a deactivated state. In the deactivated state, the vehicle computer 110 prevents user selection of the mapping mode of operation, as discussed above. In addition, in the deactivated state, the vehicle computer 110 does not record the travel path P of the vehicle 105. The process 500 returns to block 505.
In block 530, the vehicle computer 110 activates the mapping mode of operation. For example, after determining that the vehicle 105 has moved from the road region 205 to the parking region 200, the vehicle computer 110 activates the mapping operation mode from the deactivated state to the activated state. In addition, after determining that the vehicle 105 remains in the parking area 200, the vehicle computer 110 may maintain the mapping operation mode in an active state. In the active state, the vehicle computer 110 enables user selection of a mapping mode of operation. Process 500 continues at block 535.
In block 535, the vehicle computer 110 determines whether a mapping mode is selected. For example, in the active state, as discussed above, the vehicle computer 110 may actuate the HMI 118 to detect a first user input selecting a mapping mode of operation. In other words, the HMI 118 may activate the sensor 115, which may detect a first user input, for example, a user selecting a virtual button on a touch screen display to select a mapping mode of operation. After detecting the first user input, the HMI 118 may then provide the first user input to the vehicle computer 110, and the vehicle computer 110 may select a mapping mode of operation based on the first user input. If the vehicle computer 110 receives a first user input selecting a mapping mode of operation, the process 500 continues at block 550. Otherwise, process 500 continues at block 540.
In block 540, the vehicle computer 110 determines whether the vehicle 105 is in the parking area 200 or the road area 205 based on the data (e.g., image data and/or location data). The vehicle computer 110 may determine whether the vehicle 105 is in the parking area 200 or the road area 205 in substantially the same manner as discussed above with respect to block 510. If the vehicle computer 110 determines that the vehicle 105 is in the parking area 200, the process 500 returns to block 535. Otherwise, process 500 continues at block 545.
In block 545, the vehicle computer 110 disables the mapping mode of operation. For example, after determining that the vehicle 105 has moved from the parking area 200 to the road area 205, the vehicle computer 110 deactivates the mapping operation mode from the activated state to the deactivated state. In the deactivated state, the vehicle computer 110 does not record the travel path P of the vehicle 105. Additionally, as discussed above, the vehicle computer 110 may prevent user selection of the mapping operation mode. After block 545, the process 500 ends.
Turning now to fig. 5B, following block 535 shown in fig. 5A, in block 550, the vehicle computer 110 records the travel path P of the vehicle 105. For example, upon detecting a first user input selecting the mapping mode of operation, the vehicle computer 110 may actuate the HMI 118 to output a message instructing the user to operate the vehicle 105 through the parking area 200. While the user is operating the vehicle 105, the vehicle computer 110 may receive location data of the vehicle 105, e.g., at predetermined time intervals (e.g., every 500 milliseconds, every 1 second, every 5 seconds, etc.). As discussed above, the vehicle computer 110 may then generate the travel path P based on the temporally successive vehicle 105 locations. Process 500 continues at block 555.
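A minimal sketch of the sampling loop described above; get_position and mapping_selected are hypothetical callables standing in for the GPS sensor 115 and the HMI 118 selection state.

```python
import time

def record_travel_path(get_position, mapping_selected, interval_s=0.5):
    """Sample the vehicle position at a fixed interval while the mapping
    operation mode remains selected; the ordered samples form path P."""
    path = []
    while mapping_selected():
        path.append(get_position())  # e.g., (lat, lon) from the GPS sensor
        time.sleep(interval_s)
    return path
```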
In block 555, the vehicle computer 110 determines whether to continue recording the travel path P. For example, as discussed above, the vehicle computer 110 may stop recording the travel path P based on receiving a second user input via the HMI 118 to deselect the mapping mode of operation, for example. If the vehicle computer 110 receives the second user input, the vehicle computer 110 determines to stop recording the travel path P. If the vehicle computer 110 does not receive the second user input, the vehicle computer 110 determines to continue recording the travel path P.
Additionally or alternatively, the vehicle computer 110 may stop recording the travel path P based on detecting that the vehicle 105 is outside of the parking area 200. For example, as discussed above, the vehicle computer 110 may start a timer after detecting that the vehicle 105 is outside of the parking area 200. If the vehicle 105 does not return to the parking area 200 before the timer expires, the vehicle computer 110 may determine to stop recording the travel path P. If the vehicle 105 returns to the parking area 200 before the timer expires, the vehicle computer 110 may determine to continue recording the travel path P. As another example, as discussed above, the vehicle computer 110 may determine the distance between the vehicle 105 and the parking area 200 after determining that the vehicle 105 is outside the parking area 200. The vehicle computer 110 may then compare the distance to a second distance threshold. If the distance is greater than the second distance threshold, the vehicle computer 110 may determine to stop recording the travel path P. If the distance is less than or equal to the second distance threshold, the vehicle computer 110 may determine to continue recording the travel path P. If the vehicle computer 110 determines to stop recording the travel path P, the process 500 continues at block 560. Otherwise, the process 500 returns to block 550.
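The stop conditions of block 555 might be combined as follows; the timeout value and parameter names are assumptions, and SECOND_DISTANCE_THRESHOLD_M refers to the illustrative constant defined earlier.

```python
def should_stop_recording(deselected, outside_parking_area,
                          seconds_outside, distance_outside_m,
                          timeout_s=10.0,
                          max_distance_m=SECOND_DISTANCE_THRESHOLD_M):
    """Combine the stop conditions of block 555: user deselection, the
    timer expiring while outside the parking area, or exceeding the
    second distance threshold."""
    if deselected:
        return True
    if outside_parking_area and seconds_outside > timeout_s:
        return True
    if outside_parking_area and distance_outside_m > max_distance_m:
        return True
    return False
```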
In block 560, the vehicle computer 110 removes the invalid portion from the travel path P. For example, as discussed above, the vehicle computer 110 may identify a portion of the travel path P that is outside of the parking area 200 as invalid. The vehicle computer 110 then removes the invalid portion from the travel path P. That is, as discussed above, the vehicle computer 110 retains the valid portion of the travel path P, i.e., the portion within the parking area 200. The vehicle computer 110 may then update the map of the parking area 200 to specify the travel path P, and specifically its valid portion. The vehicle computer 110 may store the map in, for example, a memory of the vehicle computer 110. The process 500 continues at block 565.
In block 565, the vehicle computer 110 may operate the vehicle 105 through the parking area 200 based on the updated map. For example, the vehicle computer 110 may actuate one or more vehicle components 125 to move the vehicle 105 along a stored travel path P specified in the updated map. After block 565, process 500 ends.
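A skeleton of this path-following step, with vehicle.drive_to and goal_reached as hypothetical stand-ins for actuation of the vehicle components 125 and for the search or pickup goal.

```python
def operate_along_stored_path(vehicle, stored_path, goal_reached):
    """Step the vehicle along the waypoints of the stored travel path P
    until the goal (e.g., an open sub-area or pickup location) is found."""
    for waypoint in stored_path:
        vehicle.drive_to(waypoint)  # propulsion, steering, and braking
        if goal_reached():
            break
```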
As used herein, the adverb "substantially" means that the shape, structure, measurement, quantity, time, etc. may deviate from the precisely described geometry, distance, measurement, quantity, time, etc. due to imperfections in materials, machining, manufacturing, data transmission, computational speed, etc.
In general, the described computing systems and/or devices may employ any of a number of computer operating systems, including, but by no means limited to, versions and/or varieties of the Ford SYNC® application; AppLink/Smart Device Link middleware; the Microsoft Automotive® operating system; the Microsoft Windows® operating system; the Unix operating system (e.g., the Solaris® operating system distributed by Oracle Corporation of Redwood Shores, California); the AIX UNIX operating system distributed by International Business Machines of Armonk, New York; the Linux operating system; the Mac OSX and iOS operating systems distributed by Apple Inc. of Cupertino, California; the BlackBerry OS distributed by Blackberry, Ltd. of Waterloo, Canada; the Android operating system developed by Google, Inc. and the Open Handset Alliance; or the QNX® CAR Platform for Infotainment offered by QNX Software Systems. Examples of computing devices include, without limitation, an on-board vehicle computer, a computer workstation, a server, a desktop, notebook, laptop, or handheld computer, or some other computing system and/or device.
Computers and computing devices generally include computer-executable instructions that may be executable by one or more computing devices such as those listed above. Computer-executable instructions may be compiled or interpreted from computer programs created using a variety of programming languages and/or technologies, including, without limitation, and either alone or in combination, Java™, C, C++, Matlab, Simulink, Stateflow, Visual Basic, Java Script, Perl, HTML, etc. Some of these applications may be compiled and executed on a virtual machine, such as the Java Virtual Machine, the Dalvik virtual machine, or the like. In general, a processor (e.g., a microprocessor) receives instructions, e.g., from a memory, a computer-readable medium, etc., and executes these instructions, thereby performing one or more processes, including one or more of the processes described herein. Such instructions and other data may be stored and transmitted using a variety of computer-readable media. A file in a computing device is generally a collection of data stored on a computer-readable medium, such as a storage medium, a random access memory, etc.
The memory may include computer-readable media (also referred to as processor-readable media) including any non-transitory (e.g., tangible) media that participates in providing data (e.g., instructions) that may be read by a computer (e.g., by a processor of a computer). Such a medium may take many forms, including but not limited to, non-volatile media and volatile media. Non-volatile media may include, for example, optical or magnetic disks, and other persistent memory. Volatile media may include, for example, dynamic Random Access Memory (DRAM), which typically constitutes a main memory. Such instructions may be transmitted by one or more transmission media, including coaxial cables, copper wire and fiber optics, including the wires that comprise a system bus coupled to the processor of the ECU. Common forms of computer-readable media include, for example, a floppy disk, a flexible disk, hard disk, magnetic tape, any other magnetic medium, a CD-ROM, DVD, any other optical medium, punch cards, paper tape, any other physical medium with patterns of holes, RAM, PROM, EPROM, a flash EEPROM, any other memory chip or cartridge, or any other medium from which a computer can read.
A database, data repository, or other data store described herein may include various kinds of mechanisms for storing, accessing, and retrieving various kinds of data, including a hierarchical database, a set of files in a file system, an application database in a proprietary format, a relational database management system (RDBMS), etc. Each such data store is generally included within a computing device employing a computer operating system such as one of those mentioned above, and is accessed via a network in any one or more of a variety of manners. A file system may be accessible from a computer operating system, and may include files stored in various formats. An RDBMS generally employs the Structured Query Language (SQL) in addition to a language for creating, storing, editing, and executing stored procedures, such as the PL/SQL language.
In some examples, system elements may be implemented as computer-readable instructions (e.g., software) on one or more computing devices (e.g., servers, personal computers, etc.), stored on a computer-readable medium (e.g., disk, memory, etc.) associated therewith. The computer program product may include such instructions stored on a computer-readable medium for implementing the functions described herein.
With respect to the media, processes, systems, methods, heuristics, etc. described herein, it should be understood that, while the steps of such processes, etc. have been described as occurring according to some ordered sequence, such processes may be practiced by executing the described steps in an order different than that described herein. It should also be understood that certain steps may be performed concurrently, other steps may be added, or certain steps described herein may be omitted. In other words, the description of the processes herein is provided for the purpose of illustrating certain embodiments and should not be construed as limiting the claims in any way.
Accordingly, it is to be understood that the above description is intended to be illustrative and not restrictive. Many embodiments and applications other than the examples provided will be apparent to those of skill in the art upon reading the above description. The scope of the invention should be determined not with reference to the above description, but instead with reference to the appended claims, along with the full scope of equivalents to which such claims are entitled. It is anticipated and intended that future developments will occur in the technologies discussed herein, and that the disclosed systems and methods will be incorporated into such future embodiments. In sum, it should be understood that the invention is capable of modification and variation and is limited only by the following claims.
Unless explicitly indicated to the contrary herein, all terms used in the claims are intended to be given their ordinary and customary meaning as understood by those skilled in the art. In particular, the use of singular articles such as "a," "an," "the," and the like are to be construed to recite one or more of the indicated elements unless a claim recites an explicit limitation to the contrary.
According to the present invention there is provided a system having a computer comprising a processor and a memory, the memory storing instructions executable by the processor to: determining that the vehicle is in the parking area based on the vehicle sensor data; activating a mapping mode of operation based on determining that the vehicle is in the parking area; recording a travel path of the vehicle through the parking area based on receiving a user input selecting the mapping operation mode; and updating a map of the parking area based on the recorded travel path.
According to one embodiment, the instructions further comprise instructions for: the vehicle is additionally determined to be in the parking area based on map data.
According to one embodiment, the instructions further comprise instructions for: the sensor data is input into a machine learning program that outputs an identification of the parking area.
According to one embodiment, the instructions further comprise instructions for: the mapping mode of operation is disabled based on determining that the vehicle is in a road region.
According to one embodiment, the instructions further comprise instructions for: after determining that a portion of the recorded travel path is outside the parking area, the portion of the travel path is removed from the map.
According to one embodiment, the instructions further comprise instructions for: operating the vehicle in the parking area based on the updated map.
According to one embodiment, the instructions further comprise instructions for: a confidence estimate is determined based on identifying an object in the sensor data, and the vehicle is additionally determined to be in the parking area based on the confidence estimate.
According to one embodiment, the instructions further comprise instructions for: determining that the vehicle is in the parking area based on the confidence estimate being greater than a first threshold, and determining that the vehicle is in a road area based on the confidence estimate being less than a second threshold, the second threshold being less than the first threshold.
According to one embodiment, the instructions further comprise instructions for: after determining that the confidence estimate is between the first threshold and the second threshold, determining that the vehicle is in the parking area or in the road area based on second sensor data.
According to one embodiment, the instructions further comprise instructions for: recording the travel path is stopped based on receiving a second user input to deselect the mapping operation mode.
According to one embodiment, the instructions further comprise instructions for: stopping recording the travel path based on determining that the vehicle is outside the parking area for a predetermined time.
According to one embodiment, the instructions further comprise instructions for: stopping recording the travel path based on determining that the vehicle is outside the parking area and that a distance between the vehicle and the parking area is greater than a threshold.
According to one embodiment, the instructions further comprise instructions for: after determining that the vehicle is in an unknown area from the sensor data, the vehicle is determined to be in the parking area based on second sensor data obtained after the vehicle moves a threshold distance.
According to one embodiment, the instructions further comprise instructions for: disabling the mapping mode of operation based on determining that the vehicle is in the unknown region according to the second sensor data.
According to the invention, a method comprises: determining that the vehicle is in the parking area based on the vehicle sensor data; activating a mapping mode of operation based on determining that the vehicle is in the parking area; recording a travel path of the vehicle through the parking area based on receiving a user input selecting the mapping operation mode; and updating a map of the parking area based on the recorded travel path.
In one aspect of the invention, the method comprises: the vehicle is additionally determined to be in the parking area based on map data.
In one aspect of the invention, the method comprises: the sensor data is input into a machine learning program that outputs an identification of the parking area.
In one aspect of the invention, the method comprises: the mapping mode of operation is disabled based on determining that the vehicle is in a road region.
In one aspect of the invention, the method comprises: after determining that a portion of the recorded travel path is outside the parking area, the portion of the travel path is removed from the map.
In one aspect of the invention, the method comprises: operating the vehicle in the parking area based on the updated map.

Claims (15)

1. A method, the method comprising:
determining that the vehicle is in the parking area based on the vehicle sensor data;
activating a mapping mode of operation based on determining that the vehicle is in the parking area;
recording a travel path of the vehicle through the parking area based on receiving a user input selecting the mapping operation mode; and
updating a map of the parking area based on the recorded travel path.
2. The method of claim 1, the method further comprising: the vehicle is additionally determined to be in the parking area based on map data.
3. The method of claim 1, the method further comprising: the sensor data is input into a machine learning program that outputs an identification of the parking area.
4. The method of claim 1, the method further comprising: the mapping mode of operation is disabled based on determining that the vehicle is in a road region.
5. The method of claim 1, the method further comprising: after determining that a portion of the recorded travel path is outside the parking area, the portion of the travel path is removed from the map.
6. The method of claim 1, the method further comprising: operating the vehicle in the parking area based on the updated map.
7. The method of claim 1, the method further comprising: determining a confidence estimate based on identifying an object in the sensor data; and additionally determining that the vehicle is in the parking area based on the confidence estimate.
8. The method of claim 7, the method further comprising: determining that the vehicle is in the parking area based on the confidence estimate being greater than a first threshold; and determining that the vehicle is in a road region based on the confidence estimate being less than a second threshold, the second threshold being less than the first threshold.
9. The method of claim 8, the method further comprising: after determining that the confidence estimate is between the first threshold and the second threshold, determining that the vehicle is in the parking area or in the road area based on second sensor data.
10. The method of claim 1, the method further comprising: stopping recording the travel path based on determining that the vehicle is outside the parking area for a predetermined time.
11. The method of claim 1, the method further comprising: stopping recording the travel path based on determining that the vehicle is outside the parking area and that a distance between the vehicle and the parking area is greater than a threshold.
12. The method of claim 1, the method further comprising: after determining that the vehicle is in an unknown area from the sensor data, the vehicle is determined to be in the parking area based on second sensor data obtained after the vehicle moves a predetermined distance.
13. A computer programmed to perform the method of any one of claims 1 to 12.
14. A computer program product comprising instructions for performing the method of any of claims 1 to 12.
15. A vehicle comprising a computer programmed to perform the method of any one of claims 1 to 12.