WO2023192385A1 - Customizing electric vehicle charging station service based on sentiment analysis - Google Patents

Customizing electric vehicle charging station service based on sentiment analysis

Info

Publication number
WO2023192385A1
Authority
WO
WIPO (PCT)
Prior art keywords
user
evcs
electric vehicle
connector
event
Prior art date
Application number
PCT/US2023/016736
Other languages
French (fr)
Inventor
Bradford CRIST
Thomas Enders
Original Assignee
Volta Charging, Llc
Priority date
Filing date
Publication date
Application filed by Volta Charging, Llc
Publication of WO2023192385A1

Classifications

    • B PERFORMING OPERATIONS; TRANSPORTING
    • B60 VEHICLES IN GENERAL
    • B60L PROPULSION OF ELECTRICALLY-PROPELLED VEHICLES; SUPPLYING ELECTRIC POWER FOR AUXILIARY EQUIPMENT OF ELECTRICALLY-PROPELLED VEHICLES; ELECTRODYNAMIC BRAKE SYSTEMS FOR VEHICLES IN GENERAL; MAGNETIC SUSPENSION OR LEVITATION FOR VEHICLES; MONITORING OPERATING VARIABLES OF ELECTRICALLY-PROPELLED VEHICLES; ELECTRIC SAFETY DEVICES FOR ELECTRICALLY-PROPELLED VEHICLES
    • B60L53/00 Methods of charging batteries, specially adapted for electric vehicles; Charging stations or on-board charging equipment therefor; Exchange of energy storage elements in electric vehicles
    • B60L53/30 Constructional details of charging stations
    • B60L53/305 Communication interfaces
    • B60L53/31 Charging columns specially adapted for electric vehicles
    • B60L53/60 Monitoring or controlling charging stations
    • B60L53/66 Data transfer between charging stations and vehicles
    • B60L53/665 Methods related to measuring, billing or payment
    • B60L53/68 Off-site monitoring or control, e.g. remote control
    • B60L2250/00 Driver interactions
    • B60L2250/16 Driver interactions by display

Definitions

  • the present disclosure relates to computer-implemented techniques for charging electric vehicles, and in particular to techniques for customizing services provided to electric vehicles.
  • Electric vehicle charging stations (EVCSs) usually supply electric energy, either using cables or wirelessly, to the batteries of electric vehicles.
  • a user can connect their electric vehicle via cables of an EVCS, and the EVCS supplies electrical current to the user’s electric vehicle.
  • the cables and control systems of the EVCSs can be housed in kiosks in locations to allow a driver of an electric vehicle to park the electric vehicle close to the EVCS and begin the charging process.
  • These kiosks may be placed in areas of convenience, such as in parking lots at shopping centers, in front of commercial buildings, or in other public places.
  • These kiosks often comprise a display that can be used to provide media items to the user to enhance the user’s charging experience.
  • passers-by, in addition to users of the EVCS, may notice media items displayed by the EVCS.
  • Traditional EVCSs provide little to no assistance to users (e.g., new users) that require assistance when charging their electric vehicle.
  • EVCSs often provide the same services (e.g., user experience, charging rate, charging cost, etc.) to each electric vehicle that is connected to the EVCSs without considering the unique individual operating the electric vehicle, resulting in suboptimal user experience.
  • a first user that makes multiple unsuccessful attempts to connect their electric vehicle to the EVCS may require additional instructions (e.g., shown on the EVCS display), while a second user that quickly connects their electric vehicle to the EVCS may be more interested in special events nearby.
  • an EVCS can utilize a camera mounted to the EVCS.
  • the camera may be housed in the upper portion (e.g., between 100 centimeters and 150 centimeters above the ground) of the EVCS to capture images of the faces of the users.
  • the camera may be configured to capture one or more images of an area proximal to the EVCS.
  • a camera may be configured to obtain a video or capture images of the EVCS and/or an area around the EVCS.
  • Control circuitry housed within the EVCS can determine a characteristic of a user based on the one or more images captured by the camera.
  • the EVCS may identify an event (e.g., connector event, plug-in event, plug-out event, charging event, user event) using image recognition, then determine a user characteristic based on the identified event.
  • the camera may capture images of an EVCS connector used to supply electric current to an electric vehicle.
  • a first image may display the connector in a first position (e.g., EVCS holster), and the second image may display the connector in a second position.
  • the EVCS may determine that a connector event has occurred based on the first and second images.
  • a connector event may be any time the EVCS determines that the connector has been moved from the EVCS holster. The EVCS may then determine that a charging event has occurred based on a third image, captured by the camera, displaying that the connector is connected to an electric vehicle.
  • a charging event may be any time the EVCS determines that the connector is connected to an electric vehicle and/or the EVCS is supplying electric current to the electric vehicle. When the EVCS determines that a charging event occurs within a threshold time (e.g., 20 seconds) of a connector event, the EVCS may determine that the user does not require assistance (first user characteristic).
  • the EVCS can determine customized services (e.g., types of displays, selected advertisements, types of notifications, user experiences, charging rates, charging costs, etc.) based on the determined user characteristic.
  • the EVCS may notify the user of a nearby event via a display on the EVCS. If the EVCS determines that a charging event did not occur within the threshold time of a connector event, the EVCS may determine that the user requires assistance (second user characteristic). In response to determining the second user characteristic, the EVCS may display a video on how to operate the connector via the display on the EVCS.
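As a concrete illustration of the timing logic above, the following sketch (an illustrative assumption, not the claimed implementation) decides whether a user appears to need assistance by comparing the connector-event and charging-event timestamps against the 20-second threshold mentioned in the example; all names are hypothetical.

```python
from dataclasses import dataclass

ASSISTANCE_THRESHOLD_S = 20.0  # example threshold from the text above


@dataclass
class ChargingSession:
    connector_event_time: float | None = None  # seconds, e.g. from time.monotonic()
    charging_event_time: float | None = None


def user_needs_assistance(session: ChargingSession, now: float) -> bool | None:
    """Return True/False once a determination can be made, otherwise None."""
    if session.connector_event_time is None:
        return None  # no connector event yet, nothing to decide
    if session.charging_event_time is not None:
        # Charging started: assistance is implied only if the plug-in was slow.
        return (session.charging_event_time - session.connector_event_time) > ASSISTANCE_THRESHOLD_S
    if now - session.connector_event_time > ASSISTANCE_THRESHOLD_S:
        return True  # threshold elapsed with no charging event
    return None  # still inside the threshold window


# Connector removed at t=0 s, charging detected at t=12 s -> no assistance needed.
session = ChargingSession(connector_event_time=0.0, charging_event_time=12.0)
print(user_needs_assistance(session, now=12.0))  # False
```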
  • the camera may capture images of the face of the user of an electric vehicle, for example, when the user approaches the EVCS (user event).
  • the EVCS may determine the facial expression of the user (e.g., angry, sad, happy, etc.) based on the images captured by the camera.
  • the EVCS determines the facial expression using detection software (e.g., facial expression recognition software, image recognition software, machine learning, etc.).
  • the EVCS may then determine customized services based on the determined facial expression.
  • the camera may capture an image showing that a user is frowning, and the EVCS may determine that the user is sad (user characteristic) based on the facial expression.
  • the EVCS may provide a reduced charging rate, a coupon for the user, and/or a tone change.
  • the EVCS may notify the user of the reduced charging rate and/or coupon by sending a notification to a user device associated with the user.
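A minimal sketch of the expression-to-service mapping just described, assuming an external facial-expression recognizer supplies a label; the label set, service names, and the `detect_expression` placeholder are illustrative assumptions rather than the patented method.

```python
# Map a detected facial expression (user characteristic) to customized services.
SERVICES_BY_EXPRESSION = {
    "sad": ["reduced_charging_rate", "coupon", "tone_change"],
    "happy": ["nearby_event_notification"],
    "angry": ["help_video", "reduced_charging_rate"],
}


def detect_expression(image_bytes: bytes) -> str:
    """Placeholder for facial expression recognition software (e.g., an ML model)."""
    raise NotImplementedError


def services_for_expression(expression: str) -> list[str]:
    # Fall back to a neutral default when the expression is not recognized.
    return SERVICES_BY_EXPRESSION.get(expression, ["default_display"])


print(services_for_expression("sad"))  # ['reduced_charging_rate', 'coupon', 'tone_change']
```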
  • any suitable sensor can be used to determine a characteristic of the user.
  • These sensors may be image sensors (e.g., one or more cameras), microphones, ultrasound sensors, depth sensors, infrared (IR) cameras, Red Green Blue (RGB) cameras, passive IR (PIR) cameras, proximity sensors, radar, tension sensors, near field communication (NFC) sensors, and/or any similar such sensor.
  • the EVCS may display a video instructing the user on how to operate the connector via a display on the EVCS.
  • the microphone may detect a baby crying, and the EVCS may determine that the user is a parent (second user characteristic).
  • the EVCS may display an advertisement for a nearby store that sells products for children.
  • the EVCS may also determine future customized services based on a characteristic of a user. For example, a camera mounted to the EVCS may capture a first image, and the EVCS may determine that a connector event has occurred based on the first image. The EVCS may then determine that a charging event did not occur within the threshold time of the connector event. The EVCS may determine, for example, that the user had trouble removing the connector from an EVCS holster (user characteristic) based on a second image captured by the camera. For example, the second image may have been taken ten seconds after the connector event and shows that the connector only moved three centimeters from the original position.
  • the EVCS can transmit this user characteristic to a second device, notifying developers of possible deficiencies in the EVCS (e.g., the EVCS needs repair) and/or in the design of the EVCS.
  • the camera captures an image of the user and the EVCS determines a user characteristic (e.g., demographic of the user) based on the captured image.
  • the EVCS may update a profile associated with the user with the determined user characteristic.
  • the EVCS determines customized services (e.g., providing advertisements related to the demographic of the user), based on the user characteristic stored in the profile.
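The profile update described above could look like the following sketch, where characteristics observed at the station are appended to a per-user record with timestamps; the structure and field names are assumptions made for illustration.

```python
from dataclasses import dataclass, field
from datetime import datetime, timezone


@dataclass
class UserProfile:
    user_id: str
    characteristics: list[dict] = field(default_factory=list)

    def add_characteristic(self, name: str, value: str) -> None:
        # Timestamp each observation so later sessions can weigh recent context.
        self.characteristics.append({
            "name": name,
            "value": value,
            "observed_at": datetime.now(timezone.utc).isoformat(),
        })


profile = UserProfile(user_id="user-106")
profile.add_characteristic("facial_expression", "sad")
profile.add_characteristic("demographic", "adult")
print(profile.characteristics)
```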
  • FIG. 1 shows an illustrative diagram of a system for determining customized services based on a characteristic of the user, in accordance with some embodiments of the disclosure
  • FIGS. 2A and 2B show other illustrative diagrams of a system for determining customized services based on a characteristic of the user, in accordance with some embodiments of the disclosure
  • FIGS. 3A-3D show other illustrative diagrams of a system for determining customized services based on a characteristic of the user, in accordance with some embodiments of the disclosure
  • FIGS. 4A and 4B show illustrative diagrams of notifications indicating customized services, in accordance with some embodiments of the disclosure
  • FIG. 5 shows an illustrative block diagram of an EVCS system, in accordance with some embodiments of the disclosure
  • FIG. 6 shows an illustrative block diagram of a user equipment device system, in accordance with some embodiments of the disclosure
  • FIG. 7 shows an illustrative block diagram of a server system, in accordance with some embodiments of the disclosure.
  • FIG. 8 is an illustrative flowchart of a process of determining customized services based on a characteristic of the user, in accordance with some embodiments of the disclosure.
  • FIG. 9 is another illustrative flowchart of a process of determining customized services based on a characteristic of the user, in accordance with some embodiments of the disclosure.
  • FIG. 1 shows an illustrative diagram of a system for determining customized services based on a characteristic of a user 106 of an electric vehicle 104, in accordance with some embodiments of the disclosure.
  • an EVCS 102 provides an electric charge to the electric vehicle 104 via a charging cable 122 or a wireless connection (e.g., wireless charging).
  • the EVCS 102 may be in communication with the electric vehicle 104 and/or a user device 108 belonging to a user 106 (e.g., a driver, passenger, owner, renter, or other operator of the electric vehicle 104) that is associated with the electric vehicle 104.
  • the EVCS 102 communicates with one or more devices or computer systems, such as user device 108 or server 110, respectively, via a network 112. Although some steps or methods may be described as being executed by the EVCS 102, user device 108, and/or server 110, said steps and methods may also be performed by any combination of the devices.
  • There can be more than one EVCS 102, electric vehicle 104, user 106, user device 108, server 110, and network 112, but only one of each is shown in FIG. 1 to avoid overcomplicating the drawing.
  • a user 106 may utilize more than one type of user device 108 and more than one of each type of user device 108.
  • the devices may also communicate with each other directly or through an indirect path via a communications network.
  • the communications network may be one or more networks including the Internet, a mobile phone network, mobile voice or data network (e.g., a 4G, 5G, or LTE network), cable network, public switched telephone network, or other types of communications network or combinations of communications networks.
  • a communication network path comprises one or more communications paths, such as a satellite path, a fiberoptic path, a cable path, a path that supports Internet communications (e.g., IPTV), free-space connections (e.g., for broadcast or other wireless signals), or any other suitable wired or wireless communications path or combination of such paths.
  • a communication network path can be a wireless path. Communications with the devices may be provided by one or more communication paths but are shown as single paths in FIG. 1 to avoid overcomplicating the drawing.
  • the EVCS 102 determines a characteristic of the user 106 based on information detected from one or more sensors.
  • the one or more sensors may comprise image sensors (e.g., camera 116), microphones, ultrasound sensors, depth sensors, IR cameras, RGB cameras, PIR cameras, proximity sensors, radar, tension sensors, NFC sensors, and/or any combination thereof.
  • one or more images may be captured by a camera 116 mounted to the EVCS 102.
  • the camera 116 is housed in the upper portion (e.g., between 100 centimeters and 150 centimeters above the ground) of the EVCS 102.
  • the camera 116 is configured to capture one or more images of an area proximal to the EVCS 102.
  • a camera 116 may be configured to obtain a video or capture images of the EVCS 102 and/or an area around a parking space 120 corresponding to the EVCS 102.
  • the EVCS 102 identifies a user characteristic based on a characteristic of the electric vehicle 104. In some embodiments, the EVCS 102 determines a characteristic of the electric vehicle 104 based on information received from one or more sensors (e.g., camera 116). In some embodiments, after the camera 116 captures information about the electric vehicle 104, the EVCS 102 determines an electric vehicle characteristic (e.g., model, make, license plate, VIN number, tire pressure, specifications, condition, etc.) based on the captured information. In some embodiments, the EVCS 102 accesses a database linking characteristics of electric vehicles to users of the electric vehicles.
  • the EVCS 102 determines characteristics of the electric vehicle 104 using ISO 15118 when the electric vehicle 104 is connected to the EVCS 102. In some embodiments, the EVCS 102 receives a media access control (MAC) address from the electric vehicle 104, and the EVCS 102 uses the MAC address to determine vehicle characteristics of the electric vehicle 104 and/or to determine the user 106 associated with the electric vehicle 104. The EVCS 102 can use a database to match the received MAC address or portions of the received MAC address to entries in the database to determine vehicle characteristics of the electric vehicle 104. For example, certain vehicle manufacturers keep portions of their produced electric vehicle’s MAC addresses consistent.
  • if the EVCS 102 determines that a portion of the MAC address received from the electric vehicle 104 corresponds to an electric vehicle manufacturer, the EVCS 102 can determine vehicle characteristics of the electric vehicle 104.
  • the EVCS 102 can also use a database to match the received MAC address or portions of the received MAC address to entries in the database to determine the user 106 associated with the electric vehicle 104.
  • the electric vehicle’s MAC address may correspond to a user profile corresponding to the user 106 associated with the electric vehicle 104.
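The MAC-address matching described above can be sketched as a prefix (OUI) lookup. The OUI values, table contents, and helper names below are invented for illustration only and are not real manufacturer assignments.

```python
# The first three octets of a MAC address (the OUI) can identify the vendor.
OUI_TO_MANUFACTURER = {
    "AA:BB:CC": "ExampleMotors",   # hypothetical OUI
    "11:22:33": "SampleEV",        # hypothetical OUI
}

MAC_TO_USER_PROFILE = {
    "AA:BB:CC:01:02:03": "user-106",  # hypothetical full-address match
}


def manufacturer_from_mac(mac: str) -> str | None:
    oui = mac.upper()[:8]  # e.g. "AA:BB:CC"
    return OUI_TO_MANUFACTURER.get(oui)


def user_from_mac(mac: str) -> str | None:
    return MAC_TO_USER_PROFILE.get(mac.upper())


mac = "aa:bb:cc:01:02:03"
print(manufacturer_from_mac(mac))  # ExampleMotors
print(user_from_mac(mac))          # user-106
```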
  • the EVCS 102 identifies a user characteristic when the user 106 requests the EVCS 102 to charge the electric vehicle 104.
  • the user 106 may present credentials (e.g., password, PIN, biometrics, device, item, etc.) to request the EVCS 102 to charge the electric vehicle 104.
  • the user 106 inputs a password using the display 118 of the EVCS 102.
  • the EVCS 102 receives credentials from the user device 108 when the user device 108 is within a distance (e.g., two meters) from the EVCS 102.
  • the EVCS 102 identifies a user profile associated with the user 106 based on the received credentials.
  • the user profile comprises one or more user characteristics associated with the user 106.
  • the user 106 selects and/or reserves EVCS 102.
  • the user 106 can select the EVCS 102 when the user inputs the EVCS 102 as a destination in a navigation system.
  • the navigation system is implemented using the user device 108 and/or the electric vehicle 104.
  • the user 106 may access a web page and reserve the EVCS 102 for a time period.
  • when the user 106 selects and/or reserves the EVCS 102, the EVCS 102 receives a notification indicating a profile and/or user characteristic associated with the user 106 that made the selection/reservation. In some embodiments, the EVCS 102 determines a user characteristic using the profile.
  • the EVCS 102 determines customized services based on the facial expression of the user 106.
  • the camera 116 captures images of the face of the user 106 of the electric vehicle 104, for example, when the user 106 approaches the EVCS 102 (user event).
  • the EVCS 102 determines the facial expression of the user 106 (e.g., angry, sad, happy, etc.) based on facial expression recognition of the images captured by the camera 116.
  • the EVCS 102 determines customized services based on the determined facial expression (user characteristic).
  • the EVCS 102 determines customized services (e.g., types of displays, selected advertisements, types of notifications, user experiences, charging rates, charging costs, etc.) based on the determined user characteristic. For example, if the EVCS 102 determines that the user 106 does not require assistance (e.g., first user characteristic), the EVCS 102 may notify the user 106 of a nearby event via a display 118. In another example, if the EVCS 102 determines that the user 106 requires assistance (e.g., second user characteristic), the EVCS 102 displays a video on how to operate the connector 122 via the display 118.
  • the EVCS 102 accesses a database comprising a plurality of entries wherein user characteristics are mapped to types of services. In some embodiments, the EVCS 102 determines that a first entry, corresponding to a determined user characteristic (e.g., new user) indicates a first service (e.g., displaying a tutorial). In some embodiments, the EVCS 102 receives the user characteristic and/or service type from the server 110, user device 108, and/or electric vehicle 104.
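One possible shape for the characteristic-to-service database mentioned above, sketched with an in-memory SQLite table purely for illustration; the table layout and entries are assumptions rather than the disclosed data model.

```python
import sqlite3

# In-memory stand-in for a database mapping user characteristics to services.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE services (characteristic TEXT PRIMARY KEY, service TEXT)")
conn.executemany(
    "INSERT INTO services VALUES (?, ?)",
    [
        ("new_user", "display_tutorial"),
        ("requires_assistance", "display_connector_video"),
        ("no_assistance_needed", "display_nearby_events"),
    ],
)


def service_for(characteristic: str) -> str | None:
    row = conn.execute(
        "SELECT service FROM services WHERE characteristic = ?", (characteristic,)
    ).fetchone()
    return row[0] if row else None


print(service_for("new_user"))  # display_tutorial
```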
  • FIGS. 2A and 2B show other illustrative diagrams of a system for determining customized services based on a characteristic of a user 106, in accordance with some embodiments of the disclosure.
  • FIGS. 2A and 2B use the same or similar methods and devices as those described in FIG. 1.
  • a camera captures a plurality of images of an area proximal to an EVCS. In some embodiments, the camera captures the plurality of images in response to detecting an electric vehicle 104 in the area proximal to the EVCS. In some embodiments, the camera captures the plurality of images in response to detecting an electric vehicle 104 in the parking space 120 corresponding to the EVCS. In some embodiments, the camera captures the plurality of images in response to an event (e.g., connector event, charging event, user event).
  • FIG. 2A shows a first image 202 of the plurality of images captured by a camera (e.g., camera 116).
  • the plurality of images are captured in response to a connector event.
  • the connector event is detected whenever the connector 122 is detected by the camera. For example, when the camera detects the connector 122 entering the field of view of the camera at a first position 204, the first image 202 may be captured and/or processed.
  • the connector event is detected whenever the connector 122 position moves.
  • the first image 202 displays the connector 122 in the first position 204
  • a second image 206 displays the connector 122 in a second position 208.
  • the EVCS may determine that a connector event has occurred.
  • the EVCS determines whether a charging event occurs.
  • a charging event may be any time the EVCS determines that the connector 122 is connected to the electric vehicle 104 and/or the EVCS is supplying electric current to the electric vehicle 104. For example, if the camera captures an image of the connector 122 connected to the electric vehicle 104, the EVCS determines that a charging event has occurred. In another example, if the EVCS determines that current is flowing from the EVCS to the electric vehicle 104, the EVCS determines that a charging event has occurred. When the EVCS determines that a charging event occurs within a threshold time (e.g., 20 seconds) of a connector event, the EVCS may determine, for example, that the user 106 does not require assistance (user characteristic).
  • the EVCS may determine, for example, that the user requires assistance (second user characteristic).
  • the first image 202 may correspond to the detection of a connector event.
  • the camera may capture the second image 206.
  • the EVCS may determine that the connector 122 is located at the second position 208 and is not connected to the electric vehicle 104.
  • the EVCS determines, for example, that the user 106 requires assistance (user characteristic).
  • the EVCS is configured to determine user characteristics and/or events based on similar such user and/or connector positions. For example, if the plurality of images displays the connector 122 within a threshold distance (e.g., 30 centimeters) of the electric vehicle 104, the EVCS may determine that an attempted charging event occurred. In some embodiments, if the attempted charging event is detected after a threshold time period, the EVCS determines that the user 106 requires assistance. In some embodiments, the EVCS uses sensors mounted to the connector 122 and/or holster to determine if the connector 122 has been removed from the holster. In some embodiments, the EVCS uses sensors mounted to the connector 122 to determine if the connector 122 is connected or disconnected to the electric vehicle 104.
  • the EVCS determines customized services based on the determined user characteristic. For example, in response to determining that the user 106 does not require assistance (first user characteristic), the EVCS notifies the user 106 of a nearby event via a user device associated with the user. In another example, in response to determining that the user 106 requires assistance (second user characteristic), the EVCS displays a video on how to operate the connector 122 via a display on the EVCS.
  • the EVCS uses one or more items in the first image 202 and second image 206 to determine a characteristic of the user 106. For example, the EVCS determines (e.g., using optical character recognition) the characters of the license plate 212.
  • the EVCS 102 uses a database to look up user characteristics of the user 106 using the license plate information.
  • the database may comprise public records (e.g., public registration information linking license plates to user profiles), collected information (e.g., entries linking license plates to user characteristics based on data inputted by a user), historic information (entries linking license plates to user characteristics based on the EVCS 102 identifying user characteristics related to one or more license plates in the past), and/or similar such information.
  • the EVCS determines a user characteristic (e.g., user 106 is disabled) based on determining that the license plate 212 is associated with a person with a disability.
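A sketch of the license-plate lookup described above; `ocr_plate` stands in for an OCR engine, and the registration records shown are invented for illustration only.

```python
# Hypothetical registration table linking plate text to user characteristics.
PLATE_RECORDS = {
    "ABC1234": {"user_id": "user-106", "characteristics": ["disabled"]},
}


def ocr_plate(image_bytes: bytes) -> str:
    """Placeholder for optical character recognition of the license plate region."""
    raise NotImplementedError


def characteristics_for_plate(plate_text: str) -> list[str]:
    record = PLATE_RECORDS.get(plate_text.upper().replace(" ", ""))
    return record["characteristics"] if record else []


print(characteristics_for_plate("abc 1234"))  # ['disabled']
```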
  • the EVCS may determine (e.g., via image recognition) that an item is located inside the electric vehicle 104.
  • the item is associated with a user characteristic.
  • the item may be a disabled person placard.
  • the EVCS determines a user characteristic (e.g., user 106 is disabled) based on determining that the item is located inside the electric vehicle 104.
  • the EVCS 102 customizes services based on the determined user characteristic. For example, if the user characteristic indicates that the user 106 requires assistance, the EVCS 102 can transmit a notification (e.g., via network 112) indicating that the user 106 requires assistance.
  • the notification may be transmitted to one or more devices associated with a nearby business, on-site employees, other electric vehicle owners, nearby assistants, etc.
  • nearby electric vehicle owners may receive a notification from the EVCS indicating that the user 106 at the EVCS requires assistance.
  • the notification may also indicate a benefit (e.g., compensation, credit, etc.) for assisting the user 106 with charging their electric vehicle 104.
  • the display of the EVCS may display a graphic, turn a color, and/or flash, signaling that the user 106 of the electric vehicle 104 requires assistance.
  • the EVCS 102 causes a light mounted to the EVCS 102 to turn on, signaling that user 106 requires assistance.
  • FIGS. 3A-3D show other illustrative diagrams of a system for determining customized services based on a characteristic of a user 106, in accordance with some embodiments of the disclosure.
  • FIGS. 3A-3D use the same or similar methods and devices as those described in FIGS. 1, 2A, and 2B.
  • a camera captures a plurality of images of an area proximal to an EVCS. In some embodiments, the camera captures the plurality of images in response to detecting an electric vehicle 104 in the area proximal to the EVCS. In some embodiments, the camera captures the plurality of images in response to detecting an electric vehicle 104 in the parking space 120 corresponding to the EVCS. In some embodiments, the camera captures the plurality of images in response to an event (e.g., connector event, charging event, user event).
  • FIGS. 3A-3C show images (first image 302, second image 306, and third image 312) of the plurality of images captured by a camera (e.g., camera 116).
  • the plurality of images are captured in response to a user event.
  • the user event is detected when a user 106 is detected in the first image 302.
  • the user event is detected when a user 106 leaves the electric vehicle 104.
  • the EVCS may determine that a user event has occurred if the EVCS determines that the user 106 is a distance (e.g., thirty centimeters) away from the electric vehicle 104.
  • the user event is detected when a user 106 is within a second distance (e.g., 60 centimeters) from the EVCS. In some embodiments, the user event is detected when the user 106 requests a charge. For example, a user event may be detected when the user 106 interacts with the EVCS. Although only one user is shown, the EVCS may detect more than one user. For example, multiple users may exit the electric vehicle 104.
  • a first image 302 is captured after a user event.
  • the EVCS uses facial expression recognition to determine the first facial expression 304 of the user 106.
  • the EVCS uses the first facial expression 304 as the user characteristic.
  • the EVCS may determine that the first facial expression 304 corresponds to a sad facial expression (e.g., user characteristic).
  • the EVCS determines customized services based on the user characteristic. For example, in response to the first facial expression 304, the EVCS may provide a reduced charging rate, a coupon for the user 106, and/or a tone change.
  • the EVCS may notify the user 106 of the reduced charging rate and/or coupon by sending a notification to a user device associated with the user 106.
  • a tone change indicates user interface (UI) changes related to the EVCS and/or application on a user device corresponding to the user. For example, in response to a sad facial expression (user characteristic) the EVCS may display an animation (e.g., confetti) on the EVCS display, change color of the EVCS display, output audio (e.g., happy jingle) using speakers of the EVCS, and/or similar such UI changes.
  • the EVCS updates a profile associated with the user 106 with the determined user characteristic.
  • the EVCS may update that the user had a first facial expression at a first time.
  • the EVCS may be able to determine the gender, age, ethnicity, preferences, and/or similar such information (user characteristics) of the user 106 based on the first image 302 and store the determined user characteristic in the profile.
  • the EVCS may determine user characteristics for more than one user in an image. For example, the EVCS may determine that a first user is an adult (e.g., first user characteristic) and a second user is a child (e.g., second user characteristic). The EVCS may determine customized services based on the multiple user characteristics.
  • the EVCS may display a first advertisement (e.g., vehicle advertisement) associated with the first user characteristic (e.g., adult) and a second advertisement (e.g., toy advertisement) associated with the second user characteristic (e.g., child).
  • the EVCS may display a single advertisement (e.g., family movie advertisement) based on both user characteristics.
  • the second image 306 is captured after a user event and/or a connector event.
  • the second image 306 is captured after a threshold time period after the user event and/or connector event.
  • the EVCS determines the second facial expression 310 of the user 106 using the same or similar methodologies as described herein.
  • the EVCS determines that the second facial expression 310 corresponds to a frustrated facial expression (e.g., user characteristic). In some embodiments, the EVCS determines customized services based on the user characteristic. For example, in response to determining that the user 106 has a second facial expression 310 corresponding to a frustrated user, the EVCS may display a video on how to operate the connector 122 via the display on the EVCS.
  • the EVCS records additional information.
  • the EVCS may record that the connector 122 was located at a first position 308 when the second facial expression 310 was detected.
  • the EVCS may record the change in facial expression (e.g., changing from happy to the second facial expression 310).
  • the EVCS may record the amount of time between events.
  • the EVCS may record the amount of time between an event and the detection of the second facial expression 310.
  • the EVCS transmits the additional information to a user device.
  • the additional information is used to determine that the user 106 had trouble connecting the connector 122 to the electric vehicle 104 based on the second image 306 captured by the camera.
  • the second image 306 may have been taken thirty seconds after the connector event and displays that the connector 122 has not been connected to the electric vehicle and the user is frustrated (e.g., second facial expression 310).
  • the additional information signals deficiencies in the EVCS that were captured in the second image 306 (e.g., the EVCS needs repair) and/or deficiencies in the design of that type of EVCS. For example, if one or more users take multiple attempts to disconnect the connector 122 from electric vehicles, then this may indicate that the EVCS requires repair.
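The deficiency signal described above could be aggregated across sessions as in this sketch; the rolling window size and repair threshold are illustrative assumptions, not values from the disclosure.

```python
from collections import deque

RECENT_SESSIONS = deque(maxlen=50)   # rolling window of recent charging sessions
REPAIR_RATIO_THRESHOLD = 0.4         # flag the station if >40% of sessions had trouble
MIN_SESSIONS = 10                    # require some data before deciding


def record_session(had_trouble_connecting: bool) -> None:
    RECENT_SESSIONS.append(had_trouble_connecting)


def flag_for_repair() -> bool:
    if len(RECENT_SESSIONS) < MIN_SESSIONS:
        return False
    return sum(RECENT_SESSIONS) / len(RECENT_SESSIONS) > REPAIR_RATIO_THRESHOLD


for outcome in [True, True, False, True, True, False, True, True, True, False]:
    record_session(outcome)
print(flag_for_repair())  # True: 7 of the last 10 sessions had trouble
```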
  • the plurality of images provide redundancies in case of other system failures. For example, if a sensor used to detect a plug-in event fails, the EVCS may not charge the electric vehicle 104 despite having the connector 122 connected to the electric vehicle 104. If an image (e.g., image 312) shows that the connector 122 is connected to the electric vehicle 104, the EVCS may charge the electric vehicle 104 despite the sensor failing to detect the plug-in event.
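A minimal sketch of the redundancy just described: charging may proceed when either the plug-in sensor or the image analysis indicates the connector is attached, so one failed signal does not block a charge. All names are hypothetical.

```python
def connector_attached(plug_in_sensor: bool | None, image_shows_connected: bool | None) -> bool:
    """Combine independent signals; None means that signal is unavailable (e.g., sensor failure)."""
    available = [s for s in (plug_in_sensor, image_shows_connected) if s is not None]
    return any(available)


# Plug-in sensor failed (None), but image 312 shows the connector attached.
print(connector_attached(plug_in_sensor=None, image_shows_connected=True))  # True
```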
  • FIG. 3C shows a third image 312 captured after a user event, connector event, and/or charging event.
  • the third image 312 is captured after a threshold time period after a user event, connector event, and/or charging event.
  • the EVCS uses facial expression recognition to determine the gaze 316 of the user 106. For example, the EVCS may determine whether the gaze 316 of the user indicates that the user 106 is looking at the EVCS display, user device, electric vehicle, etc. In some embodiments, the EVCS determines a user characteristic based on the determined gaze 316. For example, the EVCS may determine that the gaze 316 indicates that the user is looking at the EVCS. In response to determining that the gaze 316 indicates that the user is looking at the EVCS, the EVCS may display an advertisement on the display for the user 106.
  • the EVCS records additional information. In some embodiments, the EVCS records the time period when the gaze 316 indicates that the user 106 is looking at the EVCS display, user device, and/or electric vehicle 104. In some embodiments, the EVCS records the number of times that the gaze 316 indicates that the user 106 is looking at the EVCS display, user device, and/or electric vehicle 104.
  • the EVCS determines a next action for the user 106 after the charging event.
  • a next action may correspond to a location and/or activity corresponding to the user 106 after the charging event.
  • the EVCS determines that a charging event has occurred because the third image 312 displays the connector 122 connected to the electric vehicle 104 (e.g., in a second position 314).
  • the EVCS determines the location of the user after the charging event. For example, the camera may capture a plurality of images after the charging event, and the EVCS may determine whether the user 106 is located in the electric vehicle 104, is located outside the vehicle 104, and/or leaves the frame of view of the camera.
  • the EVCS stores one or more next actions as additional information. In some embodiments, the EVCS stores time periods related to one or more actions as additional information. For example, the EVCS may determine that the user 106 left the frame two minutes after the charging event and then entered the frame one hour after leaving the frame. In some embodiments, the EVCS transmits the additional information to a user device.
  • FIG. 3D shows another illustrative diagram of a system 350 for determining customized services based on a characteristic of a user 106, in accordance with some embodiments of the disclosure.
  • a microphone mounted to an EVCS records audio in response to the EVCS detecting an electric vehicle 104.
  • the microphone records audio in response to the EVCS detecting the electric vehicle 104 in the parking space 120.
  • the microphone records audio in response to an event (e.g., connector event, charging event, user event). For example, if the EVCS uses the first image 302 to detect a user event, the microphone may begin to record audio.
  • the EVCS determines an event based on audio recorded by the microphone. For example, the EVCS may determine that a parking event has occurred when the microphone detects audio corresponding to an electric vehicle 104 in the parking space 120. In another example, the EVCS may determine that a user event has occurred when the microphone detects audio corresponding to a user 106.
  • a first audio 318 is captured after a connector event.
  • the EVCS uses audio recognition software (e.g., speech recognition, sound recognition, etc.) to determine a user characteristic based on the first audio. For example, the EVCS may use speech recognition software to determine that the user 106 shouting “I need help” (the first audio 318) indicates that the user 106 requires assistance (user characteristic). In another example, the microphone recording audio corresponding to a user yelling, crying, swearing, etc., indicates that the user 106 requires assistance (user characteristic).
  • the EVCS determines customized services based on the determined user characteristic. For example, in response to the microphone detecting the first audio 318, the EVCS may determine that the user 106 requires assistance (e.g., first user characteristic). In response to the first user characteristic, the EVCS may display a charging tutorial on the display of the EVCS. In another example, in response to the microphone detecting a baby crying, the EVCS may determine that the user 106 is a parent (e.g., second user characteristic). In response to the second user characteristic, the EVCS may display an advertisement for a nearby store that sells products for children.
  • in response to the microphone detecting a dog barking, the EVCS may determine that the user 106 is a dog owner (e.g., third user characteristic). In response to the third user characteristic, the EVCS may display an advertisement for dog food. In some embodiments, the EVCS records additional information. For example, the EVCS may record the time period between an event (e.g., connector event) and the first audio 318.
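A sketch of the audio-driven path above: a recognized audio label is mapped to a user characteristic and then to a service. `classify_audio` is a placeholder for speech or sound recognition software, and the labels and mappings are assumptions made for illustration.

```python
AUDIO_TO_CHARACTERISTIC = {
    "speech:i need help": "requires_assistance",
    "baby_crying": "parent",
    "dog_barking": "dog_owner",
}

CHARACTERISTIC_TO_SERVICE = {
    "requires_assistance": "display_charging_tutorial",
    "parent": "advertise_childrens_store",
    "dog_owner": "advertise_dog_food",
}


def classify_audio(clip: bytes) -> str:
    """Placeholder for audio recognition (speech recognition or sound classification)."""
    raise NotImplementedError


def service_for_audio_label(label: str) -> str | None:
    characteristic = AUDIO_TO_CHARACTERISTIC.get(label)
    return CHARACTERISTIC_TO_SERVICE.get(characteristic) if characteristic else None


print(service_for_audio_label("baby_crying"))  # advertise_childrens_store
```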
  • the EVCS is configured to determine user characteristics and/or events based on similar such user and/or connector positions. For example, if the plurality of images displays the connector 122 within a threshold distance (e.g., thirty centimeters) of the electric vehicle 104, the EVCS may determine that an attempted charging event occurred. In some embodiments, if the attempted charging event is detected after a threshold time period, the EVCS determines that the user 106 requires assistance. In some embodiments, the EVCS uses sensors mounted to the connector 122 and/or holster to determine if the connector 122 has been removed from the holster. In some embodiments, the EVCS uses sensors mounted to the connector 122 to determine if the connector 122 is connected or disconnected to the electric vehicle 104.
  • the EVCS 102 customizes services based on the determined user characteristic. For example, if the user characteristic indicates that the user 106 requires assistance, the EVCS 102 can transmit a notification (e.g., via network 112) indicating that the user 106 requires assistance.
  • the notification may be transmitted to one or more devices associated with a nearby business, on-site employees, other electric vehicle owners, nearby assistants, etc.
  • nearby electric vehicle owners may receive a notification from the EVCS indicating that the user 106 at the EVCS requires assistance.
  • the notification may also indicate a benefit (e.g., compensation, credit, etc.) for assisting the user 106 with charging their electric vehicle 104.
  • the display of the EVCS may display a graphic, turn a color, and/or flash, signaling that the user 106 of the electric vehicle 104 requires assistance.
  • the EVCS 102 causes a light mounted to the EVCS 102 to turn on signaling that user 106 requires assistance.
  • FIG. 4A shows an illustrative diagram of an EVCS 402 displaying notifications (414a-c) on a display 410, in accordance with some embodiments of the disclosure.
  • the EVCS 402 comprises control circuitry and memory.
  • the memory stores instructions for displaying content on the display 410.
  • the control circuitry is disposed inside the housing 404.
  • the control circuitry is mounted on a panel that connects the display 410 to the housing 404.
  • the EVCS 402 comprises a camera 406 coupled to the control circuitry.
  • the EVCS 402 accesses a database comprising a plurality of entries wherein user characteristics are mapped to types of services.
  • the EVCS 402 determines that a first entry, corresponding to a determined user characteristic (e.g., new user), indicates a first service (e.g., displaying a first notification 414a comprising a tutorial).
  • the notifications 414a-c are selectable. For example, a user may touch the first notification 414a causing the first notification 414a to begin playback.
  • the EVCS displays the first notification 414a that comprises a tutorial on how to charge an electric vehicle using the connector 408.
  • the first notification 414a relates to a determined user characteristic.
  • the EVCS 402 may display the first notification 414a after determining that the user is a new user (user characteristic).
  • the EVCS 402 outputs audio related to the first notification 414a through a speaker 412 of the EVCS 402.
  • the second notification 414b displays an advertisement.
  • the second notification 414b relates to the same user characteristic used to generate the first notification 414a.
  • the second notification 414b may comprise an advertisement for car insurance for new electric vehicle owners based on the EVCS determining that the user is a new user (user characteristic).
  • the EVCS 402 displays the first notification 414a and the second notification 414b based on determining first and second user characteristics.
  • the first user characteristic may indicate the user is a new user
  • the second characteristic may indicate that the user is a dog owner.
  • the third notification 414c displays a special event within a distance of the EVCS 402. In some embodiments, the third notification 414c relates to the same user characteristic used to generate the first notification 414a. In some embodiments, the EVCS 402 displays the first notification 414a and the third notification 414c based on determining first and second user characteristics. In some embodiments, the third notification 414c comprises directions to the special event.
  • the EVCS 402 displays the notifications 414a-c after an event. For example, after a threshold time period after a connector event, the EVCS 402 may display the first notification 414a if a charging event is not detected. In another example, the EVCS 402 may display the second notification 414b after a charging event is detected. In another example, the EVCS 402 displays the third notification 414c after detecting that a gaze of the user indicates that the user is looking at the display 410.
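The event-driven display choices above can be sketched as a small selection function; the notification identifiers, timings, and priority order here are illustrative assumptions.

```python
def select_notification(events: dict[str, float | None], now: float,
                        threshold_s: float = 20.0) -> str | None:
    """Pick which notification (414a-c) to show based on detected events."""
    connector_t = events.get("connector_event")
    charging_t = events.get("charging_event")
    gaze_t = events.get("gaze_at_display")

    # Tutorial if the connector was removed but charging never started in time.
    if connector_t is not None and charging_t is None and now - connector_t > threshold_s:
        return "notification_414a_tutorial"
    # Advertisement once a charging event has been detected.
    if charging_t is not None:
        return "notification_414b_advertisement"
    # Special-event notice when the user's gaze is on the display.
    if gaze_t is not None:
        return "notification_414c_special_event"
    return None


print(select_notification(
    {"connector_event": 0.0, "charging_event": None, "gaze_at_display": None},
    now=30.0,
))  # notification_414a_tutorial
```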
  • FIG. 4B shows an illustrative diagram of a user device 416 generating notifications 418a-c, in accordance with some embodiments of the disclosure.
  • a user device 416 may be any device or devices capable of displaying notifications 418a-c such as televisions, laptops, tablets, smartphones, and/or similar such devices.
  • the user device 416 receives a user characteristic from an EVCS (e.g., EVCS 402) and/or from a server. In some embodiments, the user device receives the user characteristic from the user. In some embodiments, the user characteristic is stored on the user device 416. In some embodiments, after the user device 416 receives a user characteristic, the user device 416 accesses a database comprising a plurality of entries wherein user characteristics are mapped to types of services. In some embodiments, the user device 416 determines that a first entry, corresponding to a determined user characteristic (e.g., new user) indicates a first service (e.g., displaying a first notification 418a comprising a tutorial).
  • the user device 416 displays the notifications 418a-c after an event. For example, after a threshold time period after a connector event, the user device 416 may display a first notification 418a if a charging event is not detected. In another example, the user device 416 may display a second notification 418b after a charging event is detected. In another example, the user device 416 displays a third notification 418c after an EVCS detects that a gaze of the user indicates that the user is looking at the user device 416. In some embodiments, the notifications 418a-c are selectable. For example, a user may touch the first notification 418a and the first notification 418a begins playback.
  • the user device 416 displays the first notification 418a that comprises a tutorial on how to charge an electric vehicle.
  • the first notification 418a relates to a determined user characteristic.
  • the user device 416 may display the first notification 418a after determining that the user is a new user (user characteristic).
  • the second notification 418b displays an advertisement.
  • the second notification 418b relates to the same user characteristic used to generate the first notification 418a.
  • the second notification 418b may comprise an advertisement for car insurance for new electric vehicle owners based on the user device 416 determining that the user is a new user (user characteristic).
  • the user device 416 displays the first notification 418a and the second notification 418b based on determining first and second user characteristics.
  • the first user characteristic may indicate the user is a new user
  • the second characteristic may indicate that the user is a dog owner.
  • the third notification 418c displays a special event within a distance of the user device 416 and/or EVCS 402. In some embodiments, the third notification 418c relates to the same user characteristic used to generate the first notification 418a. In some embodiments, the user device 416 displays the first notification 418a and the third notification 418c based on determining first and second user characteristics. In some embodiments, the third notification 418c comprises directions to the special event.
  • FIG. 5 shows an illustrative block diagram of an EVCS system 500, in accordance with some embodiments of the disclosure.
  • EVCS system 500 of FIG. 5 may be the EVCSs depicted and/or described in FIGS. 1, 2A-2B, 3A-3D, and 4A-4B.
  • items shown separately could be combined and some items could be separated.
  • not all shown items must be included in EVCS system 500.
  • EVCS system 500 may comprise additional items.
  • the EVCS system 500 can include processing circuitry 502 that includes one or more processing units (processors or cores), storage 504, one or more network or other communications network interfaces 506, additional peripherals 508, one or more sensors 510, a motor 512 (configured to retract a portion of a charging cable), one or more wireless transmitters and/or receivers 514, and one or more input/output (hereinafter “I/O”) paths 516.
  • I/O paths 516 may use communication buses for interconnecting the described components.
  • I/O paths 516 can include circuitry (sometimes called a chipset) that interconnects and controls communications between system components.
  • EVCS system 500 may receive content and data via I/O paths 516.
  • the I/O path 516 may provide data to control circuitry 518, which includes processing circuitry 502 and a storage 504.
  • the control circuitry 518 may be used to send and receive commands, requests, and other suitable data using the I/O path 516.
  • the I/O path 516 may connect the control circuitry 518 (and specifically the processing circuitry 502) to one or more communications paths. I/O functions may be provided by one or more of these communications paths but are shown as a single path in FIG. 5 to avoid overcomplicating the drawing.
  • the control circuitry 518 may be based on any suitable processing circuitry such as the processing circuitry 502.
  • processing circuitry should be understood to mean circuitry based on one or more microprocessors, microcontrollers, digital signal processors, programmable logic devices, field-programmable gate arrays (FPGAs), application-specific integrated circuits (ASICs), etc., and may include a multi-core processor (e.g., dual-core, quad-core, hexa-core, or any suitable number of cores) or supercomputer.
  • processing circuitry may be distributed across multiple separate processors or processing units, for example, multiple of the same type of processing units (e.g., two Intel Core i7 processors) or multiple different processors (e.g., an Intel Core i5 processor and an Intel Core i7 processor).
  • the functionality for determining customized services based on a characteristic of a user can be at least partially implemented using the control circuitry 518.
  • this functionality may be implemented in or supported by any suitable software, hardware, or combination thereof.
  • this functionality can be implemented on user equipment, on remote servers, or across both.
  • the control circuitry 518 may include communications circuitry suitable for communicating with one or more servers.
  • the instructions for carrying out the above-mentioned functionality may be stored on the one or more servers.
  • Communications circuitry may include a cable modem, an integrated service digital network (ISDN) modem, a digital subscriber line (DSL) modem, a telephone modem, Ethernet card, or a wireless modem for communications with other equipment, or any other suitable communications circuitry.
  • ISDN integrated service digital network
  • DSL digital subscriber line
  • Such communications may involve the Internet or any other suitable communications networks or paths.
  • communications circuitry may include circuitry that enables peer-to-peer communication of user equipment devices, or communication of user equipment devices in locations remote from each other (described in more detail below).
  • Memory may be an electronic storage device provided as the storage 504 that is part of the control circuitry 518.
  • “storage device” or “memory device” should be understood to mean any device for storing electronic data, computer software, or firmware, such as random-access memory, read-only memory, high-speed random-access memory (e.g., DRAM, SRAM, DDR RAM, or other random-access solid-state memory devices), non-volatile memory, one or more magnetic disk storage devices, optical disk storage devices, flash memory devices, other non-volatile solid-state storage devices, quantum storage devices, and/or any combination of the same.
  • the storage 504 includes one or more storage devices remotely located, such as a database of a server system that is in communication with EVCS system 500.
  • the storage 504, or alternatively the non-volatile memory devices within the storage 504, includes a non-transitory computer-readable storage medium.
  • storage 504 or the computer-readable storage medium of the storage 504 stores an operating system, which includes procedures for handling various basic system services and for performing hardware dependent tasks.
  • storage 504 or the computer-readable storage medium of the storage 504 stores a communications module, which is used for connecting EVCS system 500 to other computers and devices via the one or more communication network interfaces 506 (wired or wireless), such as the internet, other wide area networks, local area networks, metropolitan area networks, and so on.
  • storage 504 or the computer-readable storage medium of the storage 504 stores a media item module for selecting and/or displaying media items on the display(s) 520 to be viewed by passersby and users of EVCS system 500.
  • storage 504 or the computer-readable storage medium of the storage 504 stores an EVCS module for charging an electric vehicle (e.g., measuring how much charge has been delivered to an electric vehicle, commencing charging, ceasing charging, etc.), including a motor control module that includes one or more instructions for energizing or forgoing energizing the motor.
  • executable modules, applications, or sets of procedures may be stored in one or more of the previously mentioned memory devices and correspond to a set of instructions for performing a function described above.
  • the storage 504 stores a subset of the modules and data structures identified above.
  • the storage 504 may store additional modules or data structures not described above.
  • EVCS system 500 comprises additional peripherals 508 such as displays 520 for displaying content, and charging cable 522.
  • the displays 520 may be touch-sensitive displays that are configured to detect various swipe gestures (e.g., continuous gestures in vertical and/or horizontal directions) and/or other gestures (e.g., a single or double tap) or to detect user input via a soft keyboard that is displayed when keyboard entry is needed.
  • EVCS system 500 comprises one or more sensors 510 such as cameras (e.g., camera 116, described above).
  • the one or more sensors 510 are for detecting whether external objects are within a region proximal to EVCS system 500, such as living and nonliving objects, and/or the status of EVCS system 500 (e.g., available, occupied, etc.) in order to perform an operation, such as determining a vehicle characteristic, user information, region status, etc.
  • the user may utter instructions to the control circuitry 518 which are received by a sensor 510 (e.g., microphone).
  • the sensor 510 may be any microphone (or microphones) capable of detecting human speech.
  • the microphone is connected to the processing circuitry 502 to transmit detected voice commands and other speech thereto for processing.
  • voice assistants (e.g., Siri, Alexa, Google Home, and similar such voice assistants) receive and process the voice commands and other speech.
  • FIG. 6 shows an illustrative block diagram of a user equipment device system, in accordance with some embodiments of the disclosure.
  • items shown separately could be combined and some items could be separated. In some embodiments, not all shown items must be included in device 600.
  • device 600 may comprise additional items.
  • the user equipment device 600 is the same user equipment device displayed in FIG. 1. The user equipment device 600 may receive content and data via I/O path 602.
  • the I/O path 602 may provide audio content (e.g., broadcast programming, on-demand programming, Internet content, content available over a local area network (LAN) or wide area network (WAN), and/or other content) and data to control circuitry 604, which includes processing circuitry 606 and a storage 608.
  • the control circuitry 604 may be used to send and receive commands, requests, and other suitable data using the I/O path 602.
  • the I/O path 602 may connect the control circuitry 604 (and specifically the processing circuitry 606) to one or more communications paths. I/O functions may be provided by one or more of these communications paths but are shown as a single path in FIG. 6 to avoid overcomplicating the drawing.
  • the control circuitry 604 may be based on any suitable processing circuitry such as the processing circuitry 606.
  • processing circuitry should be understood to mean circuitry based on one or more microprocessors, microcontrollers, digital signal processors, programmable logic devices, FPGAs, ASICs, etc., and may include a multi-core processor (e.g., dual-core, quad-core, hexa-core, or any suitable number of cores) or supercomputer.
  • processing circuitry may be distributed across multiple separate processors or processing units, for example, multiple of the same type of processing units (e.g., two Intel Core i7 processors) or multiple different processors (e.g., an Intel Core i5 processor and an Intel Core i7 processor).
  • control circuitry 604 may include communications circuitry suitable for communicating with one or more servers that may at least implement the described allocation of services functionality.
  • the instructions for carrying out the above-mentioned functionality may be stored on the one or more servers.
  • Communications circuitry may include a cable modem, an ISDN modem, a DSL modem, a telephone modem, Ethernet card, or a wireless modem for communications with other equipment, or any other suitable communications circuitry. Such communications may involve the Internet or any other suitable communications networks or paths.
  • communications circuitry may include circuitry that enables peer-to-peer communication of user equipment devices, or communication of user equipment devices in locations remote from each other (described in more detail below).
  • Memory may be an electronic storage device provided as the storage 608 that is part of the control circuitry 604.
  • Storage 608 may include random-access memory, read-only memory, hard drives, optical drives, digital video disc (DVD) recorders, compact disc (CD) recorders, BLU-RAY disc (BD) recorders, BLU-RAY 3D disc recorders, digital video recorders (DVR, sometimes called a personal video recorder, or PVR), solid-state devices, quantum storage devices, gaming consoles, gaming media, or any other suitable fixed or removable storage devices, and/or any combination of the same.
  • the storage 608 may be used to store various types of content described herein.
  • Nonvolatile memory may also be used (e.g., to launch a boot-up routine and other instructions).
  • Cloud-based storage may be used to supplement the storage 608 or instead of the storage 608.
  • the control circuitry 604 may include audio generating circuitry and tuning circuitry, such as one or more analog tuners, audio generation circuitry, filters or any other suitable tuning or audio circuits or combinations of such circuits.
  • the control circuitry 604 may also include scaler circuitry for upconverting and downconverting content into the preferred output format of the user equipment device 600.
  • the control circuitry 604 may also include digital-to-analog converter circuitry and analog-to-digital converter circuitry for converting between digital and analog signals.
  • the tuning and encoding circuitry may be used by the user equipment device 600 to receive and to display, to play, or to record content.
  • the circuitry described herein including, for example, the tuning, audio generating, encoding, decoding, encrypting, decrypting, scaler, and analog/digital circuitry, may be implemented using software running on one or more general purpose or specialized processors. If the storage 608 is provided as a separate device from the user equipment device 600, the tuning and encoding circuitry (including multiple tuners) may be associated with the storage 608.
  • the user may utter instructions to the control circuitry 604 which are received by the microphone 616.
  • the microphone 616 may be any microphone (or microphones) capable of detecting human speech.
  • the microphone 616 is connected to the processing circuitry 606 to transmit detected voice commands and other speech thereto for processing.
  • voice assistants (e.g., Siri, Alexa, Google Home, and similar such voice assistants) receive and process the voice commands and other speech.
  • the user equipment device 600 may optionally include an interface 610.
  • the interface 610 may be any suitable user interface, such as a remote control, mouse, trackball, keypad, keyboard, touch screen, touchpad, stylus input, joystick, or other user input interfaces.
  • a display 612 may be provided as a stand-alone device or integrated with other elements of the user equipment device 600.
  • the display 612 may be a touchscreen or touch-sensitive display.
  • the interface 610 may be integrated with or combined with the microphone 616.
  • when the interface 610 is configured with a screen, such a screen may be one or more of a monitor, a television, a liquid crystal display (LCD) for a mobile device, active matrix display, cathode ray tube display, light-emitting diode display, organic light-emitting diode display, quantum dot display, or any other suitable equipment for displaying visual images.
  • the interface 610 may be HDTV-capable.
  • the display 612 may be a 3D display.
  • the speaker (or speakers) 614 may be provided as integrated with other elements of user equipment device 600 or may be a stand-alone unit. In some embodiments, audio from content presented on the display 612 may be outputted through the speaker 614.
  • FIG. 7 shows an illustrative block diagram of a server system 700, in accordance with some embodiments of the disclosure.
  • Server system 700 may include one or more computer systems (e.g., computing devices), such as a desktop computer, a laptop computer, and a tablet computer.
  • the server system 700 is a data server that hosts one or more databases (e.g., databases of images or videos), models, or modules or may provide various executable applications or modules.
  • server system 700 may comprise additional items.
  • the server system 700 can include processing circuitry 702 that includes one or more processing units (processors or cores), storage 704, one or more network or other communications network interfaces 706, and one or more I/O paths 708.
  • I/O paths 708 may use communication buses for interconnecting the described components.
  • I/O paths 708 can include circuitry (sometimes called a chipset) that interconnects and controls communications between system components.
  • Server system 700 may receive content and data via I/O paths 708.
  • the I/O path 708 may provide data to control circuitry 710, which includes processing circuitry 702 and a storage 704.
  • the control circuitry 710 may be used to send and receive commands, requests, and other suitable data using the I/O path 708.
  • the I/O path 708 may connect the control circuitry 710 (and specifically the processing circuitry 702) to one or more communications paths. I/O functions may be provided by one or more of these communications paths but are shown as a single path in FIG. 7 to avoid overcomplicating the drawing.
  • the control circuitry 710 may be based on any suitable processing circuitry such as the processing circuitry 702.
  • processing circuitry should be understood to mean circuitry based on one or more microprocessors, microcontrollers, digital signal processors, programmable logic devices, FPGAs, ASICs, etc., and may include a multi-core processor (e.g., dual-core, quad-core, hexa-core, or any suitable number of cores) or supercomputer.
  • processing circuitry may be distributed across multiple separate processors or processing units, for example, multiple of the same type of processing units (e.g., two Intel Core i7 processors) or multiple different processors (e.g., an Intel Core i5 processor and an Intel Core i7 processor).
  • Memory may be an electronic storage device provided as the storage 704 that is part of the control circuitry 710.
  • Storage 704 may include random-access memory, read-only memory, high-speed random-access memory (e.g., DRAM, SRAM, DDR RAM, or other random-access solid-state memory devices), non-volatile memory, one or more magnetic disk storage devices, optical disk storage devices, flash memory devices, other non-volatile solid-state storage devices, quantum storage devices, and/or any combination of the same.
  • storage 704 or the computer-readable storage medium of the storage 704 stores an operating system, which includes procedures for handling various basic system services and for performing hardware dependent tasks.
  • storage 704 or the computer-readable storage medium of the storage 704 stores a communications module, which is used for connecting the server system 700 to other computers and devices via the one or more communication network interfaces 706 (wired or wireless), such as the internet, other wide area networks, local area networks, metropolitan area networks, and so on.
  • storage 704 or the computer-readable storage medium of the storage 704 stores a web browser (or other application capable of displaying web pages), which enables a user to communicate over a network with remote computers or devices.
  • storage 704 or the computer-readable storage medium of the storage 704 stores a database for storing information on electric vehicle charging stations, their locations, media items displayed at respective electric vehicle charging stations, a number of each type of impression count associated with respective electric vehicle charging stations, user profiles, and so forth.
  • executable modules, applications, or sets of procedures may be stored in one or more of the previously mentioned memory devices and correspond to a set of instructions for performing a function described above.
  • the storage 704 stores a subset of the modules and data structures identified above.
  • the storage 704 may store additional modules or data structures not described above.
  • FIG. 8 is an illustrative flowchart of a process 800 for determining customized services based on a characteristic of a user, in accordance with some embodiments of the disclosure.
  • Process 800 may be performed by physical or virtual control circuitry, such as control circuitry 518 of EVCS system 500 (FIG. 5).
  • some steps of process 800 may be performed by one of several devices.
  • one or more of the steps of process 800 may be performed by a server (FIG. 7).
  • control circuitry detects an electric vehicle.
  • the control circuitry detects an electric vehicle based on information received from one or more sensors mounted to an EVCS.
  • the sensors may be image (e.g., optical) sensors (e.g., one or more cameras), microphones, ultrasound sensors, depth sensors, IR cameras, RGB cameras, PIR cameras, thermal IR, proximity sensors, radar, tension sensors, NFC sensors, and/or any combination thereof.
  • control circuitry determines an electric vehicle is in a location (e.g., parking space) based on information received from the one or more sensors.
  • a camera may transmit images of the location to the control circuitry and the control circuitry can determine (e.g., via image recognition, machine learning, etc.) that an electric vehicle is in the location.
  • the one or more sensors are calibrated to detect vehicles.
  • a first sensor may be calibrated to detect objects larger than a size threshold.
  • a sensor may be calibrated to detect objects that result in a change of inductance corresponding to the presence of a vehicle.
  • the control circuitry detects the electric vehicle when the electric vehicle enters a location.
  • the location comprises the parking space of the EVCS.
  • the location comprises an area larger than the parking space.
  • the location may comprise an area in front of the parking space that an electric vehicle may use to back into the parking space.
  • control circuitry receives a plurality of images.
  • the same sensor that detects the electric vehicle in step 802 captures the plurality of images.
  • a camera (e.g., camera 116) captures the plurality of images.
  • the camera captures the plurality of images in response to detecting an electric vehicle in the area proximal to the EVCS.
  • the camera captures the plurality of images in response to detecting an electric vehicle in the parking space corresponding to the EVCS.
  • the camera captures the plurality of images in response to an event (e.g., vehicle event, user event, etc.).
  • control circuitry detects a connector event using the plurality of images.
  • the control circuitry uses a single image to detect the connector event. For example, when an image comprises a connector, the control circuitry may determine that the connector event occurred at the time the image was captured. In some embodiments, the control circuitry determines that the connector event occurred whenever the connector position moves. For example, a first image may display the connector in a first position and a second image may display the connector in a second position. In response to the connector changing from the first position to the second position, the control circuitry may determine that the connector event occurred. In some embodiments, the control circuitry determines that the connector event occurred whenever the connector moves a threshold distance from a first position.
  • control circuitry determines whether a charging event has occurred. In some embodiments, the control circuitry determines that a charging event has occurred when a connector is connected to the electric vehicle and/or an EVCS is supplying electric current to the electric vehicle. For example, the control circuitry may determine that a charging event has occurred if a first image of the plurality of images shows the connector connected to the electric vehicle.
  • control circuitry may determine that the charging event has occurred if the control circuitry determines that current is flowing from the EVCS to the electric vehicle. In some embodiments, the control circuitry uses sensors mounted to the connector to determine if the connector is connected to or disconnected from the electric vehicle. If the control circuitry determines that the charging event has occurred, the process 800 ends at step 810. If the control circuitry determines that the charging event has not occurred, the process 800 continues to step 812.
  • control circuitry determines whether a threshold time has passed since the connector event.
  • the threshold time corresponds to the average length of time required for a user to connect the connector to their electric vehicle. In some embodiments, the threshold time is longer than the average length of time required for a user to connect the connector to their electric vehicle. If the control circuitry determines that the threshold time has not passed since the connector event, the process 800 continues to step 808, where the control circuitry again determines whether a charging event has occurred. If the control circuitry determines that the threshold time has passed since the connector event, the process 800 continues to step 814.
  • control circuitry transmits a charge message.
  • the charge message comprises instructions on how to operate a connector.
  • the charge message comprises a request for assistance.
  • the control circuitry may transmit the charge message to a plurality of user devices corresponding to nearby electric vehicle users indicating that the user corresponding to the electric vehicle detected in step 802 requires assistance.
  • the charge message is displayed on different devices (e.g., the EVCS, a user device associated with the user of the electric vehicle, user devices associated with other users, etc.).
  • how the charge message is displayed, where the charge message is transmitted, and/or the contents of the charge message are dependent on a characteristic related to the user of the electric vehicle.
  • the charge message may indicate that the user is a new user and requires assistance.
  • the charge message may be transmitted to a user device associated with a location of an employee designated to help disabled users.
  • the charge message indicates a benefit (e.g., compensation, credit, etc.) for assisting the user.
  • the control circuitry also activates a light coupled to the EVCS, based on determining that the threshold time has passed since the connector event.
  • FIG. 9 is another illustrative flowchart of a process 900 for determining customized services based on a characteristic of a user, in accordance with some embodiments of the disclosure.
  • Process 900 may be performed by physical or virtual control circuitry, such as control circuitry 518 of EVCS system 500 (FIG. 5).
  • some steps of process 900 may be performed by one of several devices.
  • one or more of the steps of process 900 may be performed by a server (FIG. 7).
  • control circuitry detects an electric vehicle.
  • the control circuitry uses the same or similar methodologies described in step 802 above.
  • control circuitry receives a plurality of images of an area proximal to the electric vehicle and/or the EVCS.
  • the control circuitry uses the same or similar methodologies described in step 804 above.
  • control circuitry determines a user characteristic of a user associated with the electric vehicle. In some embodiments, the control circuitry determines the user characteristic based on information captured by one or more sensors. In some embodiments, the one or more sensors are the same sensors used to detect the electric vehicle in step 902. In some embodiments, the one or more sensors are different from the sensors used to detect the electric vehicle in step 902. In some embodiments, the control circuitry receives the user characteristic in conjunction with receiving a request to charge the electric vehicle. In some embodiments, the control circuitry accesses a profile associated with the electric vehicle from a database and/or server, wherein the profile comprises one or more user characteristics.
  • control circuitry accesses the user characteristic from a database, the user, and/or a third-party provider. In some embodiments, the control circuitry determines the user characteristic after a sensor receives a request to charge an electric vehicle. For example, the user may have to present some credentials (e.g., password, PIN, biometrics, device, item, etc.) to request the control circuitry to charge the electric vehicle.
  • the control circuitry determines the user characteristic using an electric vehicle characteristic. For example, once the control circuitry determines an electric vehicle characteristic, the control circuitry can determine a user characteristic of a user associated with the electric vehicle. In some embodiments, the control circuitry uses information collected from the one or more sensors during step 902 to determine one or more electric vehicle characteristics (e.g., model, make, color, license plate number, VIN number, charging status, tire pressure, specifications, condition, etc.) of the vehicle. In some embodiments, the control circuitry uses a machine learning algorithm to process information collected by the sensors to determine an electric vehicle characteristic.
  • control circuitry determines whether the user characteristic corresponds to a service.
  • the control circuitry accesses a database comprising a plurality of entries indicating whether one or more user characteristics are associated with a service. For example, the control circuitry can determine that a user with a first user characteristic (e.g., disabled user) requires a first service and a second user with a second characteristic (e.g., not disabled) does not require a service.
  • the control circuitry determines that the user characteristic does not correspond to a service if the user characteristic is not included in the database. For example, the control circuitry may determine the user’s age (user characteristic) and may also determine that no entry in the database corresponds to the user’s age.
  • the control circuitry determines a service type. For example, a first user with a first user characteristic (e.g., the user is disabled) may require a first service type (e.g., assistance connecting the first user’s electric vehicle to an EVCS) and a second user with a second user characteristic (e.g., new user) may require a second service type (e.g., tutorial displayed on the display of the EVCS).
  • control circuitry transmits a notification related to the user characteristic.
  • the notification includes a charging tutorial.
  • the notification indicates the user characteristic determined in step 906.
  • the notification may indicate that a nearby store offers kids’ supplies to a user who is a parent (user characteristic).
  • the control circuitry transmits the notification to one or more user devices based on the user characteristic. For example, if the user characteristic indicates a disabled user, the notification may be transmitted to a plurality of user devices corresponding to nearby electric vehicle users.
  • the control circuitry also activates a light mounted to an EVCS, based on the user characteristic identified in step 906.
  • the notification indicates a benefit (e.g., compensation, credit, etc.) for assisting the user.
  • FIGS. 8-9 may be used with other suitable embodiments of this disclosure.
  • some suitable steps and descriptions described in relation to FIGS. 8-9 may be implemented in alternative orders or in parallel to further the purposes of this disclosure.
  • some suitable steps may be performed in any order or in parallel or substantially simultaneously to reduce lag or increase the speed of the system or method. Some suitable steps may also be skipped or omitted from the process.
  • some suitable devices or equipment discussed in relation to FIGS. 1-7 could be used to perform one or more of the steps in FIGS. 8-9.
  • An electric vehicle charging station comprising: a housing unit; a connector coupled to the housing unit; a camera mounted to the housing unit; and a control circuitry located inside the housing unit, the control circuitry configured to: detect an electric vehicle using a first plurality of images captured by the camera; detect a connector event, wherein the connector event is associated with the connector moving from a position; determine a user characteristic of a user of the electric vehicle using a second plurality of images captured by the camera; and in response to determining the user characteristic, transmit a notification related to the user characteristic.
  • control circuitry is further configured to charge the electric vehicle.

Landscapes

  • Engineering & Computer Science (AREA)
  • Power Engineering (AREA)
  • Transportation (AREA)
  • Mechanical Engineering (AREA)
  • Charge And Discharge Circuits For Batteries Or The Like (AREA)

Abstract

Systems and methods are provided herein for an electric vehicle charging station (EVCS) customizing services based on a characteristic of a user of an electric vehicle. This may be accomplished by an EVCS using one or more sensors to detect a connector event. For example, the EVCS may use one or more images to determine that a connector, coupled to the EVCS, moved from a first position to a second position. The EVCS may then determine that a threshold time has elapsed after the connector event without a detection of a charging event. The EVCS may use this determination to determine a characteristic (e.g., the user does not know how to charge their electric vehicle) and then take one or more actions (e.g., transmitting a notification, wherein the notification comprises instructions for charging the electric vehicle with the connector) based on the determined characteristic.

Description

CUSTOMIZING ELECTRIC VEHICLE CHARGING STATION SERVICE BASED ON SENTIMENT ANALYSIS
Background
[0001] The present disclosure relates to computer-implemented techniques for charging electric vehicles, and in particular to techniques for customizing services provided to electric vehicles.
Summary
[0002] As more consumers transition to electric vehicles, there is an increasing demand for electric vehicle charging stations (EVCSs). These EVCSs usually supply electric energy, either using cables or wirelessly, to the batteries of electric vehicles. For example, a user can connect their electric vehicle via cables of an EVCS, and the EVCS supplies electrical current to the user’s electric vehicle. The cables and control systems of the EVCSs can be housed in kiosks in locations to allow a driver of an electric vehicle to park the electric vehicle close to the EVCS and begin the charging process. These kiosks may be placed in areas of convenience, such as in parking lots at shopping centers, in front of commercial buildings, or in other public places. These kiosks often comprise a display that can be used to provide media items to the user to enhance the user’s charging experience. Consequently, passers-by, in addition to users of the EVCS, may notice media items displayed by the EVCS. In some instances, users (e.g., new users) require assistance when utilizing an EVCS. Traditional EVCSs provide little to no assistance to users that require assistance when charging their electric vehicle. EVCSs often provide the same services (e.g., user experience, charging rate, charging cost, etc.) to each electric vehicle that is connected to the EVCSs without considering the unique individual operating the electric vehicle, resulting in suboptimal user experience. For example, a first user that makes multiple unsuccessful attempts to connect their electric vehicle to the EVCS may require additional instructions (e.g., shown on the EVCS display), while a second user that quickly connects their electric vehicle to the EVCS may be more interested in special events nearby.
[0003] Various systems and methods described herein address these problems by providing EVCSs that determine customized services based on a characteristic of the user. To provide customized services based on a characteristic of the user, an EVCS can utilize a camera mounted to the EVCS. The camera may be housed in the upper portion (e.g., between 100 centimeters and 150 centimeters above the ground) of the EVCS to capture images of the faces of the users. The camera may be configured to capture one or more images of an area proximal to the EVCS. For example, a camera may be configured to obtain a video or capture images of the EVCS and/or an area around the EVCS.
[0004] Control circuitry housed within the EVCS can determine a characteristic of a user based on the one or more images captured by the camera. The EVCS may identify an event (e.g., connector event, plug-in event, plug-out event, charging event, user event) using image recognition, then determine a user characteristic based on the identified event. For example, the camera may capture images of an EVCS connector used to supply electric current to an electric vehicle. A first image may display the connector in a first position (e.g., EVCS holster), and the second image may display the connector in a second position. The EVCS may determine that a connector event has occurred based on the first and second images. A connector event may be any time the EVCS determines that the connector has been moved from the EVCS holster. The EVCS may then determine that a charging event has occurred based on a third image, captured by the camera, displaying that the connector is connected to an electric vehicle. A charging event may be any time the EVCS determines that the connector is connected to an electric vehicle and/or the EVCS is supplying electric current to the electric vehicle. When the EVCS determines that a charging event occurs within a threshold time (e.g., 20 seconds) of a connector event, the EVCS may determine that the user does not require assistance (first user characteristic). The EVCS can determine customized services (e.g., types of displays, selected advertisements, types of notifications, user experiences, charging rates, charging costs, etc.) based on the determined user characteristic. In response to the first user characteristic, in this example, that the user does not require assistance, the EVCS may notify the user of a nearby event via a display on the EVCS. If the EVCS determines that a charging event did not occur within the threshold time of a connector event, the EVCS may determine that the user requires assistance (second user characteristic). In response to determining the second user characteristic, the EVCS may display a video on how to operate the connector via the display on the EVCS.
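For illustration only, the timing logic described in the preceding paragraph can be sketched as a simple polling check. This is a minimal, non-normative example: the helper callables (detect_connector_event, detect_charging_event) and the 20-second default are assumptions drawn from the example above, not the disclosed implementation.

```python
import time

CHARGE_THRESHOLD_SECONDS = 20  # example threshold from the paragraph above


def classify_user_need(detect_connector_event, detect_charging_event,
                       threshold=CHARGE_THRESHOLD_SECONDS, poll_interval=1.0):
    """Return 'no_assistance' if a charging event follows the connector event
    within the threshold, otherwise 'needs_assistance'.

    Both arguments are assumed callables that inspect camera images or
    current flow and return True/False.
    """
    # Wait until the connector leaves its holster (connector event).
    while not detect_connector_event():
        time.sleep(poll_interval)
    started = time.monotonic()

    # Poll for a charging event until the threshold elapses.
    while time.monotonic() - started < threshold:
        if detect_charging_event():
            return "no_assistance"   # first user characteristic
        time.sleep(poll_interval)
    return "needs_assistance"        # second user characteristic
```

In this sketch, a "needs_assistance" result could queue the connector tutorial on the EVCS display, while "no_assistance" could surface a nearby-event notification instead.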
[0005] In another example, the camera may capture images of the face of the user of an electric vehicle, for example, when the user approaches the EVCS (user event). The EVCS may determine the facial expression of the user (e.g., angry, sad, happy, etc.) based on the images captured by the camera. In some embodiments, the EVCS determines the facial expression using detection software (e.g., facial expression recognition software, image recognition software, machine learning, etc.). The EVCS may then determine customized services based on the determined facial expression. For example, the camera may capture an image showing that a user is frowning, and the EVCS may determine that the user is sad (user characteristic) based on the facial expression. In response to the user characteristic, the EVCS may provide a reduced charging rate, coupon for the user, and/or tone change. The EVCS may notify the user of the reduced charging rate and/or coupon by sending a notification to a user device associated with the user.
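As a hedged illustration of this paragraph, the mapping from a recognized facial expression to customized services might look like the sketch below. The ServicePlan fields and expression labels are hypothetical, and the facial-expression recognition step is assumed to run upstream on the camera images.

```python
from dataclasses import dataclass


@dataclass
class ServicePlan:
    charging_rate_multiplier: float = 1.0   # 1.0 = standard charging rate
    send_coupon: bool = False
    ui_tone: str = "neutral"                 # tone change for the EVCS display


def plan_services(expression: str) -> ServicePlan:
    """Map a detected facial expression (user characteristic) to services.
    The expression label is assumed to come from detection software."""
    if expression == "sad":
        # Example from the paragraph: reduced rate, coupon, tone change.
        return ServicePlan(charging_rate_multiplier=0.9,
                           send_coupon=True, ui_tone="cheerful")
    if expression == "angry":
        return ServicePlan(ui_tone="calm")
    return ServicePlan()
```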
[0006] Although a camera is described, any suitable sensor can be used to determine a characteristic of the user. These sensors may be image sensors (e.g., one or more cameras), microphones, ultrasound sensors, depth sensors, infrared (IR) cameras, Red Green Blue (RGB) cameras, passive IR (PIR) cameras, proximity sensors, radar, tension sensors, near field communication (NFC) sensors, and/or any similar such sensor. For example, if a microphone, mounted to the EVCS, detects a user shouting after a connector event, the EVCS may determine that the user is angry (first user characteristic). In response to the first user characteristic, in this example, the user being angry, the EVCS may display a video instructing the user on how to operate the connector via a display on the EVCS. In another example, the microphone may detect a baby crying, and the EVCS may determine that the user is a parent (second user characteristic). In response to the second user characteristic, the EVCS may display an advertisement for a nearby store that sells products for children.
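The microphone examples in this paragraph amount to a small lookup from detected audio events to user characteristics and responses. The sketch below is illustrative only; the audio event detector and the rule table are assumptions, not the disclosed system.

```python
# Hypothetical rules mapping audio events detected by an EVCS microphone to a
# user characteristic and a response, following the examples above.
AUDIO_EVENT_RULES = {
    "shouting_after_connector_event": ("angry_user", "display_connector_tutorial"),
    "baby_crying":                    ("parent",     "display_childrens_store_ad"),
}


def respond_to_audio_event(event_label: str):
    """Return (user_characteristic, action) for a detected audio event,
    or (None, None) when the event is not recognized."""
    return AUDIO_EVENT_RULES.get(event_label, (None, None))
```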
[0007] The EVCS may also determine future customized services based on a characteristic of a user. For example, a camera mounted to the EVCS may capture a first image, and the EVCS may determine that a connector event has occurred based on the first image. The EVCS may then determine that a charging event did not occur within the threshold time of the connector event. The EVCS may determine, for example, that the user had trouble removing the connector from an EVCS holster (user characteristic) based on a second image captured by the camera. For example, the second image may have been taken ten seconds after the connector event and shows that the connector only moved three centimeters from the original position. The EVCS can transmit this user characteristic to a second device, notifying developers of possible deficiencies in the EVCS (e.g., the EVCS needs repair) and/or in the design of the EVCS. In another example, the camera captures an image of the user and the EVCS determines a user characteristic (e.g., demographic of the user) based on the captured image. The EVCS may update a profile associated with the user with the determined user characteristic. In some embodiments, the EVCS determines customized services (e.g., providing advertisements related to the demographic of the user) based on the user characteristic stored in the profile.
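Updating a user profile with a newly determined characteristic, as described above, can be sketched as follows. The in-memory dictionary stands in for the server-side profile database, and the identifiers and field names are hypothetical.

```python
from datetime import datetime, timezone

# Stand-in for the server-side profile database (assumption for illustration).
PROFILE_STORE = {"user-106": {"characteristics": {}}}


def record_characteristic(user_id: str, name: str, value) -> dict:
    """Store an observed user characteristic with a timestamp so later visits
    can customize services (e.g., demographic-related advertisements)."""
    profile = PROFILE_STORE.setdefault(user_id, {"characteristics": {}})
    profile["characteristics"][name] = {
        "value": value,
        "observed_at": datetime.now(timezone.utc).isoformat(),
    }
    return profile
```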
Brief Description of the Drawings
[0008] The below and other objects and advantages of the disclosure will be apparent upon consideration of the following detailed description, taken in conjunction with the accompanying drawings, in which:
[0009] FIG. 1 shows an illustrative diagram of a system for determining customized services based on a characteristic of the user, in accordance with some embodiments of the disclosure;
[0010] FIGS. 2A and 2B show other illustrative diagrams of a system for determining customized services based on a characteristic of the user, in accordance with some embodiments of the disclosure;
[0011] FIGS. 3A-3D show other illustrative diagrams of a system for determining customized services based on a characteristic of the user, in accordance with some embodiments of the disclosure;
[0012] FIGS. 4A and 4B show illustrative diagrams of notifications indicating customized services, in accordance with some embodiments of the disclosure;
[0013] FIG. 5 shows an illustrative block diagram of an EVCS system, in accordance with some embodiments of the disclosure;
[0014] FIG. 6 shows an illustrative block diagram of a user equipment device system, in accordance with some embodiments of the disclosure;
[0015] FIG. 7 shows an illustrative block diagram of a server system, in accordance with some embodiments of the disclosure;
[0016] FIG. 8 is an illustrative flowchart of a process of determining customized services based on a characteristic of the user, in accordance with some embodiments of the disclosure; and
[0017] FIG. 9 is another illustrative flowchart of a process of determining customized services based on a characteristic of the user, in accordance with some embodiments of the disclosure.
Detailed Description
[0018] FIG. 1 shows an illustrative diagram of a system for determining customized services based on a characteristic of a user 106 of an electric vehicle 104, in accordance with some embodiments of the disclosure. In some embodiments, an EVCS 102 provides an electric charge to the electric vehicle 104 via a charging cable 122 or a wireless connection (e.g., wireless charging). The EVCS 102 may be in communication with the electric vehicle 104 and/or a user device 108 belonging to a user 106 (e.g., a driver, passenger, owner, renter, or other operator of the electric vehicle 104) that is associated with the electric vehicle 104. In some embodiments, the EVCS 102 communicates with one or more devices or computer systems, such as user device 108 or server 110, respectively, via a network 112. Although some steps or methods may be described as being executed by the EVCS 102, user device 108, and/or server 110, said steps and methods may also be performed by any combination of the devices.
[0019] There can be more than one EVCS 102, electric vehicle 104, user 106, user device 108, server 110, and network 112, but only one of each is shown in FIG. 1 to avoid overcomplicating the drawing. In addition, a user 106 may utilize more than one type of user device 108 and more than one of each type of user device 108. In some embodiments, there may be paths 114a-d between user devices, EVCSs, servers, and/or electric vehicles, so that the items may communicate directly with each other via communication paths, as well as other short-range point-to-point communication paths, such as USB cables, IEEE 1394 cables, wireless paths (e.g., Bluetooth, infrared, IEEE 802.11x, etc.), or other short-range communication via wired or wireless paths. In some embodiments, the devices may also communicate with each other through an indirect path via a communications network. The communications network may be one or more networks including the Internet, a mobile phone network, mobile voice or data network (e.g., a 4G, 5G, or LTE network), cable network, public switched telephone network, or other types of communications network or combinations of communications networks. In some embodiments, a communication network path comprises one or more communications paths, such as a satellite path, a fiberoptic path, a cable path, a path that supports Internet communications (e.g., IPTV), free-space connections (e.g., for broadcast or other wireless signals), or any other suitable wired or wireless communications path or combination of such paths. In some embodiments, a communication network path can be a wireless path. Communications with the devices may be provided by one or more communication paths but are shown as single paths in FIG. 1 to avoid overcomplicating the drawing.
[0020] In some embodiments, the EVCS 102 determines a characteristic of the user 106 based on information detected from one or more sensors. The one or more sensors may comprise image sensors (e.g., camera 116), microphones, ultrasound sensors, depth sensors, IR cameras, RGB cameras, PIR cameras, proximity sensors, radar, tension sensors, NFC sensors, and/or any combination thereof. For example, one or more images may be captured by a camera 116 mounted to the EVCS 102. In some embodiments, the camera 116 is housed in the upper portion (e.g., between 100 centimeters and 150 centimeters above the ground) of the EVCS 102. In some embodiments, the camera 116 is configured to capture one or more images of an area proximal to the EVCS 102. For example, a camera 116 may be configured to obtain a video or capture images of the EVCS 102 and/or an area around a parking space 120 corresponding to the EVCS 102.
[0021] In some embodiments, the EVCS 102 identifies a user characteristic based on a characteristic of the electric vehicle 104. In some embodiments, the EVCS 102 determines a characteristic of the electric vehicle 104 based on information received from one or more sensors (e.g., camera 116). In some embodiments, after the camera 116 captures information about the electric vehicle 104, the EVCS 102 determines an electric vehicle characteristic (e.g., model, make, license plate, VIN number, tire pressure, specifications, condition, etc.) based on the captured information. In some embodiments, the EVCS 102 accesses a database linking characteristics of electric vehicles to users of the electric vehicles. In some embodiments, the EVCS 102 determines characteristics of the electric vehicle 104 using ISO 15118 when the electric vehicle 104 is connected to the EVCS 102. In some embodiments, the EVCS 102 receives a media access control (MAC) address from the electric vehicle 104, and the EVCS 102 uses the MAC address to determine vehicle characteristics of the electric vehicle 104 and/or to determine the user 106 associated with the electric vehicle 104. The EVCS 102 can use a database to match the received MAC address or portions of the received MAC address to entries in the database to determine vehicle characteristics of the electric vehicle 104. For example, certain vehicle manufacturers keep portions of their produced electric vehicle’s MAC addresses consistent. Accordingly, if the EVCS 102 determines that a portion of the MAC address received from the electric vehicle 104 corresponds to an electric vehicle manufacturer, the EVCS 102 can determine vehicle characteristics of the electric vehicle 104. The EVCS 102 can also use a database to match the received MAC address or portions of the received MAC address to entries in the database to determine the user 106 associated with the electric vehicle 104. For example, the electric vehicle’s MAC address may correspond to a user profile corresponding to the user 106 associated with the electric vehicle 104.
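The MAC-address matching described in this paragraph can be sketched as a prefix lookup. The OUI prefix, MAC address, and profile shown are placeholders invented for illustration; real manufacturer assignments and the referenced database are not reproduced here.

```python
# Hypothetical lookup tables; real OUI assignments and profiles would live in
# the database referenced above.
MANUFACTURER_BY_OUI = {
    "AA:BB:CC": "ExampleMotors",
}
USER_PROFILE_BY_MAC = {
    "AA:BB:CC:01:02:03": {"user_id": "user-106", "characteristics": ["new_user"]},
}


def lookup_vehicle(mac: str):
    """Return (manufacturer, user_profile) for a vehicle's MAC address.
    The first three octets (OUI) identify the manufacturer; the full
    address may map to a user profile."""
    mac = mac.upper()
    oui = ":".join(mac.split(":")[:3])
    return MANUFACTURER_BY_OUI.get(oui), USER_PROFILE_BY_MAC.get(mac)
```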
[0022] In some embodiments, the EVCS 102 identifies a user characteristic when the user 106 requests the EVCS 102 to charge the electric vehicle 104. For example, the user 106 may present credentials (e.g., password, PIN, biometrics, device, item, etc.) to request the EVCS 102 to charge the electric vehicle 104. In some embodiments, the user 106 inputs a password using the display 118 of the EVCS 102. In some embodiments, the EVCS 102 receives credentials from the user device 108 when the user device 108 is within a distance (e.g., two meters) from the EVCS 102. In some embodiments, the EVCS 102 identifies a user profile associated with the user 106 based on the received credentials. In some embodiments, the user profile comprises one or more user characteristics associated with the user 106. In some embodiments, the user 106 selects and/or reserves EVCS 102. For example, the user 106 can select the EVCS 102 when the user inputs the EVCS 102 as a destination in a navigation system. In some embodiments, the navigation system is implemented using the user device 108 and/or the electric vehicle 104. In another example, the user 106 may access a web page and reserve the EVCS 102 for a time period. In some embodiments, when the user 106 selects and/or reserves the EVCS 102, the EVCS 102 receives a notification indicating a profile and/or user characteristic associated with the user 106 that made the selection/reservation. In some embodiments, the EVCS 102 determines a user characteristic using the profile.
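One path from this paragraph, presenting a PIN to unlock a stored profile, might be sketched as below. The stored record, the PIN value, and the plain SHA-256 comparison are assumptions for illustration; a deployed system would use salted hashes or token-based authentication and could equally accept credentials from a nearby user device.

```python
import hashlib

# Hypothetical profile record keyed by user identifier.
PROFILES = {
    "user-106": {
        "pin_sha256": hashlib.sha256(b"1234").hexdigest(),  # placeholder PIN
        "characteristics": ["new_user"],
    }
}


def characteristics_for_credentials(user_id: str, pin: str):
    """Return the user characteristics stored in the matching profile,
    or None when the credentials do not match."""
    profile = PROFILES.get(user_id)
    if profile and hashlib.sha256(pin.encode()).hexdigest() == profile["pin_sha256"]:
        return profile["characteristics"]
    return None
```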
[0023] In some embodiments, the EVCS 102 determines customized services based on the facial expression of the user 106. In some embodiments, the camera 116 captures images of the face of the user 106 of the electric vehicle 104, for example, when the user 106 approaches the EVCS 102 (user event). In some embodiments, the EVCS 102 determines the facial expression of the user 106 (e.g., angry, sad, happy, etc.) based on facial expression recognition of the images captured by the camera 116. In some embodiments, the EVCS 102 determines customized services based on the determined facial expression (user characteristic).
[0024] In some embodiments, the EVCS 102 determines customized services (e.g., types of displays, selected advertisements, types of notifications, user experiences, charging rates, charging costs, etc.) based on the determined user characteristic. For example, if the EVCS 102 determines that the user 106 does not require assistance (e.g., first user characteristic), the EVCS 102 may notify the user 106 of a nearby event via a display 118. In another example, if the EVCS 102 determines that the user 106 requires assistance (e.g., second user characteristic), the EVCS 102 displays a video on how to operate the connector 122 via the display 118.
[0025] In some embodiments, the EVCS 102 accesses a database comprising a plurality of entries wherein user characteristics are mapped to types of services. In some embodiments, the EVCS 102 determines that a first entry, corresponding to a determined user characteristic (e.g., new user) indicates a first service (e.g., displaying a tutorial). In some embodiments, the EVCS 102 receives the user characteristic and/or service type from the server 110, user device 108, and/or electric vehicle 104.
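A minimal stand-in for the database of entries described in this paragraph is a mapping from user characteristics to service types; the entries shown are examples only, not an exhaustive or authoritative list.

```python
# Example entries mapping user characteristics to service types (assumed values).
SERVICE_BY_CHARACTERISTIC = {
    "new_user": "display_tutorial",
    "disabled": "request_connection_assistance",
    "no_assistance_needed": "show_nearby_events",
}


def service_for(characteristic: str):
    """Return the mapped service type, or None when the characteristic has no
    entry (e.g., an age value with no corresponding row)."""
    return SERVICE_BY_CHARACTERISTIC.get(characteristic)
```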
[0026] FIGS. 2A and 2B show other illustrative diagrams of a system for determining customized services based on a characteristic of a user 106, in accordance with some embodiments of the disclosure. In some embodiments, FIGS. 2A and 2B use the same or similar methods and devices as those described in FIG. 1.
[0027] In some embodiments, a camera (e.g., camera 116) captures a plurality of images of an area proximal to an EVCS. In some embodiments, the camera captures the plurality of images in response to detecting an electric vehicle 104 in the area proximal to the EVCS. In some embodiments, the camera captures the plurality of images in response to detecting an electric vehicle 104 in the parking space 120 corresponding to the EVCS. In some embodiments, the camera captures the plurality of images in response to an event (e.g., connector event, charging event, user event).
[0028] In some embodiments, FIG. 2A shows a first image 202 of the plurality of images captured by a camera (e.g., camera 116). In some embodiments, the plurality of images are captured in response to a connector event. In some embodiments, the connector event is detected whenever the connector 122 is detected by the camera. For example, when the camera detects the connector 122 entering the field of view of the camera at a first position 204, the first image 202 may be captured and/or processed. In some embodiments, the connector event is detected whenever the connector 122 position moves. For example, the first image 202 displays the connector 122 in the first position 204, and a second image 206 displays the connector 122 in a second position 208. In response to the connector 122 being detected in two different positions (e.g., first position 204 and second position 208), the EVCS may determine that a connector event has occurred.
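Detecting a connector event from the connector's position in two images, as described above, reduces to a displacement check. In the sketch below, the coordinate extraction is assumed to happen upstream (e.g., by image recognition), and the movement threshold is an illustrative value rather than one taken from the disclosure.

```python
import math

MOVE_THRESHOLD_CM = 5.0  # assumed minimum displacement treated as a connector event


def connector_event_detected(pos_a, pos_b, threshold_cm=MOVE_THRESHOLD_CM) -> bool:
    """Return True when the connector moved at least `threshold_cm` between two
    images, or when it newly enters the camera's field of view.
    Positions are (x, y) tuples in centimeters, or None when not visible."""
    if pos_a is None or pos_b is None:
        # Connector newly detected in the second image counts as an event.
        return pos_a is None and pos_b is not None
    return math.dist(pos_a, pos_b) >= threshold_cm
```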
[0029] In some embodiments, after detection of a connector event, the EVCS determines whether a charging event occurs. A charging event may be any time the EVCS determines that the connector 122 is connected to the electric vehicle 104 and/or the EVCS is supplying electric current to the electric vehicle 104. For example, if the camera captures an image of the connector 122 connected to the electric vehicle 104, the EVCS determines that a charging event has occurred. In another example, if the EVCS determines that current is flowing from the EVCS to the electric vehicle 104, the EVCS determines that a charging event has occurred. When the EVCS determines that a charging event occurs within a threshold time (e.g., 20 seconds) of a connector event, the EVCS may determine, for example, that the user 106 does not require assistance (user characteristic).
[0030] If the EVCS determines that a charging event does not occur within the threshold time of a connector event, the EVCS may determine, for example, that the user requires assistance (second user characteristic). For example, the first image 202 may correspond to the detection of a connector event. After the threshold time period, the camera may capture the second image 206. The EVCS may determine that the connector 122 is located at the second position 208 and is not connected to the electric vehicle 104. In some embodiments, in response to determining that the charging event did not occur within the threshold time of the connector event, the EVCS determines, for example, that the user 106 requires assistance (user characteristic).
[0031] In some embodiments, the EVCS is configured to determine user characteristics and/or events based on similar such user and/or connector positions. For example, if the plurality of images displays the connector 122 within a threshold distance (e.g., 30 centimeters) of the electric vehicle 104, the EVCS may determine that an attempted charging event occurred. In some embodiments, if the attempted charging event is detected after a threshold time period, the EVCS determines that the user 106 requires assistance. In some embodiments, the EVCS uses sensors mounted to the connector 122 and/or holster to determine if the connector 122 has been removed from the holster. In some embodiments, the EVCS uses sensors mounted to the connector 122 to determine if the connector 122 is connected to or disconnected from the electric vehicle 104.
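The attempted-charging-event example in this paragraph is a distance test; the sketch below assumes the connector-to-vehicle distance has already been estimated from the images, and the 30-centimeter value simply repeats the example above.

```python
ATTEMPT_DISTANCE_CM = 30.0  # example threshold from the paragraph above


def attempted_charging_event(connector_to_vehicle_cm: float,
                             threshold_cm: float = ATTEMPT_DISTANCE_CM) -> bool:
    """Return True when the connector is within the threshold distance of the
    electric vehicle, which the EVCS may treat as an attempted charging event."""
    return connector_to_vehicle_cm <= threshold_cm
```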
[0032] In some embodiments, the EVCS determines customized services based on the determined user characteristic. For example, in response to determining that the user 106 does not require assistance (first user characteristic), the EVCS notifies the user 106 of a nearby event via a user device associated with the user. In another example, in response to determining that the user 106 requires assistance (second user characteristic), the EVCS displays a video on how to operate the connector 122 via a display on the EVCS.
[0033] In some embodiments, the EVCS uses one or more items in the first image 202 and second image 206 to determine a characteristic of the user 106. For example, the EVCS determines (e.g., using optical character recognition) the characters of the license plate 212. In some embodiments, the EVCS 102 uses a database to look up user characteristics of the user 106 using the license plate information. For example, the database may comprise public records (e.g., public registration information linking license plates to user profiles), collected information (e.g., entries linking license plates to user characteristics based on data inputted by a user), historic information (entries linking license plates to user characteristics based on the EVCS 102 identifying user characteristics related to one or more license plates in the past), and/or similar such information. In some embodiments, the EVCS determines a user characteristic (e.g., user 106 is disabled) based on determining that the license plate 212 is associated with a person with a disability. In another example, the EVCS may determine (e.g., via image recognition) that an item is located inside the electric vehicle 104. In some embodiments, the item is associated with a user characteristic. For example, the item may be a disabled person placard. In some embodiments, the EVCS determines a user characteristic (e.g., user 106 is disabled) based on determining that the item is located inside the electric vehicle 104.
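The license-plate path in paragraph [0033] above can be sketched as a lookup against recognized plate text. The OCR step is assumed to run upstream, and the record shown is a hypothetical placeholder for the public, collected, or historic record sources named in that paragraph.

```python
# Hypothetical records keyed by normalized plate text; real deployments would
# query the database sources named above.
PLATE_RECORDS = {
    "7ABC123": {"disabled_person_plate": True},
}


def characteristics_from_plate(plate_text: str) -> list:
    """Return user characteristics inferred from a recognized license plate."""
    record = PLATE_RECORDS.get(plate_text.upper().replace(" ", ""))
    if record and record.get("disabled_person_plate"):
        return ["disabled"]
    return []
```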
[0034] In some embodiments, the EVCS 102 customizes services based on the determined user characteristic. For example, if the user characteristic indicates that the user 106 requires assistance, the EVCS 102 can transmit a notification (e.g., via network 112) indicating that the user 106 requires assistance. The notification may be transmitted to one or more devices associated with a nearby business, on-site employees, other electric vehicle owners, nearby assistants, etc. For example, nearby electric vehicle owners may receive a notification from the EVCS indicating that the user 106 at the EVCS requires assistance. The notification may also indicate a benefit (e.g., compensation, credit, etc.) for assisting the user 106 with charging their electric vehicle 104. In another example, if the user characteristic indicates that the user 106 is a new user, the display of the EVCS may display a graphic, turn a color, and/or flash, signaling that the user 106 of the electric vehicle 104 requires assistance. In some embodiments, the EVCS 102 causes a light mounted to the EVCS 102 to turn on, signaling that user 106 requires assistance.
[0035] FIGS. 3A-3D show other illustrative diagrams of a system for determining customized services based on a characteristic of a user 106, in accordance with some embodiments of the disclosure. In some embodiments, FIGS. 3A-3D use the same or similar methods and devices as those described in FIGS. 1, 2A, and 2B.
[0036] In some embodiments, a camera (e.g., camera 116) captures a plurality of images of an area proximal to an EVCS. In some embodiments, the camera captures the plurality of images in response to detecting an electric vehicle 104 in the area proximal to the EVCS. In some embodiments, the camera captures the plurality of images in response to detecting an electric vehicle 104 in the parking space 120 corresponding to the EVCS. In some embodiments, the camera captures the plurality of images in response to an event (e.g., connector event, charging event, user event).
[0037] In some embodiments, FIGS. 3A-3C show images (first image 302, second image 306, and third image 312) of the plurality of images captured by a camera (e.g., camera 116). In some embodiments, the plurality of images are captured in response to a user event. In some embodiments, the user event is detected when a user 106 is detected in the first image 302. In some embodiments, the user event is detected when a user 106 leaves the electric vehicle 104. For example, the EVCS may determine that a user event has occurred if the EVCS determines that the user 106 is a distance (e.g., thirty centimeters) away from the electric vehicle 104. In some embodiments, the user event is detected when a user 106 is within a second distance (e.g., 60 centimeters) from the EVCS. In some embodiments, the user event is detected when the user 106 requests a charge. For example, a user event may be detected when the user 106 interacts with the EVCS. Although only one user is shown, the EVCS may detect more than one user. For example, multiple users may exit the electric vehicle 104.
[0038] In some embodiments, a first image 302 is captured after a user event. In some embodiments, the EVCS uses facial expression recognition to determine the first facial expression 304 of the user 106. In some embodiments, the EVCS uses the first facial expression 304 as the user characteristic. For example, the EVCS may determine that the first facial expression 304 corresponds to a sad facial expression (e.g., user characteristic). In some embodiments, the EVCS determines customized services based on the user characteristic. For example, in response to the first facial expression 304, the EVCS may provide a reduced charging rate, a coupon for the user 106, and/or a tone change. In some embodiments, the EVCS may notify the user 106 of the reduced charging rate and/or coupon by sending a notification to a user device associated with the user 106. In some embodiments, a tone change indicates user interface (UI) changes related to the EVCS and/or application on a user device corresponding to the user. For example, in response to a sad facial expression (user characteristic), the EVCS may display an animation (e.g., confetti) on the EVCS display, change color of the EVCS display, output audio (e.g., happy jingle) using speakers of the EVCS, and/or similar such UI changes. In some embodiments, the EVCS updates a profile associated with the user 106 with the determined user characteristic. For example, the EVCS may update that the user had a first facial expression at a first time. In another example, the EVCS may be able to determine the gender, age, ethnicity, preferences, and/or similar such information (user characteristics) of the user 106 based on the first image 302 and store the determined user characteristic in the profile. In some embodiments, the EVCS may determine user characteristics for more than one user in an image. For example, the EVCS may determine that a first user is an adult (e.g., first user characteristic) and a second user is a child (e.g., second user characteristic). The EVCS may determine customized services based on the multiple user characteristics. For example, the EVCS may display a first advertisement (e.g., vehicle advertisement) associated with the first user characteristic (e.g., adult) and a second advertisement (e.g., toy advertisement) associated with the second user characteristic (e.g., child). In another example, the EVCS may display a single advertisement (e.g., family movie advertisement) based on both user characteristics.

[0039] In some embodiments, the second image 306 is captured after a user event and/or a connector event. In some embodiments, the second image 306 is captured after a threshold time period after the user event and/or connector event. In some embodiments, the EVCS determines the second facial expression 310 of the user 106 using the same or similar methodologies as described herein. In some embodiments, the EVCS determines that the second facial expression 310 corresponds to a frustrated facial expression (e.g., user characteristic). In some embodiments, the EVCS determines customized services based on the user characteristic. For example, in response to determining that the user 106 has a second facial expression 310 corresponding to a frustrated user, the EVCS may display a video on how to operate the connector 122 via the display on the EVCS.
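One way to realize the sentiment-driven customization in paragraphs [0038] and [0039] is a lookup from an expression label to a list of service actions. The sketch below assumes an upstream facial expression classifier has already produced the label; the mapping table and the evcs object with its display, billing, and notification helpers are hypothetical.

# Illustrative mapping from a detected expression to the services named in [0038]-[0039].
SERVICES_BY_EXPRESSION = {
    "sad": ["reduced_rate", "coupon", "ui_tone_change"],
    "frustrated": ["connector_tutorial_video"],
    "happy": [],
}

def apply_services_for_expression(evcs, expression):
    """Dispatch each service mapped to the detected expression (evcs is a hypothetical facade)."""
    for service in SERVICES_BY_EXPRESSION.get(expression, []):
        if service == "reduced_rate":
            evcs.apply_rate_discount(0.10)            # e.g., a discount on the session
        elif service == "coupon":
            evcs.send_user_notification("coupon")     # via the associated user device
        elif service == "ui_tone_change":
            evcs.show_animation("confetti")           # UI tone change on the display
        elif service == "connector_tutorial_video":
            evcs.play_video("connector_tutorial")     # how to operate the connector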
[0040] In some embodiments, the EVCS records additional information. For example, the EVCS may record that the connector 122 was located at a first position 308 when the second facial expression 310 was detected. In another example, the EVCS may record the change in facial expression (e.g., changing from happy to the second facial expression 310). In another example, the EVCS may record the amount of time between events. In another example, the EVCS may record the amount of time between an event and the detection of the second facial expression 310. In some embodiments, the EVCS transmits the additional information to a user device. In some embodiments, the additional information is used to determine that the user 106 had trouble connecting the connector 122 to the electric vehicle 104 based on the second image 306 captured by the camera. For example, the second image 306 may have been taken thirty seconds after the connector event and displays that the connector 122 has not been connected to the electric vehicle and the user is frustrated (e.g., second facial expression 310). In some embodiments, the additional information signals deficiencies in the EVCS that were captured in the second image 306 (e.g., the EVCS needs repair) and/or in the design of the type of EVCS corresponding to the EVCS that were captured in the second image 306. For example, if one or more users take multiple attempts to disconnect the connector 122 from electric vehicles, then this may indicate that the EVCS requires repair. In some embodiments, the plurality of images provide redundancies in case of other system failures. For example, if a sensor on the connector 122 does not detect a plug-in event, then the EVCS may not charge the electric vehicle 104. In cases when the sensor on the connector 122 fails to correctly detect a plug-in event, the EVCS may not charge the electric vehicle 104 despite having the connector 122 connected to the electric vehicle 104. However, if an image (e.g., image 312) displays a plug-in event (shows the connector 122 connected to the electric vehicle 104), then the EVCS may charge the electric vehicle 104 despite the sensor failing to detect the plug-in event.
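The redundancy described in paragraph [0040] can be expressed as a simple OR over two independent plug-in signals. A minimal sketch, assuming the vision side has already classified the latest image as showing the connector seated in the charge port:

def should_enable_charging(sensor_reports_plugged_in, image_shows_connected):
    """Allow charging if either the connector sensor or the camera confirms a plug-in event."""
    return bool(sensor_reports_plugged_in or image_shows_connected)

# Example: the connector sensor failed, but the image shows the connector seated.
print(should_enable_charging(False, True))   # -> True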
[0041] In some embodiments, FIG. 3C shows a third image 312 captured after a user event, connector event, and/or charging event. In some embodiments, the third image 312 is captured after a threshold time period after a user event, connector event, and/or charging event. In some embodiments, the EVCS uses facial expression recognition to determine the gaze 316 of the user 106. For example, the EVCS may determine whether the gaze 316 of the user indicates that the user 106 is looking at the EVCS display, user device, electric vehicle, etc. In some embodiments, the EVCS determines a user characteristic based on the determined gaze 316. For example, the EVCS may determine that the gaze 316 indicates that the user is looking at the EVCS. In response to determining that the gaze 316 indicates that the user is looking at the EVCS, the EVCS may display an advertisement on the display for the user 106.
[0042] In some embodiments, the EVCS records additional information. In some embodiments, the EVCS records the time period when the gaze 316 indicates that the user 106 is looking at the EVCS display, user device, and/or electric vehicle 104. In some embodiments, the EVCS records the number of times that the gaze 316 indicates that the user 106 is looking at the EVCS display, user device, and/or electric vehicle 104.
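The gaze handling in paragraphs [0041] and [0042] is effectively dwell tracking plus a trigger. Below is a small sketch under the assumption that an upstream estimator emits one gaze target per frame (e.g., "evcs_display", "user_device", "vehicle", or "elsewhere"); the timestamps and targets in the example are illustrative.

class GazeTracker:
    """Counts looks and accumulates look time per target, as in paragraph [0042]."""

    def __init__(self):
        self.look_counts = {}
        self.look_seconds = {}
        self._current = None
        self._since = None

    def update(self, target, timestamp_s):
        if target != self._current:
            if self._current is not None:
                self.look_seconds[self._current] = (
                    self.look_seconds.get(self._current, 0.0) + (timestamp_s - self._since)
                )
            self.look_counts[target] = self.look_counts.get(target, 0) + 1
            self._current, self._since = target, timestamp_s

tracker = GazeTracker()
for t, target in [(0.0, "vehicle"), (1.0, "evcs_display"), (4.0, "elsewhere")]:
    tracker.update(target, t)
    if target == "evcs_display":
        pass  # trigger point for showing an advertisement on the display ([0041])
print(tracker.look_counts, tracker.look_seconds)
# -> {'vehicle': 1, 'evcs_display': 1, 'elsewhere': 1} {'vehicle': 1.0, 'evcs_display': 3.0}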
[0043] In some embodiments, the EVCS determines a next action for the user 106 after the charging event. A next action may correspond to a location and/or activity corresponding to the user 106 after the charging event. In some embodiments, the EVCS determines that a charging event has occurred because the third image 312 displays the connector 122 connected to the electric vehicle 104 (e.g., in a second position 314). In some embodiments, the EVCS determines the location of the user after the charging event. For example, the camera may capture a plurality of images after the charging event, and the EVCS may determine whether the user 106 is located in the electric vehicle 104, is located outside the vehicle 104, and/or leaves the frame of view of the camera. In some embodiments, the EVCS stores one or more next actions as additional information. In some embodiments, the EVCS stores time periods related to one or more actions as additional information. For example, the EVCS may determine that the user 106 left the frame two minutes after the charging event and then entered the frame one hour after leaving the frame. In some embodiments, the EVCS transmits the additional information to a user device.
[0044] FIG. 3D shows another illustrative diagram of a system 350 for determining customized services based on a characteristic of a user 106, in accordance with some embodiments of the disclosure.
[0045] In some embodiments, a microphone mounted to an EVCS records audio in response to the EVCS detecting an electric vehicle 104. In some embodiments, the microphone records audio in response to the EVCS detecting the electric vehicle 104 in the parking space 120. In some embodiments, the microphone records audio in response to an event (e.g., connector event, charging event, user event). For example, if the EVCS uses the first image 302 to detect a user event, the microphone may begin to record audio. In some embodiments, the EVCS determines an event based on audio recorded by the microphone. For example, the EVCS may determine that a parking event has occurred when the microphone detects audio corresponding to an electric vehicle 104 in the parking space 120. In another example, the EVCS may determine that a user event has occurred when the microphone detects audio corresponding to a user 106.
[0046] In some embodiments, a first audio 318 is captured after a connector event.
In some embodiments, the EVCS uses audio recognition software (e.g., speech recognition, sound recognition, etc.) to determine a user characteristic based on the first audio. For example, the EVCS may use speech recognition software to determine that the user 106 shouting “I need help” (the first audio 318) indicates that the user 106 requires assistance (user characteristic). In another example, the microphone recording audio corresponding to a user yelling, crying, swearing, etc., indicates that the user 106 requires assistance (user characteristic).
[0047] In some embodiments, the EVCS determines customized services based on the determined user characteristic. For example, in response to the microphone detecting the first audio 318, the EVCS may determine that the user 106 requires assistance (e.g., first user characteristic). In response to the first user characteristic, the EVCS may display a charging tutorial on the display of the EVCS. In another example, in response to the microphone detecting a baby crying, the EVCS may determine that the user 106 is a parent (e.g., second user characteristic). In response to the second user characteristic, the EVCS may display an advertisement for a nearby store that sells products for children. In another example, in response to the microphone detecting a dog bark, the EVCS may determine that the user 106 is a dog owner (e.g., third user characteristic). In response to the third user characteristic, the EVCS may display an advertisement for dog food. In some embodiments, the EVCS records additional information. For example, the EVCS may record the time period between an event (e.g., connector event) and the first audio 318.
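The audio path in paragraphs [0046] and [0047] can be sketched as two small lookups: recognized speech or sound labels are mapped to a user characteristic, and the characteristic to a service. The recognizers themselves are assumed to run upstream; the phrases, labels, and service names below are illustrative only.

HELP_PHRASES = ("i need help", "help me", "this isn't working")

SERVICE_BY_CHARACTERISTIC = {
    "needs_assistance": "display_charging_tutorial",
    "parent": "display_children_store_ad",
    "dog_owner": "display_dog_food_ad",
}

def characteristic_from_audio(transcript=None, sound_label=None):
    """Map recognized speech or a non-speech sound label to a user characteristic."""
    if transcript and any(phrase in transcript.lower() for phrase in HELP_PHRASES):
        return "needs_assistance"
    if sound_label == "baby_crying":
        return "parent"
    if sound_label == "dog_bark":
        return "dog_owner"
    return None

characteristic = characteristic_from_audio(transcript="I need help!")
print(SERVICE_BY_CHARACTERISTIC.get(characteristic))   # -> display_charging_tutorial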
[0048] In some embodiments, the EVCS is configured to determine user characteristics and/or events based on similar such user and/or connector positions. For example, if the plurality of images displays the connector 122 within a threshold distance (e.g., thirty centimeters) of the electric vehicle 104, the EVCS may determine that an attempted charging event occurred. In some embodiments, if the attempted charging event is detected after a threshold time period, the EVCS determines that the user 106 requires assistance. In some embodiments, the EVCS uses sensors mounted to the connector 122 and/or holster to determine if the connector 122 has been removed from the holster. In some embodiments, the EVCS uses sensors mounted to the connector 122 to determine if the connector 122 is connected to or disconnected from the electric vehicle 104.
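The attempted-charging-event heuristic in paragraph [0048] can be written as a check over timestamped connector-to-vehicle distances. A rough sketch, assuming those distances come from the captured images, reusing the thirty-centimeter figure from the example, and using an illustrative time threshold:

NEAR_VEHICLE_CM = 30.0     # connector within this distance counts as an attempt ([0048])
ASSIST_AFTER_S = 60.0      # illustrative: attempts persisting this long suggest trouble

def attempted_charging_event(samples):
    """samples: list of (timestamp_s, connector_to_vehicle_cm); True if any sample is near."""
    return any(d <= NEAR_VEHICLE_CM for _, d in samples)

def user_needs_assistance(samples, plugged_in):
    """True if attempts have persisted past the time threshold without a plug-in."""
    near_times = [t for t, d in samples if d <= NEAR_VEHICLE_CM]
    if plugged_in or not near_times:
        return False
    return (near_times[-1] - near_times[0]) >= ASSIST_AFTER_S

samples = [(0, 25.0), (30, 20.0), (75, 22.0)]
print(attempted_charging_event(samples), user_needs_assistance(samples, plugged_in=False))
# -> True True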
[0049] In some embodiments, the EVCS 102 customizes services based on the determined user characteristic. For example, if the user characteristic indicates that the user 106 requires assistance, the EVCS 102 can transmit a notification (e.g., via network 112) indicating that the user 106 requires assistance. The notification may be transmitted to one or more devices associated with a nearby business, on-site employees, other electric vehicle owners, nearby assistants, etc. For example, nearby electric vehicle owners may receive a notification from the EVCS indicating that the user 106 at the EVCS requires assistance. The notification may also indicate a benefit (e.g., compensation, credit, etc.) for assisting the user 106 with charging their electric vehicle 104. In another example, if the user characteristic indicates that the user 106 is a new user, the display of the EVCS may display a graphic, turn a color, and/or flash, signaling that the user 106 of the electric vehicle 104 requires assistance. In some embodiments, the EVCS 102 causes a light mounted to the EVCS 102 to turn on, signaling that the user 106 requires assistance.
[0050] FIG. 4A shows an illustrative diagram of an EVCS 402 displaying notifications (414a-c) on a display 410, in accordance with some embodiments of the disclosure. In some embodiments, the EVCS 402 comprises control circuitry and memory. In some embodiments, the memory stores instructions for displaying content on the display 410. In some embodiments, the control circuitry is disposed inside the housing 404. In some embodiments, the control circuitry is mounted on a panel that connects the display 410 to the housing 404. In some embodiments, the EVCS 402 comprises a camera 406 coupled to the control circuitry.
[0051] In some embodiments, the EVCS 402 accesses a database comprising a plurality of entries wherein user characteristics are mapped to types of services. In some embodiments, the EVCS 402 determines that a first entry, corresponding to a determined user characteristic (e.g., new user), indicates a first service (e.g., displaying a first notification 414a comprising a tutorial). In some embodiments, the notifications 414a-c are selectable. For example, a user may touch the first notification 414a, causing the first notification 414a to begin playback.
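The entry lookup in paragraph [0051] is, at its simplest, an ordered table from user characteristics to service types. The entries below are illustrative only and do not reflect the actual contents of any database described in this disclosure.

# Illustrative entries mapping user characteristics to service types ([0051]).
SERVICE_ENTRIES = [
    ("new_user", "show_tutorial_notification"),        # e.g., first notification 414a
    ("needs_assistance", "request_onsite_assistance"),
    ("dog_owner", "show_dog_product_ad"),
]

def services_for(characteristics):
    """Return every service type whose entry matches one of the determined characteristics."""
    return [service for characteristic, service in SERVICE_ENTRIES
            if characteristic in characteristics]

print(services_for({"new_user", "dog_owner"}))
# -> ['show_tutorial_notification', 'show_dog_product_ad']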
[0052] In some embodiments, the EVCS displays the first notification 414a that comprises a tutorial on how to charge an electric vehicle using the connector 408. In some embodiments, the first notification 414a relates to a determined user characteristic. For example, the EVCS 402 may display the first notification 414a after determining that the user is a new user (user characteristic). In some embodiments, the EVCS 402 outputs audio related to the first notification 414a through a speaker 412 of the EVCS 402.
[0053] In some embodiments, the second notification 414b displays an advertisement. In some embodiments, the second notification 414b relates to the same user characteristic used to generate the first notification 414a. For example, the second notification 414b may comprise an advertisement for car insurance for new electric vehicle owners based on the EVCS determining that the user is a new user (user characteristic). In some embodiments, the EVCS 402 displays the first notification 414a and the second notification 414b based on determining first and second user characteristics. For example, the first user characteristic may indicate the user is a new user, and the second characteristic may indicate that the user is a dog owner.
[0054] In some embodiments, the third notification 414c displays a special event within a distance of the EVCS 402. In some embodiments, the third notification 414c relates to the same user characteristic used to generate the first notification 414a. In some embodiments, the EVCS 402 displays the first notification 414a and the third notification 414c based on determining first and second user characteristics. In some embodiments, the third notification 414c comprises directions to the special event.
[0055] In some embodiments, the EVCS 402 displays the notifications 414a-c after an event. For example, after a threshold time period after a connector event, the EVCS 402 may display the first notification 414a if a charging event is not detected. In another example, the EVCS 402 may display the second notification 414b after a charging event is detected. In another example, the EVCS 402 displays the third notification 414c after detecting that a gaze of the user indicates that the user is looking at the display 410.
[0056] FIG. 4B shows an illustrative diagram of a user device 416 generating notifications 418a-c, in accordance with some embodiments of the disclosure. Although a smartphone is used in this example, a user device 416 may be any device or devices capable of displaying notifications 418a-c such as televisions, laptops, tablets, smartphones, and/or similar such devices.
[0057] In some embodiments, the user device 416 receives a user characteristic from an EVCS (e.g., EVCS 402) and/or from a server. In some embodiments, the user device receives the user characteristic from the user. In some embodiments, the user characteristic is stored on the user device 416. In some embodiments, after the user device 416 receives a user characteristic, the user device 416 accesses a database comprising a plurality of entries wherein user characteristics are mapped to types of services. In some embodiments, the user device 416 determines that a first entry, corresponding to a determined user characteristic (e.g., new user) indicates a first service (e.g., displaying a first notification 418a comprising a tutorial). In some embodiments, the user device 416 displays the notifications 418a-c after an event. For example, after a threshold time period after a connector event, the user device 416 may display a first notification 418a if a charging event is not detected. In another example, the user device 416 may display a second notification 418b after a charging event is detected. In another example, the user device 416 displays a third notification 418c after an EVCS detects that a gaze of the user indicates that the user is looking at the user device 416. In some embodiments, the notifications 418a-c are selectable. For example, a user may touch the first notification 418a and the first notification 418a begins playback.
[0058] In some embodiments, the user device 416 displays the first notification 418a that comprises a tutorial on how to charge an electric vehicle. In some embodiments, the first notification 418a relates to a determined user characteristic. For example, the user device 416 may display the first notification 418a after determining that the user is a new user (user characteristic).
[0059] In some embodiments, the second notification 418b displays an advertisement. In some embodiments, the second notification 418b relates to the same user characteristic used to generate the first notification 418a. For example, the second notification 418b may comprise an advertisement for car insurance for new electric vehicle owners based on the user device 416 determining that the user is a new user (user characteristic). In some embodiments, the user device 416 displays the first notification 418a and the second notification 418b based on determining first and second user characteristics. For example, the first user characteristic may indicate the user is a new user, and the second characteristic may indicate that the user is a dog owner.
[0060] In some embodiments, the third notification 418c displays a special event within a distance of the user device 416 and/or EVCS 402. In some embodiments, the third notification 418c relates to the same user characteristic used to generate the first notification 418a. In some embodiments, the user device 416 displays the first notification 418a and the third notification 418c based on determining first and second user characteristics. In some embodiments, the third notification 418c comprises directions to the special event.
[0061] FIG. 5 shows an illustrative block diagram of an EVCS system 500, in accordance with some embodiments of the disclosure. In particular, EVCS system 500 of FIG. 5 may be the EVCSs depicted and/or described in FIGS. 1, 2A-2B, 3A-3D, and 4A-4B. In practice, and as recognized by those of ordinary skill in the art, items shown separately could be combined and some items could be separated. In some embodiments, not all shown items must be included in EVCS system 500. In some embodiments, EVCS system 500 may comprise additional items.
[0062] The EVCS system 500 can include processing circuitry 502 that includes one or more processing units (processors or cores), storage 504, one or more network or other communications network interfaces 506, additional peripherals 508, one or more sensors 510, a motor 512 (configured to retract a portion of a charging cable), one or more wireless transmitters and/or receivers 514, and one or more input/output (hereinafter “I/O”) paths 516. I/O paths 516 may use communication buses for interconnecting the described components. I/O paths 516 can include circuitry (sometimes called a chipset) that interconnects and controls communications between system components. EVCS system 500 may receive content and data via I/O paths 516. The I/O path 516 may provide data to control circuitry 518, which includes processing circuitry 502 and a storage 504. The control circuitry 518 may be used to send and receive commands, requests, and other suitable data using the I/O path 516. The I/O path 516 may connect the control circuitry 518 (and specifically the processing circuitry 502) to one or more communications paths. I/O functions may be provided by one or more of these communications paths but are shown as a single path in FIG. 5 to avoid overcomplicating the drawing.

[0063] The control circuitry 518 may be based on any suitable processing circuitry such as the processing circuitry 502. As referred to herein, processing circuitry should be understood to mean circuitry based on one or more microprocessors, microcontrollers, digital signal processors, programmable logic devices, field-programmable gate arrays (FPGAs), application-specific integrated circuits (ASICs), etc., and may include a multi-core processor (e.g., dual-core, quad-core, hexa-core, or any suitable number of cores) or supercomputer. In some embodiments, processing circuitry may be distributed across multiple separate processors or processing units, for example, multiple of the same type of processing units (e.g., two Intel Core i7 processors) or multiple different processors (e.g., an Intel Core i5 processor and an Intel Core i7 processor). The determining customized services based on a characteristic of a user functionality can be at least partially implemented using the control circuitry 518. The determining customized services based on a characteristic of a user functionality described herein may be implemented in or supported by any suitable software, hardware, or combination thereof. The determining customized services based on a characteristic of a user functionality can be implemented on user equipment, on remote servers, or across both.
[0064] The control circuitry 518 may include communications circuitry suitable for communicating with one or more servers. The instructions for carrying out the above-mentioned functionality may be stored on the one or more servers. Communications circuitry may include a cable modem, an integrated service digital network (ISDN) modem, a digital subscriber line (DSL) modem, a telephone modem, Ethernet card, or a wireless modem for communications with other equipment, or any other suitable communications circuitry. Such communications may involve the Internet or any other suitable communications networks or paths. In addition, communications circuitry may include circuitry that enables peer-to-peer communication of user equipment devices, or communication of user equipment devices in locations remote from each other (described in more detail below).
[0065] Memory may be an electronic storage device provided as the storage 504 that is part of the control circuitry 518. As referred to herein, the phrase “storage device” or “memory device” should be understood to mean any device for storing electronic data, computer software, or firmware, such as random-access memory, read-only memory, high-speed random-access memory (e.g., DRAM, SRAM, DDR RAM, or other random-access solid-state memory devices), non-volatile memory, one or more magnetic disk storage devices, optical disk storage devices, flash memory devices, other non-volatile solid-state storage devices, quantum storage devices, and/or any combination of the same. In some embodiments, the storage 504 includes one or more storage devices remotely located, such as a database of a server system that is in communication with EVCS system 500. In some embodiments, the storage 504, or alternatively the non-volatile memory devices within the storage 504, includes a non-transitory computer-readable storage medium.
[0066] In some embodiments, storage 504 or the computer-readable storage medium of the storage 504 stores an operating system, which includes procedures for handling various basic system services and for performing hardware dependent tasks. In some embodiments, storage 504 or the computer-readable storage medium of the storage 504 stores a communications module, which is used for connecting EVCS system 500 to other computers and devices via the one or more communication network interfaces 506 (wired or wireless), such as the internet, other wide area networks, local area networks, metropolitan area networks, and so on. In some embodiments, storage 504 or the computer-readable storage medium of the storage 504 stores a media item module for selecting and/or displaying media items on the display(s) 520 to be viewed by passersby and users of EVCS system 500. In some embodiments, storage 504 or the computer-readable storage medium of the storage 504 stores an EVCS module for charging an electric vehicle (e.g., measuring how much charge has been delivered to an electric vehicle, commencing charging, ceasing charging, etc.), including a motor control module that includes one or more instructions for energizing or forgoing energizing the motor. In some embodiments, executable modules, applications, or sets of procedures may be stored in one or more of the previously mentioned memory devices and correspond to a set of instructions for performing a function described above. In some embodiments, modules or programs (i.e., sets of instructions) need not be implemented as separate software programs, procedures, or modules, and thus various subsets of modules may be combined or otherwise re-arranged in various implementations. In some embodiments, the storage 504 stores a subset of the modules and data structures identified above. In some embodiments, the storage 504 may store additional modules or data structures not described above.
[0067] In some embodiments, EVCS system 500 comprises additional peripherals 508 such as displays 520 for displaying content, and charging cable 522. In some embodiments, the displays 520 may be touch-sensitive displays that are configured to detect various swipe gestures (e.g., continuous gestures in vertical and/or horizontal directions) and/or other gestures (e.g., a single or double tap) or to detect user input via a soft keyboard that is displayed when keyboard entry is needed.

[0068] In some embodiments, EVCS system 500 comprises one or more sensors 510 such as cameras (e.g., the camera described above with respect to FIGS. 1, 2A-2B, 3A-3D, and 4A-4B, etc.), microphones, ultrasound sensors, depth sensors, IR cameras, RGB cameras, PIR cameras, thermal IR, proximity sensors, radar, tension sensors, NFC sensors, and/or any combination thereof. In some embodiments, the one or more sensors 510 are for detecting whether external objects are within a region proximal to EVCS system 500, such as living and nonliving objects, and/or the status of EVCS system 500 (e.g., available, occupied, etc.) in order to perform an operation, such as determining a vehicle characteristic, user information, region status, etc. The user may utter instructions to the control circuitry 518 which are received by a sensor 510 (e.g., microphone). The sensor 510 may be any microphone (or microphones) capable of detecting human speech. In some embodiments, the microphone is connected to the processing circuitry 502 to transmit detected voice commands and other speech thereto for processing. In some embodiments, voice assistants (e.g., Siri, Alexa, Google Home, and similar such voice assistants) receive and process the voice commands and other speech.
[0069] FIG. 6 shows an illustrative block diagram of a user equipment device system, in accordance with some embodiments of the disclosure. In practice, and as recognized by those of ordinary skill in the art, items shown separately could be combined and some items could be separated. In some embodiments, not all shown items must be included in device 600. In some embodiments, device 600 may comprise additional items. In an embodiment, the user equipment device 600 is the same user equipment device displayed in FIG. 1. The user equipment device 600 may receive content and data via I/O path 602. The I/O path 602 may provide audio content (e.g., broadcast programming, on-demand programming, Internet content, content available over a local area network (LAN) or wide area network (WAN), and/or other content) and data to control circuitry 604, which includes processing circuitry 606 and a storage 608. The control circuitry 604 may be used to send and receive commands, requests, and other suitable data using the I/O path 602. The I/O path 602 may connect the control circuitry 604 (and specifically the processing circuitry 606) to one or more communications paths. I/O functions may be provided by one or more of these communications paths but are shown as a single path in FIG. 6 to avoid overcomplicating the drawing.
[0070] The control circuitry 604 may be based on any suitable processing circuitry such as the processing circuitry 606. As referred to herein, processing circuitry should be understood to mean circuitry based on one or more microprocessors, microcontrollers, digital signal processors, programmable logic devices, FPGAs, ASICs, etc., and may include a multi-core processor (e.g., dual-core, quad-core, hexa-core, or any suitable number of cores) or supercomputer. In some embodiments, processing circuitry may be distributed across multiple separate processors or processing units, for example, multiple of the same type of processing units (e.g., two Intel Core i7 processors) or multiple different processors (e.g., an Intel Core i5 processor and an Intel Core i7 processor).
[0071] In client-server-based embodiments, the control circuitry 604 may include communications circuitry suitable for communicating with one or more servers that may at least implement the described allocation of services functionality. The instructions for carrying out the above-mentioned functionality may be stored on the one or more servers. Communications circuitry may include a cable modem, an ISDN modem, a DSL modem, a telephone modem, Ethernet card, or a wireless modem for communications with other equipment, or any other suitable communications circuitry. Such communications may involve the Internet or any other suitable communications networks or paths. In addition, communications circuitry may include circuitry that enables peer-to-peer communication of user equipment devices, or communication of user equipment devices in locations remote from each other (described in more detail below).
[0072] Memory may be an electronic storage device provided as the storage 608 that is part of the control circuitry 604. Storage 608 may include random-access memory, read-only memory, hard drives, optical drives, digital video disc (DVD) recorders, compact disc (CD) recorders, BLU-RAY disc (BD) recorders, BLU-RAY 3D disc recorders, digital video recorders (DVR, sometimes called a personal video recorder, or PVR), solid-state devices, quantum storage devices, gaming consoles, gaming media, or any other suitable fixed or removable storage devices, and/or any combination of the same. The storage 608 may be used to store various types of content described herein. Nonvolatile memory may also be used (e.g., to launch a boot-up routine and other instructions). Cloud-based storage may be used to supplement the storage 608 or instead of the storage 608.
[0073] The control circuitry 604 may include audio generating circuitry and tuning circuitry, such as one or more analog tuners, audio generation circuitry, filters or any other suitable tuning or audio circuits or combinations of such circuits. The control circuitry 604 may also include scaler circuitry for upconverting and down converting content into the preferred output format of the user equipment device 600. The control circuitry 604 may also include digital-to-analog converter circuitry and analog-to-digital converter circuitry for converting between digital and analog signals. The tuning and encoding circuitry may be used by the user equipment device 600 to receive and to display, to play, or to record content. The circuitry described herein, including, for example, the tuning, audio generating, encoding, decoding, encrypting, decrypting, scaler, and analog/digital circuitry, may be implemented using software running on one or more general purpose or specialized processors. If the storage 608 is provided as a separate device from the user equipment device 600, the tuning and encoding circuitry (including multiple tuners) may be associated with the storage 608.
[0074] The user may utter instructions to the control circuitry 604 which are received by the microphone 616. The microphone 616 may be any microphone (or microphones) capable of detecting human speech. The microphone 616 is connected to the processing circuitry 606 to transmit detected voice commands and other speech thereto for processing. In some embodiments, voice assistants (e.g., Siri, Alexa, Google Home, and similar such voice assistants) receive and process the voice commands and other speech.
[0075] The user equipment device 600 may optionally include an interface 610. The interface 610 may be any suitable user interface, such as a remote control, mouse, trackball, keypad, keyboard, touch screen, touchpad, stylus input, joystick, or other user input interfaces. A display 612 may be provided as a stand-alone device or integrated with other elements of the user equipment device 600. For example, the display 612 may be a touchscreen or touch-sensitive display. In such circumstances, the interface 610 may be integrated with or combined with the microphone 616. When the interface 610 is configured with a screen, such a screen may be one or more of a monitor, a television, a liquid crystal display (LCD) for a mobile device, active matrix display, cathode ray tube display, light-emitting diode display, organic light-emitting diode display, quantum dot display, or any other suitable equipment for displaying visual images. In some embodiments, the interface 610 may be HDTV-capable. In some embodiments, the display 612 may be a 3D display. The speaker (or speakers) 614 may be provided as integrated with other elements of user equipment device 600 or may be a stand-alone unit. In some embodiments, the display 612 may be outputted through speaker 614.
[0076] FIG. 7 shows an illustrative block diagram of a server system 700, in accordance with some embodiments of the disclosure. Server system 700 may include one or more computer systems (e.g., computing devices), such as a desktop computer, a laptop computer, and a tablet computer. In some embodiments, the server system 700 is a data server that hosts one or more databases (e.g., databases of images or videos), models, or modules or may provide various executable applications or modules. In practice, and as recognized by those of ordinary skill in the art, items shown separately could be combined and some items could be separated. In some embodiments, not all shown items must be included in server system 700. In some embodiments, server system 700 may comprise additional items.
[0077] The server system 700 can include processing circuitry 702 that includes one or more processing units (processors or cores), storage 704, one or more network or other communications network interfaces 706, and one or more I/O paths 708. I/O paths 708 may use communication buses for interconnecting the described components. I/O paths 708 can include circuitry (sometimes called a chipset) that interconnects and controls communications between system components. Server system 700 may receive content and data via I/O paths 708. The I/O path 708 may provide data to control circuitry 710, which includes processing circuitry 702 and a storage 704. The control circuitry 710 may be used to send and receive commands, requests, and other suitable data using the I/O path 708. The I/O path 708 may connect the control circuitry 710 (and specifically the processing circuitry 702) to one or more communications paths. I/O functions may be provided by one or more of these communications paths but are shown as a single path in FIG. 7 to avoid overcomplicating the drawing.
[0078] The control circuitry 710 may be based on any suitable processing circuitry such as the processing circuitry 702. As referred to herein, processing circuitry should be understood to mean circuitry based on one or more microprocessors, microcontrollers, digital signal processors, programmable logic devices, FPGAs, ASICs, etc., and may include a multi-core processor (e.g., dual-core, quad-core, hexa-core, or any suitable number of cores) or supercomputer. In some embodiments, processing circuitry may be distributed across multiple separate processors or processing units, for example, multiple of the same type of processing units (e.g., two Intel Core i7 processors) or multiple different processors (e.g., an Intel Core i5 processor and an Intel Core i7 processor).
[0079] Memory may be an electronic storage device provided as the storage 704 that is part of the control circuitry 710. Storage 704 may include random-access memory, read-only memory, high-speed random-access memory (e.g., DRAM, SRAM, DDR RAM, or other random-access solid-state memory devices), non-volatile memory, one or more magnetic disk storage devices, optical disk storage devices, flash memory devices, other non-volatile solid- state storage devices, quantum storage devices, and/or any combination of the same.
[0080] In some embodiments, storage 704 or the computer-readable storage medium of the storage 704 stores an operating system, which includes procedures for handling various basic system services and for performing hardware dependent tasks. In some embodiments, storage 704 or the computer-readable storage medium of the storage 704 stores a communications module, which is used for connecting the server system 700 to other computers and devices via the one or more communication network interfaces 706 (wired or wireless), such as the internet, other wide area networks, local area networks, metropolitan area networks, and so on. In some embodiments, storage 704 or the computer-readable storage medium of the storage 704 stores a web browser (or other application capable of displaying web pages), which enables a user to communicate over a network with remote computers or devices. In some embodiments, storage 704 or the computer-readable storage medium of the storage 704 stores a database for storing information on electric vehicle charging stations, their locations, media items displayed at respective electric vehicle charging stations, a number of each type of impression count associated with respective electric vehicle charging stations, user profiles, and so forth.
[0081] In some embodiments, executable modules, applications, or sets of procedures may be stored in one or more of the previously mentioned memory devices and correspond to a set of instructions for performing a function described above. In some embodiments, modules or programs (i.e., sets of instructions) need not be implemented as separate software programs, procedures, or modules, and thus various subsets of modules may be combined or otherwise re-arranged in various implementations. In some embodiments, the storage 704 stores a subset of the modules and data structures identified above. In some embodiments, the storage 704 may store additional modules or data structures not described above.
[0082] FIG. 8 is an illustrative flowchart of a process 800 for determining customized services based on a characteristic of a user, in accordance with some embodiments of the disclosure. Process 800 may be performed by physical or virtual control circuitry, such as control circuitry 518 of EVCS system 500 (FIG. 5). In some embodiments, some steps of process 800 may be performed by one of several devices. For example, one or more of the steps of process 800 may be performed by a server (FIG. 7).
[0083] At step 802, control circuitry detects an electric vehicle. In some embodiments, the control circuitry detects an electric vehicle based on information received from one or more sensors mounted to an EVCS. In some embodiments, the sensors may be image (e.g., optical) sensors (e.g., one or more cameras), microphones, ultrasound sensors, depth sensors, IR cameras, RGB cameras, PIR cameras, thermal IR, proximity sensors, radar, tension sensors, NFC sensors, and/or any combination thereof. In some embodiments, control circuitry determines an electric vehicle is in a location (e.g., parking space) based on information received from the one or more sensors. For example, a camera may transmit images of the location to the control circuitry and the control circuitry can determine (e.g., via image recognition, machine learning, etc.) that an electric vehicle is in the location. In some embodiments, the one or more sensors are calibrated to detect vehicles. For example, a first sensor may be calibrated to detect objects larger than a size threshold. In another example, a sensor may be calibrated to detect objects that result in a change of inductance corresponding to the presence of a vehicle. In some embodiments, the control circuitry detects the electric vehicle when the electric vehicle enters a location. In some embodiments, the location comprises the parking space of the EVCS. In some embodiments, the location comprises an area larger than the parking space. For example, the location may comprise an area in front of the parking space that an electric vehicle may use to back into the parking space.
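The calibration examples in step 802 can be gathered into one gating function: a detection counts as a vehicle only if it clears a size threshold (image-based sensors) or an inductance-change threshold (loop-style sensors). The numeric thresholds below are placeholders, not values from the disclosure.

MIN_OBJECT_AREA_M2 = 3.0          # placeholder size threshold for image-based detection
MIN_INDUCTANCE_DELTA_H = 2e-6     # placeholder inductance change for a loop-style sensor

def vehicle_detected(object_area_m2=None, inductance_delta_h=None):
    """True if any calibrated sensor reading clears its vehicle-presence threshold."""
    if object_area_m2 is not None and object_area_m2 >= MIN_OBJECT_AREA_M2:
        return True
    if inductance_delta_h is not None and abs(inductance_delta_h) >= MIN_INDUCTANCE_DELTA_H:
        return True
    return False

print(vehicle_detected(object_area_m2=4.2))   # -> True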
[0084] At step 804, control circuitry receives a plurality of images. In some embodiments, the same sensor that detects the electric vehicle in step 802 captures the plurality of images. In some embodiments, a camera (e.g., camera 116) captures the plurality of images. In some embodiments, the camera captures the plurality of images in response to detecting an electric vehicle in the area proximal to the EVCS. In some embodiments, the camera captures the plurality of images in response to detecting an electric vehicle in the parking space corresponding to the EVCS. In some embodiments, the camera captures the plurality of images in response to an event (e.g., vehicle event, user event, etc.).
[0085] At step 806, control circuitry detects a connector event using the plurality of images. In some embodiments, the control circuitry uses a single image to detect the connector event. For example, when an image comprises a connector, the control circuitry may determine that the connector event occurred at the time the image was captured. In some embodiments, the control circuitry determines that the connector event occurred whenever the connector position moves. For example, a first image may display the connector in a first position and a second image may display the connector in a second position. In response to the connector changing from the first position to the second position, the control circuitry may determine that the connector event occurred. In some embodiments, the control circuitry determines that the connector event occurred whenever the connector moves a threshold distance from a first position. In some embodiments, the EVCS uses sensors mounted to the connector and/or holster to determine if the connector has been removed from the holster. In some embodiments, the EVCS uses sensors mounted to the connector to determine if the connector is connected to or disconnected from the electric vehicle.

[0086] At step 808, control circuitry determines whether a charging event has occurred. In some embodiments, the control circuitry determines that a charging event has occurred when a connector is connected to the electric vehicle and/or an EVCS is supplying electric current to the electric vehicle. For example, the control circuitry may determine that a charging event has occurred if a first image of the plurality of images shows the connector connected to the electric vehicle. In another example, the control circuitry may determine that the charging event has occurred if the control circuitry determines that current is flowing from the EVCS to the electric vehicle. In some embodiments, the control circuitry uses sensors mounted to the connector to determine if the connector is connected to or disconnected from the electric vehicle. If the control circuitry determines that the charging event has occurred, the process 800 ends at step 810. If the control circuitry determines that the charging event has not occurred, the process 800 continues to step 812.
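Steps 806 and 808 can be sketched as two predicates: a connector event when the connector's detected position moves more than a threshold between images, and a charging event when either the image shows the connector seated or current is flowing. The positions are assumed to come from an upstream detector, and the movement threshold is illustrative.

import math

MOVE_THRESHOLD_M = 0.05   # illustrative minimum displacement for a connector event

def connector_event(prev_position_m, curr_position_m):
    """True if the connector moved more than the threshold between two images (step 806)."""
    return math.dist(prev_position_m, curr_position_m) > MOVE_THRESHOLD_M

def charging_event(image_shows_seated, current_amps):
    """True if the image or the measured current indicates charging (step 808)."""
    return image_shows_seated or current_amps > 0.0

print(connector_event((0.0, 0.0), (0.0, 0.30)), charging_event(False, 16.0))
# -> True True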
[0087] At step 812, control circuitry determines whether a threshold time has passed since the connector event. In some embodiments, the threshold time corresponds to the average length of time required for a user to connect the connector to their electric vehicle. In some embodiments, the threshold time is longer than the average length of time required for a user to connect the connector to their electric vehicle. If the control circuitry determines that the threshold time has not passed since the connector event, the process 800 continues to step 808, where the control circuitry again determines whether a charging event has occurred. If the control circuitry determines that the threshold time has passed since the connector event, the process 800 continues to step 814.
[0088] At step 814, control circuitry transmits a charge message. In some embodiments, the charge message comprises instructions on how to operate a connector. In some embodiments, the charge message comprises a request for assistance. For example, the control circuitry may transmit the charge message to a plurality of user devices corresponding to nearby electric vehicle users indicating that the user corresponding to the electric vehicle detected in step 802 requires assistance. In some embodiments, the charge message is displayed on different devices (e.g., the EVCS, a user device associated with the user of the electric vehicle, user devices associated with other users, etc.). In some embodiments, how the charge message is displayed, where the charge message is transmitted, and/or the contents of the charge message are dependent on a characteristic related to the user of the electric vehicle. For example, if the user characteristic corresponds to a new user, the charge message may indicate that the user is a new user and requires assistance. In another example, if the user characteristic corresponds to a disabled user, the charge message may be transmitted to a user device associated with a location of an employee designated to help disabled users. In some embodiments, the charge message indicates a benefit (e.g., compensation, credit, etc.) for assisting the user. In some embodiments, the control circuitry also activates a light coupled to the EVCS, based on determining that the threshold time has passed since the connector event.
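Putting steps 808 through 814 together gives a small polling loop: wait for a charging event, and if none arrives before the threshold time, send a charge message routed according to the user characteristic. The callback names, timeout, and routing choices below are illustrative rather than prescribed by the disclosure.

import time

CHARGE_TIMEOUT_S = 90.0   # illustrative threshold time since the connector event
POLL_INTERVAL_S = 2.0

def monitor_after_connector_event(charging_detected, send_charge_message, user_characteristic):
    """Return True once charging starts; otherwise send a tailored charge message (step 814)."""
    start = time.monotonic()
    while time.monotonic() - start < CHARGE_TIMEOUT_S:
        if charging_detected():                       # step 808
            return True                               # step 810: charging started, done
        time.sleep(POLL_INTERVAL_S)                   # step 812: threshold not yet reached
    if user_characteristic == "disabled":
        send_charge_message("designated_assistant_device")   # route to on-site help
    elif user_characteristic == "new_user":
        send_charge_message("evcs_display_tutorial")         # show connector instructions
    else:
        send_charge_message("user_device")
    return False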
[0089] FIG. 9 is another illustrative flowchart of a process 900 for determining customized services based on a characteristic of a user, in accordance with some embodiments of the disclosure. Process 900 may be performed by physical or virtual control circuitry, such as control circuitry 518 of EVCS system 500 (FIG. 5). In some embodiments, some steps of process 900 may be performed by one of several devices. For example, one or more of the steps of process 900 may be performed by a server (FIG. 7).
[0090] At step 902, control circuitry detects an electric vehicle. In some embodiments, the control circuitry uses the same or similar methodologies described in step 802 above.
[0091] At step 904, control circuitry receives a plurality of images of an area proximal to the electric vehicle and/or the EVCS. In some embodiments, the control circuitry uses the same or similar methodologies described in step 804 above.
[0092] At step 906, control circuitry determines a user characteristic of a user associated with the electric vehicle. In some embodiments, the control circuitry determines the user characteristic based on information captured by one or more sensors. In some embodiments, the one or more sensors are the same sensors used to detect the electric vehicle in step 902. In some embodiments, the one or more sensors are different from the sensors used to detect the electric vehicle in step 902. In some embodiments, the control circuitry receives the user characteristic in conjunction with receiving a request to charge the electric vehicle. In some embodiments, the control circuitry accesses a profile associated with the electric vehicle from a database and/or server, wherein the profile comprises one or more user characteristics. In some embodiments, the control circuitry accesses the user characteristic from a database, the user, and/or a third-party provider. In some embodiments, the control circuitry determines the user characteristic after a sensor receives a request to charge an electric vehicle. For example, the user may have to present some credentials (e.g., password, PIN, biometrics, device, item, etc.) to request the control circuitry to charge the electric vehicle.
[0093] In some embodiments, the control circuitry determines the user characteristic using an electric vehicle characteristic. For example, once the control circuitry determines an electric vehicle characteristic, the control circuitry can determine a user characteristic of a user associated with the electric vehicle. In some embodiments, the control circuitry uses information collected from the one or more sensors during step 902 to determine one or more electric vehicle characteristics (e.g., model, make, color, license plate number, VIN number, charging status, tire pressure, specifications, condition, etc.) of the vehicle. In some embodiments, the control circuitry uses a machine learning algorithm to process information collected by the sensors to determine an electric vehicle characteristic.
[0094] At step 908, control circuitry determines whether the user characteristic corresponds to a service. In some embodiments, the control circuitry accesses a database comprising a plurality of entries indicating whether one or more user characteristics are associated with a service. For example, the control circuitry can determine that a user with a first user characteristic (e.g., disabled user) requires a first service and a second user with a second characteristic (e.g., not disabled) does not require a service. In some embodiments, the control circuitry determines that the user characteristic does not correspond to a service if the user characteristic is not included in the database. For example, the control circuitry may determine the user’s age (user characteristic) and may also determine that no entry in the database corresponds to the user’s age. In some embodiments, the control circuitry determines a service type. For example, a first user with a first user characteristic (e.g., the user is disabled) may require a first service type (e.g., assistance connecting the first user’s electric vehicle to an EVCS) and a second user with a second user characteristic (e.g., new user) may require a second service type (e.g., tutorial displayed on the display of the EVCS). If the control circuitry determines that the user characteristic does not correspond to a service, the process 900 ends at step 910. If the control circuitry determines that the user characteristic does correspond to a service, the process 900 continues to step 912.
[0095] At step 912, control circuitry transmits a notification related to the user characteristic. In some embodiments, the notification includes a charging tutorial. In some embodiments, the notification indicates the user characteristic determined in step 906. For example, the notification may indicate that a nearby store offers kids’ supplies to a user who is a parent (user characteristic). In some embodiments, the control circuitry transmits the notification to one or more user devices based on the user characteristic. For example, if the user characteristic indicates a disabled user, the notification may be transmitted to a plurality of user devices corresponding to nearby electric vehicle users. In some embodiments, the control circuitry also activates a light mounted to an EVCS, based on the user characteristic identified in step 904. In some embodiments, the notification indicates a benefit (e.g., compensation, credit, etc.) for assisting the user.

[0096] It is contemplated that some suitable steps or suitable descriptions of FIGS. 8-9 may be used with other suitable embodiments of this disclosure. In addition, some suitable steps and descriptions described in relation to FIGS. 8-9 may be implemented in alternative orders or in parallel to further the purposes of this disclosure. For example, some suitable steps may be performed in any order or in parallel or substantially simultaneously to reduce lag or increase the speed of the system or method. Some suitable steps may also be skipped or omitted from the process. Furthermore, it should be noted that some suitable devices or equipment discussed in relation to FIGS. 1-7 could be used to perform one or more of the steps in FIGS. 8-9.
[0097] The processes discussed above are intended to be illustrative and not limiting. One skilled in the art would appreciate that the steps of the processes discussed herein may be omitted, modified, combined, and/or rearranged, and any additional steps may be performed without departing from the scope of the invention. More generally, the above disclosure is meant to be exemplary and not limiting. Only the claims that follow are meant to set bounds as to what the present invention includes. Furthermore, it should be noted that the features and limitations described in any one embodiment may be applied to any other embodiment herein, and flowcharts or examples relating to one embodiment may be combined with any other embodiment in a suitable manner, done in different orders, or done in parallel. In addition, the systems and methods described herein may be performed in real time. It should also be noted that the systems and/or methods described above may be applied to, or used in accordance with, other systems and/or methods.
This specification discloses embodiments which include, but are not limited to, the following:
1. An electric vehicle charging station comprising: a housing unit; a connector with the housing unit; a camera mounted to the housing unit; and a control circuitry located inside the housing unit, the control circuitry configured to: detect an electric vehicle using a first plurality of images captured by the camera; detect a connector event, wherein the connector event is associated with the connector moving from a position; determine a user characteristic of a user of the electric vehicle using a second plurality of images captured by the camera; and in response to determining the user characteristic, transmitting a notification related to the user characteristic.
2. The electric vehicle charging station of item 1, wherein the control circuitry is further configured to charge the electric vehicle.
3. The electric vehicle charging station of item 1, wherein the user characteristic corresponds to a new user.
4. The electric vehicle charging station of item 1, wherein the user characteristic indicates that a user associated with the electric vehicle has a disability.
5. The electric vehicle charging station of item 4, wherein the notification requests assistance for charging the electric vehicle with the electric vehicle charging station.
6. The electric vehicle charging station of item 5, wherein the notification is transmitted to a location within a threshold distance of the electric vehicle charging station.
7. The electric vehicle charging station of item 5, wherein the notification is transmitted to a user device within a threshold distance of the electric vehicle charging station.

Claims

What is claimed is:
1. An electric vehicle charging station comprising:
    a housing unit;
    a connector with the housing unit;
    a camera mounted to the housing unit; and
    a control circuitry located inside the housing unit, the control circuitry configured to:
        detect an electric vehicle using a first plurality of images captured by the camera;
        detect a connector event using a second plurality of images captured by the camera, wherein the connector event is associated with the connector moving from a position;
        determine that a threshold time has elapsed after the connector event without a detection of a charging event; and
        in response to determining that the threshold time has elapsed after the connector event without the detection of the charging event, transmit a notification, wherein the notification comprises instructions for charging the electric vehicle with the connector.
2. The electric vehicle charging station of claim 1, wherein the electric vehicle charging station further comprises a display coupled to the control circuitry and the control circuitry is further configured to display the notification on the display.
3. The electric vehicle charging station of claim 1, wherein the notification is transmitted to a user device associated with a user of the electric vehicle.
4. The electric vehicle charging station of claim 1, wherein the control circuitry is further configured to charge the electric vehicle.
5. The electric vehicle charging station of claim 1, wherein the camera is mounted to a top portion of the housing unit.
6. An electric vehicle charging station comprising:
    a housing unit;
    a connector with the housing unit;
    a camera mounted to the housing unit; and
    a control circuitry located inside the housing unit, the control circuitry configured to:
        detect an electric vehicle using a first plurality of images captured by the camera;
        detect a connector event, wherein the connector event is associated with the connector moving from a position;
        determine a user characteristic of a user of the electric vehicle using a second plurality of images captured by the camera; and
        in response to determining the user characteristic, transmit a notification related to the user characteristic.
7. The electric vehicle charging station of claim 6, wherein the user characteristic corresponds to a facial expression of the user.
8. The electric vehicle charging station of claim 6, wherein the user characteristic corresponds to a gaze of the user.
9. The electric vehicle charging station of claim 6, wherein the camera is mounted to a top portion of the housing unit.
10. The electric vehicle charging station of claim 6, wherein the notification comprises instructions for charging the electric vehicle with the connector.
11. The electric vehicle charging station of claim 10, wherein the electric vehicle charging station further comprises a display coupled to the control circuitry and the control circuitry is further configured to display the notification on the display.
12. The electric vehicle charging station of claim 10, wherein the notification is transmitted to a user device associated with the user of the electric vehicle.
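For illustration only and not part of the claims: the timing behavior recited in independent claim 1 — transmit charging instructions when no charging event is detected within a threshold time after a connector event — might be realized along the following lines. The helper callables, threshold value, and message text are assumed for this sketch; in practice the detectors would be backed by the camera images and charging circuitry described in the specification.

```python
# Hypothetical sketch of the claim-1 timing logic; all names and values are assumptions.
import time


def monitor_connector_event(detect_connector_event,   # camera-based: connector moved from its position
                            detect_charging_event,    # charging circuitry: current is flowing
                            transmit_notification,    # display and/or user-device transport
                            threshold_seconds: float = 120.0,
                            poll_seconds: float = 1.0) -> None:
    """Send charging instructions if no charging starts within the threshold time."""
    if not detect_connector_event():
        return  # connector has not moved from its position; nothing to do
    deadline = time.monotonic() + threshold_seconds
    while time.monotonic() < deadline:
        if detect_charging_event():
            return  # charging began within the threshold; no instructions needed
        time.sleep(poll_seconds)
    transmit_notification(
        "Instructions: insert the connector fully into the vehicle's charging "
        "port and wait for the charging indicator.")
```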

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
US202263326181P 2022-03-31 2022-03-31
US63/326,181 2022-03-31

Publications (1)

Publication Number Publication Date
WO2023192385A1 (en)

Family

ID=88203529

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/US2023/016736 WO2023192385A1 (en) 2022-03-31 2023-03-29 Customizing electric vehicle charging station service based on sentiment analysis

Country Status (1)

Country Link
WO (1) WO2023192385A1 (en)

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
KR20120114074A (en) * 2011-04-06 2012-10-16 인지컨트롤스 주식회사 Charging system for electric vehicle
KR20150075745A (en) * 2013-12-26 2015-07-06 엘에스산전 주식회사 Charger and method for managementing a chagrge connector of the same
US20160297316A1 (en) * 2012-12-24 2016-10-13 Angel A. Penilla Methods and Systems for Automatic Electric Vehicle Identification and Charging Via Wireless Charging Pads
US20170320400A1 (en) * 2009-12-01 2017-11-09 William Gibbens Redmann Method and apparatus for parking lot management
KR20200055171A (en) * 2018-11-07 2020-05-21 현대모비스 주식회사 Apparatus and method for supporting safe driving


Similar Documents

Publication Title
US8941736B1 (en) Doorbell communication systems and methods
US20230302945A1 (en) Systems and methods for monitoring an electric vehicle using an electric vehicle charging station
CN112559098B (en) Card rendering method and electronic equipment
US20230286408A1 (en) Systems and methods for managing an electric vehicle charging station's parking space availability
US20170060319A1 (en) Large format display apparatus and control method thereof
US9311795B2 (en) Systems and methods for operating remote presence security
EP2862362B1 (en) Stream-based media management
US20230302946A1 (en) Systems and methods for collision detection using an electric vehicle charging station
KR20180137913A (en) Electronic device for playing contents and operating method thereof
US20180060088A1 (en) Group Interactions
KR102405307B1 (en) Electronic apparatus, method for controlling thereof and the computer readable recording medium
CN106686232A (en) Method for optimizing control interfaces and mobile terminal
US10394510B2 (en) Method for displaying content and electronic device therefor
US20240021194A1 (en) Voice interaction method and apparatus
WO2023192385A1 (en) Customizing electric vehicle charging station service based on sentiment analysis
KR20210047112A (en) Electronic apparatus and control method thereof
US20150287089A1 (en) System and method of preventing addiction to electronic device, and electronic device adapted to the same
WO2023192383A1 (en) Customizing electric vehicle charging station services for users with disabilities
US20220379765A1 (en) Systems and methods for allocation of charging rates based on vehicle characteristics
WO2024092215A1 (en) Electric vehicle charging station camera array
US20230259844A1 (en) Systems and methods for determining a parking space status using an electric vehicle charging station
US20230009749A1 (en) Systems and methods for charging an electric vehicle based on inferred dwell time
WO2023172672A1 (en) Systems and methods for determining charging station compatibility
US20230058986A1 (en) Systems and methods for determining tire characteristics using an electric vehicle charging station
CN106465099A (en) Improved delivery of contextual data to a computing device while preserving data privacy

Legal Events

Date Code Title Description
121 EP: The EPO has been informed by WIPO that EP was designated in this application

Ref document number: 23781751

Country of ref document: EP

Kind code of ref document: A1