US20160297361A1 - Camera array system and method to detect a load status of a semi-trailer truck - Google Patents
- Publication number
- US20160297361A1 (Application US 14/682,086)
- Authority
- US
- United States
- Prior art keywords
- trailer
- sensor array
- semi
- cargo
- cameras
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Abandoned
Classifications
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N7/00—Television systems
- H04N7/18—Closed-circuit television [CCTV] systems, i.e. systems in which the video signal is not broadcast
- H04N7/181—Closed-circuit television [CCTV] systems, i.e. systems in which the video signal is not broadcast for receiving images from a plurality of remote sources
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B60—VEHICLES IN GENERAL
- B60R—VEHICLES, VEHICLE FITTINGS, OR VEHICLE PARTS, NOT OTHERWISE PROVIDED FOR
- B60R1/00—Optical viewing arrangements; Real-time viewing arrangements for drivers or passengers using optical image capturing systems, e.g. cameras or video systems specially adapted for use in or on vehicles
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B60—VEHICLES IN GENERAL
- B60R—VEHICLES, VEHICLE FITTINGS, OR VEHICLE PARTS, NOT OTHERWISE PROVIDED FOR
- B60R1/00—Optical viewing arrangements; Real-time viewing arrangements for drivers or passengers using optical image capturing systems, e.g. cameras or video systems specially adapted for use in or on vehicles
- B60R1/20—Real-time viewing arrangements for drivers or passengers using optical image capturing systems, e.g. cameras or video systems specially adapted for use in or on vehicles
- B60R1/22—Real-time viewing arrangements for drivers or passengers using optical image capturing systems, e.g. cameras or video systems specially adapted for use in or on vehicles for viewing an area outside the vehicle, e.g. the exterior of the vehicle
- B60R1/23—Real-time viewing arrangements for drivers or passengers using optical image capturing systems, e.g. cameras or video systems specially adapted for use in or on vehicles for viewing an area outside the vehicle, e.g. the exterior of the vehicle with a predetermined field of view
- B60R1/26—Real-time viewing arrangements for drivers or passengers using optical image capturing systems, e.g. cameras or video systems specially adapted for use in or on vehicles for viewing an area outside the vehicle, e.g. the exterior of the vehicle with a predetermined field of view to the rear of the vehicle
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B60—VEHICLES IN GENERAL
- B60R—VEHICLES, VEHICLE FITTINGS, OR VEHICLE PARTS, NOT OTHERWISE PROVIDED FOR
- B60R11/00—Arrangements for holding or mounting articles, not otherwise provided for
- B60R11/04—Mounting of cameras operative during drive; Arrangement of controls thereof relative to the vehicle
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F18/00—Pattern recognition
- G06F18/20—Analysing
- G06F18/22—Matching criteria, e.g. proximity measures
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V20/00—Scenes; Scene-specific elements
- G06V20/50—Context or environment of the image
- G06V20/59—Context or environment of the image inside of a vehicle, e.g. relating to seat occupancy, driver state or inner lighting conditions
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04L—TRANSMISSION OF DIGITAL INFORMATION, e.g. TELEGRAPHIC COMMUNICATION
- H04L67/00—Network arrangements or protocols for supporting network services or applications
- H04L67/01—Protocols
- H04L67/02—Protocols based on web technology, e.g. hypertext transfer protocol [HTTP]
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N23/00—Cameras or camera modules comprising electronic image sensors; Control thereof
- H04N23/56—Cameras or camera modules comprising electronic image sensors; Control thereof provided with illuminating means
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N23/00—Cameras or camera modules comprising electronic image sensors; Control thereof
- H04N23/90—Arrangement of cameras or camera modules, e.g. multiple cameras in TV studios or sports stadiums
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N23/00—Cameras or camera modules comprising electronic image sensors; Control thereof
- H04N23/95—Computational photography systems, e.g. light-field imaging systems
- H04N23/951—Computational photography systems, e.g. light-field imaging systems by using two or more images to influence resolution, frame rate or aspect ratio
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N7/00—Television systems
- H04N7/18—Closed-circuit television [CCTV] systems, i.e. systems in which the video signal is not broadcast
- H04N7/188—Capturing isolated or intermittent images triggered by the occurrence of a predetermined event, e.g. an object reaching a predetermined position
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B60—VEHICLES IN GENERAL
- B60R—VEHICLES, VEHICLE FITTINGS, OR VEHICLE PARTS, NOT OTHERWISE PROVIDED FOR
- B60R11/00—Arrangements for holding or mounting articles, not otherwise provided for
- B60R2011/0001—Arrangements for holding or mounting articles, not otherwise provided for characterised by position
- B60R2011/0003—Arrangements for holding or mounting articles, not otherwise provided for characterised by position inside the vehicle
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B60—VEHICLES IN GENERAL
- B60R—VEHICLES, VEHICLE FITTINGS, OR VEHICLE PARTS, NOT OTHERWISE PROVIDED FOR
- B60R2300/00—Details of viewing arrangements using cameras and displays, specially adapted for use in a vehicle
- B60R2300/10—Details of viewing arrangements using cameras and displays, specially adapted for use in a vehicle characterised by the type of camera system used
- B60R2300/105—Details of viewing arrangements using cameras and displays, specially adapted for use in a vehicle characterised by the type of camera system used using multiple cameras
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B60—VEHICLES IN GENERAL
- B60R—VEHICLES, VEHICLE FITTINGS, OR VEHICLE PARTS, NOT OTHERWISE PROVIDED FOR
- B60R2300/00—Details of viewing arrangements using cameras and displays, specially adapted for use in a vehicle
- B60R2300/20—Details of viewing arrangements using cameras and displays, specially adapted for use in a vehicle characterised by the type of display used
- B60R2300/207—Details of viewing arrangements using cameras and displays, specially adapted for use in a vehicle characterised by the type of display used using multi-purpose displays, e.g. camera image and navigation or video on same display
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B60—VEHICLES IN GENERAL
- B60R—VEHICLES, VEHICLE FITTINGS, OR VEHICLE PARTS, NOT OTHERWISE PROVIDED FOR
- B60R2300/00—Details of viewing arrangements using cameras and displays, specially adapted for use in a vehicle
- B60R2300/40—Details of viewing arrangements using cameras and displays, specially adapted for use in a vehicle characterised by the details of the power supply or the coupling to vehicle components
- B60R2300/406—Details of viewing arrangements using cameras and displays, specially adapted for use in a vehicle characterised by the details of the power supply or the coupling to vehicle components using wireless transmission
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B60—VEHICLES IN GENERAL
- B60R—VEHICLES, VEHICLE FITTINGS, OR VEHICLE PARTS, NOT OTHERWISE PROVIDED FOR
- B60R2300/00—Details of viewing arrangements using cameras and displays, specially adapted for use in a vehicle
- B60R2300/80—Details of viewing arrangements using cameras and displays, specially adapted for use in a vehicle characterised by the intended use of the viewing arrangement
- B60R2300/8006—Details of viewing arrangements using cameras and displays, specially adapted for use in a vehicle characterised by the intended use of the viewing arrangement for monitoring and displaying scenes of vehicle interior, e.g. for monitoring passengers or cargo
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B60—VEHICLES IN GENERAL
- B60R—VEHICLES, VEHICLE FITTINGS, OR VEHICLE PARTS, NOT OTHERWISE PROVIDED FOR
- B60R2300/00—Details of viewing arrangements using cameras and displays, specially adapted for use in a vehicle
- B60R2300/80—Details of viewing arrangements using cameras and displays, specially adapted for use in a vehicle characterised by the intended use of the viewing arrangement
- B60R2300/8046—Details of viewing arrangements using cameras and displays, specially adapted for use in a vehicle characterised by the intended use of the viewing arrangement for replacing a rear-view mirror system
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B60—VEHICLES IN GENERAL
- B60R—VEHICLES, VEHICLE FITTINGS, OR VEHICLE PARTS, NOT OTHERWISE PROVIDED FOR
- B60R2300/00—Details of viewing arrangements using cameras and displays, specially adapted for use in a vehicle
- B60R2300/80—Details of viewing arrangements using cameras and displays, specially adapted for use in a vehicle characterised by the intended use of the viewing arrangement
- B60R2300/806—Details of viewing arrangements using cameras and displays, specially adapted for use in a vehicle characterised by the intended use of the viewing arrangement for aiding parking
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B60—VEHICLES IN GENERAL
- B60R—VEHICLES, VEHICLE FITTINGS, OR VEHICLE PARTS, NOT OTHERWISE PROVIDED FOR
- B60R2300/00—Details of viewing arrangements using cameras and displays, specially adapted for use in a vehicle
- B60R2300/80—Details of viewing arrangements using cameras and displays, specially adapted for use in a vehicle characterised by the intended use of the viewing arrangement
- B60R2300/8086—Details of viewing arrangements using cameras and displays, specially adapted for use in a vehicle characterised by the intended use of the viewing arrangement for vehicle path indication
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01G—WEIGHING
- G01G19/00—Weighing apparatus or methods adapted for special purposes not provided for in the preceding groups
- G01G19/08—Weighing apparatus or methods adapted for special purposes not provided for in the preceding groups for incorporation in vehicles
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N23/00—Cameras or camera modules comprising electronic image sensors; Control thereof
- H04N23/60—Control of cameras or camera modules
- H04N23/66—Remote control of cameras or camera parts, e.g. by remote control devices
- H04N23/661—Transmitting camera control signals through networks, e.g. control via the Internet
Definitions
- This disclosure relates generally to automotive technology and, more particularly, to a method, a device and/or a system of utilizing a camera array system to detect a load status of a semi-trailer truck.
- A transportation service provider may be compensated based on a type of goods being carried inside a cargo area of a trailer of a transportation vehicle (e.g., a semi-trailer truck). Therefore, the transportation service provider may seek to maximize utilization of the space inside the cargo area of the trailer. Sensors (e.g., weight sensors, wave sensors, ultrasound sensors) employed in an interior space of the cargo area may not be able to detect color patterns or types of cargo.
- In addition, these sensors may not be able to detect exactly where in the trailer the cargo is located. Moreover, these sensors may not provide a reliable view of what is actually happening inside the trailer. As a result, new problems may arise: a driver may embark on a long journey when, in fact, the cargo area is filled with the wrong type of cargo (or may even be empty). This may lead to wasted time, fuel, and efficiency, customer dissatisfaction, and, ultimately, loss of revenue for the transportation service provider.
- In one aspect, a trailer of a semi-trailer truck includes a sensor array affixed to a surface of the trailer to automatically determine whether a cargo area of the semi-trailer truck is occupied.
- The trailer of the semi-trailer truck includes a set of cameras of the sensor array. Each camera of the set of cameras is embedded in an individual recess of the sensor array such that each of the set of cameras does not protrude from the sensor array into the cargo area and/or each of the set of cameras peers into the cargo area of the semi-trailer truck.
- The trailer of the semi-trailer truck further includes at least one light source to illuminate the cargo area.
- A memory and a processor associated with the sensor array are configured to store one baseline image of the cargo area of the trailer when the trailer is in an empty state.
- The processor is configured to detect a triggering event and to illuminate the cargo area of the trailer using the at least one light source.
- The processor is further configured to capture a current image of the cargo area of the trailer using the set of cameras.
- The processor is configured to compare each current image of an interior cavity with the corresponding baseline image of the cargo cavity.
- The processor determines a cargo status based upon a difference between the current image and the baseline image.
- The processor is also configured to send the cargo status to a dispatcher using a cellular modem.
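The patent does not publish the comparison itself; a minimal sketch of the kind of baseline-versus-current image difference it describes (function names, image representation, and the threshold value are illustrative assumptions, not taken from the disclosure) might look like:

```python
def cargo_status(baseline, current, threshold=0.08):
    """Compare a current grayscale image against the empty-state baseline.

    `baseline` and `current` are equal-sized 2-D lists of pixel
    intensities in [0, 255]. Returns "OCCUPIED" when the mean absolute
    per-pixel difference exceeds `threshold` (expressed as a fraction of
    full scale), otherwise "EMPTY". The threshold is a guessed value; a
    deployed system would calibrate it per trailer and lighting setup.
    """
    total, count = 0, 0
    for row_b, row_c in zip(baseline, current):
        for b, c in zip(row_b, row_c):
            total += abs(b - c)
            count += 1
    mean_diff = total / (count * 255.0)
    return "OCCUPIED" if mean_diff > threshold else "EMPTY"


# An empty trailer photographs close to its baseline...
empty = [[40, 42], [41, 40]]
print(cargo_status(empty, [[41, 43], [40, 41]]))    # EMPTY
# ...while loaded cargo changes a large share of pixels.
print(cargo_status(empty, [[200, 42], [41, 210]]))  # OCCUPIED
```

A real implementation would likely operate per camera and per region, mirroring the claim's comparison of "each current image" with its corresponding baseline.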
- The sensor array may be affixed to an upper corner of the trailer.
- The sensor array may alternatively be affixed to a middle top-section of the trailer, such that the sensor array is placed in a housing separate from the cargo area on an exterior face of the trailer.
- The light source may be a light-emitting diode associated with each camera of the set of cameras.
- Each camera of the set of cameras may automatically take a photograph of the cargo area in its view upon an occurrence of the triggering event.
- The triggering event may be a trailer opening event, a trailer closing event, a motion detection event (through a global positioning device and a motion sensor in the trailer), a stopping event, a time-based event, a geographic-location based event, and/or a velocity based event.
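The triggering-event types above could be modeled as an enumeration with a detection routine that maps raw sensor readings to events; this is a sketch under assumed inputs (the parameter names, the door-state encoding, and the one-hour default interval are all illustrative, not from the patent):

```python
from enum import Enum, auto

class TriggerEvent(Enum):
    # Event types enumerated in the disclosure.
    TRAILER_OPENING = auto()
    TRAILER_CLOSING = auto()
    MOTION_DETECTED = auto()
    STOPPING = auto()
    TIME_BASED = auto()
    GEO_LOCATION = auto()
    VELOCITY = auto()

def detect_trigger(door_change, speed_kmh, prev_speed_kmh,
                   seconds_since_last_capture, capture_interval_s=3600):
    """Map raw sensor readings to zero or more triggering events.

    `door_change` is "opened", "closed", or None; speeds come from the
    global positioning device. Returns the list of events that fired.
    """
    events = []
    if door_change == "opened":
        events.append(TriggerEvent.TRAILER_OPENING)
    elif door_change == "closed":
        events.append(TriggerEvent.TRAILER_CLOSING)
    if prev_speed_kmh > 0 and speed_kmh == 0:
        events.append(TriggerEvent.STOPPING)
    if seconds_since_last_capture >= capture_interval_s:
        events.append(TriggerEvent.TIME_BASED)
    return events
```

Each returned event would then start the illuminate-capture-compare sequence described in the claims.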
- The sensor array may include a backup camera to observe a rear area of the trailer of the semi-trailer truck.
- The backup camera may be mounted to the sensor array.
- The backup camera may view a door of the trailer, a loading area of the trailer, and/or an area behind the trailer.
- A driver of the trailer may view a video feed from the backup camera using a wired connection and/or a wireless connection between the backup camera and a display in a cabin area of the semi-trailer truck.
- The field of view of each camera of the set of cameras may partially overlap with the field of view of another camera of the set of cameras.
- The sensor array may be powered by a battery, the semi-trailer truck, and/or a solar array mounted on the trailer.
- The sensor array may generate a composite view of the cargo area using the set of cameras.
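The patent does not specify how the overlapping frames are merged into the composite view; a deliberately simplified sketch (assuming a fixed, pre-calibrated horizontal overlap between adjacent cameras, which real stitching would estimate from the images) could be:

```python
def composite_view(images, overlap_px=1):
    """Stitch same-height grayscale frames left-to-right into one view.

    Each image is a 2-D list (rows of pixel intensities). Adjacent
    cameras are assumed to overlap horizontally by a fixed, calibrated
    number of columns (`overlap_px`); each frame after the first drops
    its overlapping left columns. A fixed overlap is an illustrative
    simplification of real multi-camera registration and blending.
    """
    rows = len(images[0])
    stitched = [list(row) for row in images[0]]
    for img in images[1:]:
        for r in range(rows):
            stitched[r].extend(img[r][overlap_px:])
    return stitched

left = [[1, 2, 3], [4, 5, 6]]
right = [[3, 7, 8], [6, 9, 10]]   # first column repeats left's last column
print(composite_view([left, right]))  # [[1, 2, 3, 7, 8], [4, 5, 6, 9, 10]]
```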
- The sensor array may communicate the composite view to the cabin area of the semi-trailer truck and/or to a central server communicatively coupled with the semi-trailer truck through an Internet network, using the processor and the memory of the semi-trailer truck.
- The cellular modem may periodically report a location of the semi-trailer truck, captured with a geographic positioning receiver, to the central server along with the composite view, using the processor and the memory.
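The periodic report sent over the cellular modem might be assembled as a small structured payload; the field names and JSON shape below are assumptions for illustration, as the patent only states that location, composite view, and cargo status reach the central server:

```python
import json
import time

def build_status_report(truck_id, cargo_status, lat, lon, composite_ref):
    """Assemble one periodic report for transmission to the central server.

    `composite_ref` stands in for however the composite view is carried
    (e.g., an upload URL or blob id); all names here are hypothetical.
    """
    return json.dumps({
        "truck_id": truck_id,
        "cargo_status": cargo_status,          # e.g. "EMPTY" / "OCCUPIED"
        "location": {"lat": lat, "lon": lon},  # from the GPS receiver
        "composite_view": composite_ref,
        "timestamp": int(time.time()),
    })

report = build_status_report("truck-042", "OCCUPIED", 37.77, -122.42, "img/123")
```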
- In another aspect, a trailer of a semi-trailer truck includes a sensor array affixed to a surface of the trailer to automatically determine whether a cargo area of the semi-trailer truck is occupied.
- The trailer of the semi-trailer truck further includes a set of cameras of the sensor array. Each camera of the set of cameras is recessed relative to an interior region of the cargo area and/or each of the set of cameras peers into the cargo area of the semi-trailer truck.
- A memory and a processor associated with the sensor array are configured to store one baseline image of the cargo area of the trailer when the trailer is in an empty state.
- The processor is configured to detect a triggering event and to illuminate the cargo area of the trailer using at least one light source.
- The processor is further configured to capture a current image of the cargo area of the trailer using the set of cameras.
- The processor compares each current image of an interior cavity with the corresponding baseline image of the cargo cavity.
- The processor of the sensor array is configured to determine a cargo status based upon a difference between the current image and the baseline image.
- The processor is configured to send the cargo status to a dispatcher using a cellular modem.
- In yet another aspect, a trailer of a semi-trailer truck includes a sensor array affixed to a surface of the trailer to automatically determine whether a cargo area of the semi-trailer truck is occupied.
- The trailer of the semi-trailer truck also includes a set of cameras of the sensor array. Each camera of the set of cameras is embedded in an individual recess of the sensor array such that each of the set of cameras is interior to a flush plane of the surface to prevent cargo from damaging the camera.
- Each of the set of cameras peers into the cargo area of the semi-trailer truck.
- The trailer of the semi-trailer truck further includes at least one light source to illuminate the cargo area.
- A memory and a processor associated with the sensor array are configured to store one baseline image of the cargo area of the trailer when the trailer is in an empty state.
- The processor is configured to detect a triggering event and/or to illuminate the cargo area of the trailer using the at least one light source.
- The processor is further configured to capture a current image of the cargo area of the trailer using the set of cameras.
- The processor is also configured to compare each current image of an interior cavity with the corresponding baseline image of the cargo cavity.
- The processor is configured to determine a cargo status based upon a difference between the current image and the baseline image.
- The processor is configured to send the cargo status to a dispatcher using a cellular modem.
- The sensor array includes a backup camera to observe a rear area of the trailer of the semi-trailer truck.
- The backup camera is mounted to the sensor array such that the backup camera views a door of the trailer, a loading area of the trailer, and/or an area behind the trailer.
- A driver of the trailer may view a video feed from the backup camera using a wired connection and/or a wireless connection between the backup camera and a display in a cabin area of the semi-trailer truck.
- the method, apparatus, and system disclosed herein may be implemented in any means for achieving various aspects, and may be executed in a form of a non-transitory machine-readable medium embodying a set of instructions that, when executed by a machine, cause the machine to perform any of the operations disclosed herein.
- Other features will be apparent from the accompanying drawings and from the detailed description that follows.
- FIG. 1A is an upper corner placement view of a sensor array affixed to an upper corner of a trailer of a semi-trailer truck to automatically determine whether a cargo area of the semi-trailer truck is occupied and to send the cargo status to a dispatcher using a cellular modem, according to one embodiment.
- FIG. 1B is a middle top placement view of the sensor array of FIG. 1 illustrating a set of cameras communicatively generating a composite view of the cargo area based on a triggering event, according to at least one embodiment.
- FIG. 2 is a backup camera view of the sensor array of FIG. 1 illustrating a backup camera mounted to the sensor array, enabling a driver of the trailer to view a video feed from the backup camera, according to at least one embodiment.
- FIG. 3 is a block diagram representing one embodiment of the sensor array of the trailer of the semi-trailer truck illustrated in FIG. 1.
- FIG. 4 is a composite view illustrating the overlapping distortion captured by each camera of the set of cameras of the sensor array of FIG. 1, providing the cargo status of the trailer, according to one embodiment.
- FIG. 5 is a table view illustrating the storing of an undistorted baseline image captured in an empty state of the trailer of FIG. 1 and the corresponding distorted image after occurrence of the triggering event for determining the cargo status, according to one embodiment.
- FIG. 6 is an exploded view of the triggering event algorithm of the sensor array of FIG. 1, according to one embodiment.
- FIG. 7 is a critical path view illustrating a flow, based on time, in which critical operations of the sensor array of FIG. 1 are established, according to one embodiment.
- FIG. 8 is a process flow diagram of the sensor array of FIG. 1 to determine the cargo status of the trailer of the semi-trailer truck of FIG. 1, according to one embodiment.
- FIG. 9 is a schematic diagram of exemplary data processing devices that can be used to implement the methods and systems disclosed herein, according to one embodiment.
- A trailer 102 of a semi-trailer truck 104 includes a sensor array 106 affixed to a surface 108 of the trailer 102 to automatically determine whether a cargo area 110 of the semi-trailer truck 104 is occupied.
- The trailer 102 of the semi-trailer truck 104 also includes a set of cameras 112 of the sensor array 106.
- Each camera of the set of cameras 112 is embedded in an individual recess 113 of the sensor array 106 such that each of the set of cameras 112 does not protrude from the sensor array 106 into the cargo area 110 and/or each of the set of cameras 112 peers into the cargo area 110 of the semi-trailer truck 104.
- The trailer 102 of the semi-trailer truck 104 further includes at least one light source 114 to illuminate the cargo area 110.
- A memory 116 and a processor 118 associated with the sensor array 106 are configured to store one baseline image 122 of the cargo area 110 of the trailer 102 when the trailer 102 is in an empty state.
- The processor 118 is configured to detect a triggering event 206 (e.g., using the triggering event algorithm 142 of the dispatch server 126) and to illuminate the cargo area 110 of the trailer 102 using the at least one light source 114.
- The processor 118 is further configured to capture a current image 144 of the cargo area 110 of the trailer 102 using the set of cameras 112.
- The processor 118 is configured to compare (e.g., using the difference algorithm 148 of the dispatch server 126) each current image 144 of an interior cavity 402 with the corresponding baseline image 122 of the cargo cavity.
- The processor 118 determines a cargo status 124 (e.g., using the cargo status algorithm 125 of the dispatch server 126) based upon a difference (e.g., using the difference algorithm 148 of the dispatch server 126) between the current image 144 and the baseline image 122.
- The processor 118 is also configured to send the cargo status 124 to a dispatcher 134 using a cellular modem 136.
- The sensor array 106 may be affixed to an upper corner of the trailer 102.
- The sensor array 106 may alternatively be affixed to a middle top-section of the trailer 102, such that the sensor array 106 is placed in a separate housing 138 from the cargo area 110 on an exterior face 140 of the trailer 102.
- The light source 114 may be a light-emitting diode associated with each camera of the set of cameras 112.
- Each camera of the set of cameras 112 may automatically take a photograph of the cargo area 110 in its view upon an occurrence of the triggering event 206.
- The triggering event 206 may be a trailer opening event, a trailer closing event, a motion detection event (through a global positioning device and a motion sensor in the trailer), a stopping event, a time-based event, a geographic-location based event, and/or a velocity based event.
- The sensor array 106 may include a backup camera 202 to observe a rear area 204 of the trailer 102 of the semi-trailer truck 104.
- The backup camera 202 may be mounted to the sensor array 106.
- The backup camera 202 may view a door of the trailer, a loading area of the trailer, and/or an area behind the trailer.
- A driver 208 of the trailer 102 may view a video feed 210 from the backup camera 202 using a wired connection and/or a wireless connection between the backup camera 202 and a display 212 in a cabin area 214 of the semi-trailer truck 104.
- The field of view 404 of each camera of the set of cameras 112 may partially overlap with the field of view 404 of another camera of the set of cameras 112.
- The sensor array 106 may be powered by a battery, the semi-trailer truck 104, and/or a solar array mounted on the trailer 102.
- The sensor array 106 may generate a composite view 146 of the cargo area 110 using the set of cameras 112.
- The sensor array 106 may communicate the composite view 146 to the cabin area 214 of the semi-trailer truck 104 and/or to a central server communicatively coupled with the semi-trailer truck 104 through an Internet network, using the processor 118 and the memory 116 of the semi-trailer truck 104.
- The cellular modem 136 may periodically report a location of the semi-trailer truck 104, captured with a geographic positioning receiver, to the central server along with the composite view 146, using the processor 118 and the memory 116.
- A trailer 102 of a semi-trailer truck 104 includes a sensor array 106 affixed to a surface 108 of the trailer 102 to automatically determine whether a cargo area 110 of the semi-trailer truck 104 is occupied.
- The trailer 102 of the semi-trailer truck 104 further includes a set of cameras 112 of the sensor array 106.
- Each camera of the set of cameras 112 is recessed relative to an interior region of the cargo area 110 and/or each of the set of cameras 112 peers into the cargo area 110 of the semi-trailer truck 104.
- A memory 116 and a processor 118 associated with the sensor array 106 are configured to store one baseline image 122 of the cargo area 110 of the trailer 102 when the trailer 102 is in an empty state.
- The processor 118 is configured to detect a triggering event 206 (e.g., using the triggering event algorithm 142 of the dispatch server 126) and to illuminate the cargo area 110 of the trailer 102 using at least one light source 114.
- The processor 118 is further configured to capture a current image 144 of the cargo area 110 of the trailer 102 using the set of cameras 112.
- The processor 118 compares (e.g., using the difference algorithm 148 of the dispatch server 126) each current image 144 of an interior cavity 402 with the corresponding baseline image 122 of the cargo cavity.
- The processor 118 associated with the sensor array 106 is configured to determine a cargo status 124 (e.g., using the cargo status algorithm 125 of the dispatch server 126) based upon a difference (e.g., using the difference algorithm 148 of the dispatch server 126) between the current image 144 and the baseline image 122. Furthermore, the processor 118 is configured to send the cargo status 124 to a dispatcher 134 using a cellular modem 136.
- a trailer 102 of a semi-trailer truck 104 includes a sensor array 106 affixed to a surface 108 of the trailer 102 to automatically determine whether a cargo area 110 of the semi-trailer truck 104 is occupied.
- the trailer 102 of the semi-trailer truck 104 also includes a set of cameras 112 of the sensor array 106 .
- Each camera of the set of cameras 112 is each embedded in individual recess(es) 113 of the sensor array 106 such that each of the set of cameras 112 are interior to a flush plane of the surface 108 to prevent cargo from damaging each camera.
- Each of the set of cameras 112 peers into the cargo area 110 of the semi-trailer truck 104 .
- the trailer 102 of the semi-trailer truck 104 further includes at least one light source 114 to illuminate the cargo area 110 .
- a memory 116 and a processor 118 associated with the sensor array 106 are configured to store one baseline image 122 of the cargo area 110 of the trailer 102 when the trailer 102 is in an empty state.
- the processor 118 is configured to detect a triggering event 206 and/or to illuminate the cargo area 110 of the trailer 102 using at least one light source 114 .
- the processor 118 is further configured to capture a current image 144 of the cargo area 110 of the trailer 102 using the set of cameras 112 .
- the processor 118 is also configured to compare each current image 144 of an interior cavity 402 with the corresponding baseline image 122 of a cargo cavity.
- the processor 118 is configured to determine a cargo status 124 (e.g., using the cargo status algorithm 125 of the dispatch server 126 ) based upon a difference 148 (e.g., using the difference algorithm 148 of the dispatch server 126 ) between the current image 144 and the baseline image 122 . Also, the processor 118 is configured to send the cargo status 124 to a dispatcher 134 using a cellular modem 136 .
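The compare-and-classify step described above can be sketched as follows. The mean-absolute-difference metric, the threshold value, and the flat-list image representation are illustrative assumptions; the embodiments do not fix a specific comparison method.

```python
# Hedged sketch of the difference step (elements 144, 122, 148) and the
# cargo status determination (element 124). Images are modeled as flat
# lists of 0-255 grayscale values purely for illustration.

def image_difference(current, baseline):
    """Mean absolute per-pixel difference between two equally sized images."""
    assert len(current) == len(baseline)
    total = sum(abs(c - b) for c, b in zip(current, baseline))
    return total / len(current)

def cargo_status(current, baseline, threshold=10.0):
    """Return 'OCCUPIED' when the current image 144 deviates from the
    baseline image 122 by more than the threshold, else 'EMPTY'."""
    return "OCCUPIED" if image_difference(current, baseline) > threshold else "EMPTY"

baseline = [50] * 64                 # empty-trailer reference image 122
unchanged = [51] * 64                # sensor noise only
loaded = [50] * 32 + [200] * 32     # cargo occludes half the frame

print(cargo_status(unchanged, baseline))  # EMPTY
print(cargo_status(loaded, baseline))     # OCCUPIED
```

The threshold keeps small sensor noise from being misread as cargo; a production system would tune it per camera and lighting condition.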
- the sensor array 106 includes a backup camera 202 to observe a rear area 204 of the trailer 102 of the semi-trailer truck 104 .
- the backup camera 202 is mounted to the sensor array 106 such that the backup camera 202 views (e.g., using the triggering event algorithm 142 of the dispatch server 126 ) a door of the trailer, a loading area of the trailer, and/or an area behind the trailer.
- a driver 208 of the trailer 102 may view a video feed 210 from the backup camera 202 using a wired connection and/or a wireless connection between the backup camera 202 and a display 212 in a cabin area 214 of the semi-trailer truck 104 .
- FIG. 1A is an upper corner placement view 150 A of a sensor array illustrating the sensor array 106 affixed to an upper corner of a trailer 102 of a semi-trailer truck 104 to automatically determine whether a cargo area 110 of the semi-trailer truck 104 is occupied and to send the cargo status 124 to a dispatcher 134 using a cellular modem 136 , according to one embodiment.
- FIG. 1A illustrates the trailer 102 , a network 101 , the semi-trailer truck 104 , the sensor array 106 , the surface 108 , the cargo area 110 , a set of cameras 112 , a recess 113 , a light source 114 , projection areas 115 , a memory 116 , a processor 118 , a database 120 , a baseline image 122 , a cargo status 124 , a cargo status algorithm 125 , a dispatch server 126 , a dispatch server memory 128 , a dispatch server processor 130 , a dispatch server database 132 , a dispatcher 134 , a user device 135 , and a cellular modem 136 , according to one embodiment.
- the trailer 102 may be a nonmotorized vehicle designed to be hauled by a motor vehicle (e.g., a truck, utility vehicles, and/or a tractor).
- the network 101 may be a group of computing devices (e.g., hardware and software) that are linked together through communication channels to facilitate communication and resource-sharing among a wide range of entities (e.g., dispatcher 134 ).
- the semi-trailer truck 104 may be a large vehicle that consists of a towing engine, known as a tractor and/or a truck, attached to one or more semi-trailers to carry freight, according to one embodiment.
- the sensor array 106 may be a device in the form of a bar and/or a series of bars that may be affixed to a wall and/or upright supports (e.g., a surface 108 of the trailer 102 ) which detects or measures a physical property (e.g., light, heat, motion, moisture, pressure, or any one of a great number of other environmental phenomena) of the occupancy inside the trailer 102 and records, indicates, and/or otherwise responds to it as an output.
- the sensor array 106 may hold a single camera or may hold multiple cameras.
- the sensor array 106 may be connected through a wired and/or wireless networking topology.
- cameras are positioned in different locations of the trailer 102 individually, and the sensor array 106 provides a housing in which to communicatively couple the sensor array 106 to the trailer without the need of a separate rail.
- the sensor array 106 includes multiple cameras on a single sensor rail.
- the sensor array 106 may include optional temperature, humidity, and/or pressure sensing in addition to visual sensing to determine general conditions in which cargo is housed inside the trailer 102 .
- the output may be generally a signal that is converted to human-readable display at the sensor location or transmitted electronically over the network 101 for reading or further processing to determine the cargo status 124 of the trailer 102 .
- the surface 108 may be the uppermost layer of the wall or ceiling of the trailer 102 on which the sensor array 106 is affixed.
- the cargo area 110 may be the space inside the trailer 102 of the semi-trailer truck 104 where the goods are kept for freighting, according to one embodiment.
- the set of cameras 112 may be a group and/or a collection of a number of cameras that may be used for recording visual images of the inside of the trailer 102 in the form of photographs, film, or video signals.
- the recess 113 may be a small space created by building part of a wall of the trailer 102 further back from the rest so as to affix the set of cameras 112 of the sensor array 106 .
- the light source 114 may be any device serving as a source of illumination to make things visible inside the trailer 102 .
- the projection areas 115 may be the extent or measurement covered by each camera of the set of cameras 112 to capture visual images of the inside of the trailer 102 in the form of photographs, film, or video signals, according to one embodiment.
- the memory 116 may be an electronic holding place for instructions and data that the processor 118 of the sensor array 106 can reach quickly.
- the processor 118 may be a logic circuitry that responds to and processes the basic instructions that drives the sensor array 106 for monitoring the semi-trailer truck 104 .
- the database 120 may be a structured collection of information collected by the set of cameras 112 that is organized to be easily accessed, managed, and/or updated by the dispatcher 134 .
- the baseline image 122 may be a visual representation of the inside of the cargo area 110 of the trailer 102 at an empty state.
- the cargo status 124 may be the present situation of the cargo area 110 in terms of occupancy of goods in the trailer 102 as captured by the set of cameras 112 .
- the cargo status algorithm 125 may be a process or set of rules to be followed in calculations or other problem-solving operations for identifying the occupancy of goods in the cargo area 110 of the trailer 102 .
- the dispatch server 126 may be a computer system that provides local area networking services to multiple users (e.g., dispatcher 134 ) to send off the cargo to its respective destination by managing resources and services of the network 101 , while handling requests by the dispatcher 134 from different computers to access the said resources, according to one embodiment.
- the dispatch server memory 128 may be an electronic holding place for instructions and data that the dispatch server processor 130 can reach quickly.
- the dispatch server processor 130 may be a logic circuitry that responds to and processes the basic instructions that drives the dispatch server 126 for monitoring the semi-trailer truck 104 .
- the dispatch server database 132 may be a collection of information that is organized to be easily accessed, managed, and/or updated by the dispatcher 134 , according to one embodiment.
- the dispatcher 134 may be the personnel responsible for overseeing operations by receiving and transmitting reliable messages, tracking vehicles and equipment, and recording other important information regarding the cargo status 124 (e.g., using the cargo status algorithm 125 of the dispatch server 126 ) of the semi-trailer truck 104 .
- the user device 135 may be a computing device that enables the dispatcher 134 to communicate with the dispatch server 126 through the network 101 .
- the cellular modem 136 may be a device that adds wireless 3G or 4G (LTE) connectivity to a laptop or a desktop computer in order to send the cargo status 124 to the dispatcher 134 , according to one embodiment.
- FIG. 1A illustrates a sensor array 106 affixed to an upper corner of the trailer 102 .
- the sensor array 106 includes a set of cameras 112 . Each camera is embedded in an individual recess 113 of the sensor array 106 . At least one light source 114 is coupled with each of the set of cameras 112 .
- the sensor array 106 is communicatively coupled to a dispatch server 126 through the network 101 .
- the dispatch server 126 includes a dispatch server database 132 coupled with a dispatch server processor 130 and dispatch server memory 128 , according to one embodiment.
- the dispatch server 126 is communicatively coupled to the user device 135 through the network 101 .
- the sensor array 106 is communicatively coupled to the dispatch server 126 through a cellular modem 136 , according to one embodiment.
- the cargo status 124 may be automatically determined using the sensor array 106 .
- the sensor array 106 is affixed to the upper corner of the trailer 102 .
- each camera is embedded in an individual recess 113 of the sensor array 106 .
- at least one light source 114 illuminates the cargo area 110 associated with each camera of the set of cameras 112 .
- a baseline image 122 captured by the set of cameras 112 is communicated to the dispatch server 126 .
- cargo status 124 is communicated to the dispatcher 134 through the cellular modem 136 , according to one embodiment.
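The report sent to the dispatcher 134 over the cellular modem 136 can be sketched as a small serialized message. The JSON wire format and field names here are assumptions for illustration; the embodiments do not define a transport payload.

```python
# Illustrative sketch of a cargo-status message for the dispatch server 126.
# Field names ("trailer_id", "cargo_status", "timestamp") are hypothetical.
import json

def build_status_report(trailer_id, cargo_status, timestamp):
    """Serialize a cargo status 124 report for transmission over the modem 136."""
    return json.dumps({
        "trailer_id": trailer_id,
        "cargo_status": cargo_status,   # e.g., "EMPTY" or "OCCUPIED"
        "timestamp": timestamp,
    }, sort_keys=True)

report = build_status_report("TRAILER-1", "OCCUPIED", "2015-04-08T12:00:00Z")
print(report)
```

A compact text payload like this suits a low-bandwidth 3G/4G link, since only the status and identifiers (not the images themselves) need to reach the dispatcher.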
- FIG. 1B is a middle-top placement view 150 B of the sensor array 106 of FIG. 1 illustrating a set of cameras 112 communicatively generating a composite view 146 of the cargo area 110 based on a triggering event (e.g., using the triggering event algorithm 142 of the dispatch server 126 ), according to one embodiment.
- FIG. 1B illustrates a separate housing 138 , an exterior face 140 , a triggering event algorithm 142 , a current image 144 , a composite view 146 , and a difference algorithm 148 , according to one embodiment.
- the separate housing 138 may be a discrete rigid casing that encloses and protects the various components of the sensor array 106 .
- the exterior face 140 may be the outermost part of the middle-top section of the trailer 102 on which the sensor array 106 is affixed.
- the triggering event algorithm 142 may be a process or set of rules to be followed in calculations or other problem-solving operations for identifying the occurrence of a trailer opening event, a trailer closing event, a motion detection event (e.g., using a global positioning device and/or a motion sensor), a stopping event, a time-based event, a geographic-location based event, and/or a velocity based event of the trailer 102 of semi-trailer truck 104 , according to one embodiment.
- the current image 144 may be the present visual representation of the inside of the cargo area 110 of the trailer 102 after occurrence of the triggering event.
- the composite view 146 may be a combined visual representation of the inside of the cargo area 110 of the trailer 102 captured by the set of cameras 112 after occurrence of the triggering event.
- the difference algorithm 148 may be a process or set of rules to be followed in calculations or other problem-solving operations for identifying the distinctness or dissimilarity of the composite view 146 of the cargo area 110 after occurrence of the triggering event from the baseline image 122 of the cargo area 110 at an empty state, according to one embodiment.
- FIG. 1B illustrates a sensor array 106 affixed to a middle-top section of the trailer 102 on the exterior face 140 .
- the sensor array 106 is placed in a separate housing 138 from the cargo area, according to one embodiment.
- the cargo status 124 based on a triggering event may be automatically determined using the sensor array 106 .
- a current image 144 captured by the set of cameras 112 is communicated to the dispatch server 126 .
- the composite view 146 is communicated to the dispatch server 126 .
- FIG. 2 is a backup camera view 250 illustrating a backup camera 202 mounted to the sensor array of FIG. 1 enabling a driver 208 of the trailer 102 to view a video feed 210 from the backup camera 202 , according to one embodiment.
- FIG. 2 illustrates a backup camera 202 , a rear area 204 , a triggering event 206 , a driver 208 , a video feed 210 , a display 212 , and a cabin area 214 , according to one embodiment.
- the backup camera 202 may be a camera used for recording visual images of the rear area 204 of the trailer 102 in the form of photographs, film, or video signals.
- the rear area 204 may be the back part of the trailer 102 (e.g., a door of the trailer, a loading area of the trailer, and/or an area behind the trailer).
- the triggering event 206 may be a situation (e.g., a trailer opening event, a trailer closing event, a motion detection event, a stopping event, a time-based event, a geographic-location based event, and/or a velocity based event) to cause the set of cameras 112 of the sensor array 106 to record the visual images of the inside of the cargo area 110 .
- the driver 208 may be the person driving the semi-trailer truck 104 .
- the video feed 210 may be a sequence of images from the set of cameras processed electronically into an analog or digital format and displayed on a display 212 with sufficient rapidity so as to create the illusion of motion and continuity.
- the display 212 may be a computer output surface and projecting mechanism that shows video feed 210 or graphic images to the driver 208 , using a cathode ray tube (CRT), liquid crystal display (LCD), light-emitting diode, gas plasma, or other image projection technology.
- the cabin area 214 may be the private compartment for the driver 208 in the front portion of the semi-trailer truck 104 , according to one embodiment.
- FIG. 2 illustrates a backup camera 202 mounted to the sensor array 106 to observe the rear area 204 of the trailer 102 of the semi-trailer truck 104 , according to one embodiment.
- the triggering event is communicated to the processor 118 .
- the projection area 115 in the rear area 204 of the trailer 102 is captured by the backup camera 202 .
- the video feed 210 is sent to the driver 208 using a wired connection and/or a wireless connection of the sensor array 106 , according to one embodiment.
- FIG. 3 is a block diagram 350 representing one embodiment of the sensor array 106 of the trailer of semi-trailer truck 104 illustrated in FIG. 1 .
- the sensor array 106 includes a set of cameras 112 associated with a light source 114 .
- the sensor array 106 of the trailer of semi-trailer truck 104 further includes a processor 118 , a database 120 and a memory 116 .
- the processor 118 of the sensor array 106 may be configured to capture the baseline image 122 using the set of cameras 112 .
- the light source 114 associated with each of the set of cameras 112 illuminates the inside cavity of the cargo area 110 .
- the processor 118 identifies the triggering event (e.g., using the triggering event algorithm 142 of the dispatch server 126 ) caused by a trailer opening event, a trailer closing event, a motion detection event, a stopping event, a time-based event, a geographic-location based event, and/or a velocity based event.
- a current image 144 is captured by each of the set of cameras 112 .
- a composite view 146 is generated based on the current image 144 captured by each of the set of cameras 112 .
- the composite view 146 and the baseline image 122 is compared to conclude the cargo status 124 (e.g., using the cargo status algorithm 125 of the dispatch server 126 ) of the trailer 102 .
- the cargo status 124 is communicated to the dispatcher 134 , according to one embodiment.
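The composite-view step in the block diagram above can be sketched as joining the per-camera frames into one view of the cargo area. Representing each frame as a list of pixel rows is an assumption made for illustration; actual frames would be camera images.

```python
# Minimal sketch of composite view 146 generation: frames from cameras
# 112A-D are joined side by side, row by row, into one view.

def composite_view(frames):
    """Join equally tall frames left-to-right into a single wide frame."""
    height = len(frames[0])
    assert all(len(f) == height for f in frames), "frames must share one height"
    return [sum((f[r] for f in frames), []) for r in range(height)]

cam_a = [[1, 2], [3, 4]]   # 2x2 frame from camera 112A
cam_b = [[5, 6], [7, 8]]   # 2x2 frame from camera 112B
print(composite_view([cam_a, cam_b]))  # [[1, 2, 5, 6], [3, 4, 7, 8]]
```

A real implementation would also blend the overlapping distortion 406 regions where adjacent fields of view 404 intersect rather than simply concatenating.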
- FIG. 4 is a composite view 450 illustrating the overlapping distortion 406 captured by each camera 112 A-D of the set of cameras 112 of the sensor array 106 of FIG. 1 providing the cargo status 124 of the trailer 102 , according to one embodiment.
- FIG. 4 illustrates an interior cavity 402 , a field of view 404 , and an overlapping distortion 406 .
- the interior cavity 402 may be an empty space inside the trailer 102 of the semi-trailer truck where the cargo is kept for dispatch.
- the field of view 404 may be the extent or measurement covered by each camera of the set of cameras 112 to capture visual images of the inside of the trailer 102 in the form of photographs, film, or video signals.
- the overlapping distortion 406 may be the covering or extension of field of view 404 of one camera over the field of view 404 of its adjoining camera of the set of cameras 112 of the sensor array 106 , according to one embodiment.
- composite view 450 illustrates an example embodiment of the sensor array 106 running the length of the trailer 102 , with an embedded set of cameras 112 , electronics, wiring, an LED light source, and other sensors mounted on the ceiling.
- Each camera looks for distortion from the reference baseline image 122 . No distortion from any of the cameras indicates that the trailer is empty.
- Overlapping distortion 406 provides information on the extent of quadrant load in each of the projection areas 115 A-E.
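The quadrant-load reading described above can be sketched by checking each projection area against its own baseline; the areas that show distortion indicate where the trailer is loaded. The per-area threshold and the flat-list image representation are illustrative assumptions.

```python
# Hedged sketch of per-area distortion detection: each projection area
# (labeled A-E here) is compared to its baseline, and the distorted areas
# approximate the loaded portion of the trailer 102.

def distorted_areas(current_by_area, baseline_by_area, threshold=10.0):
    """Return labels of projection areas whose current image deviates
    from the baseline by more than the threshold."""
    loaded = []
    for label, current in current_by_area.items():
        baseline = baseline_by_area[label]
        diff = sum(abs(c - b) for c, b in zip(current, baseline)) / len(current)
        if diff > threshold:
            loaded.append(label)
    return sorted(loaded)

baseline = {area: [50] * 16 for area in "ABCDE"}
current = dict(baseline, A=[200] * 16, B=[180] * 16)  # cargo under areas A and B
print(distorted_areas(current, baseline))  # ['A', 'B']
```

Two of five areas distorted would correspond to a roughly 40% quadrant load in this toy model.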
- FIG. 5 is a table view illustrating the storing of undistorted baseline image 122 captured at empty state of the trailer 102 of FIG. 1 and the corresponding distorted image after occurrence of the triggering event 206 for determining the cargo status 124 , according to one embodiment.
- FIG. 5 is a table view 550 showing the fields associated with the dispatcher 134 , a trailer 102 field, a set of cameras 112 field, a baseline image distortion 502 field, a triggering event 206 field, distortion in current image 504 field, and a cargo status 124 field, according to one embodiment.
- FIG. 5 illustrates an example of two records for a dispatcher 134 with two trailers, each trailer 102 having a sensor array 106 with a set of cameras 112 affixed to it.
- the baseline image(s) 122 captured in the empty state of trailers 1 and 2 show no distortion, as shown in the 502 field.
- the triggering event 206 caused by the trailer opening event in trailer 1 depicts a distortion in current image 504 captured by camera 112 A of trailer 1 .
- the triggering event 206 caused by the velocity based event in trailer 2 depicts a distortion in current image 504 captured by camera 112 A-C of trailer 2 .
- FIG. 6 is an exploded view of the triggering event algorithm 142 of the sensor array 106 of FIG. 1 , according to one embodiment. Particularly, FIG. 6 illustrates a trailer opening event module 602 , a trailer closing event module 604 , a time-based event module 606 , a motion detection event module 608 , a stopping event module 610 , a geographic-location based event module 612 , and a velocity based event module 614 , according to one embodiment.
- the trailer opening event module 602 may be a part and/or a separate unit of a program of the triggering event algorithm 142 that assists in identifying the occurrence of a trailer opening event in order to activate the set of cameras 112 to capture the current image 144 .
- the trailer closing event module 604 may be a part and/or a separate unit of a program of the triggering event algorithm 142 that assists in identifying the occurrence of a trailer closing event in order to activate the set of cameras 112 to capture the current image 144 .
- the time-based event module 606 may be a part and/or a separate unit of a program of the triggering event algorithm 142 that assists in identifying the occurrence of an event based on time, according to one embodiment.
- the motion detection event module 608 may be a part and/or a separate unit of a program of the triggering event algorithm 142 to detect motion of the semi-trailer truck 104 .
- the stopping event module 610 may be a part and/or a separate unit of a program of the triggering event algorithm 142 that assists in identifying the stopping of the semi-trailer truck 104 .
- the geographic-location based event module 612 may be a part and/or a separate unit of a program of the triggering event algorithm 142 that assists in identifying the occurrence of a situation based on the geographic-location of the semi-trailer truck 104 .
- the velocity based event module 614 may be a part and/or a separate unit of a program of the triggering event algorithm 142 that assists in identifying the occurrence of a situation based on the velocity of the semi-trailer truck 104 , according to one embodiment.
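The modular structure of the triggering event algorithm 142 above can be sketched as a table of predicates over a vehicle-state snapshot, where any module firing yields a triggering event 206. The state fields, the thresholds, and the three modules shown are illustrative assumptions; the full algorithm has seven modules.

```python
# Hedged sketch of three of the event modules (602, 610, 614) as predicates.

def trailer_opening_event(state):
    """Fires on a closed-to-open door transition (module 602)."""
    return state["door_open"] and not state["door_was_open"]

def stopping_event(state):
    """Fires when a moving truck comes to rest (module 610)."""
    return state["speed_mph"] == 0 and state["was_moving"]

def velocity_based_event(state):
    """Fires above an illustrative speed threshold (module 614)."""
    return state["speed_mph"] > 70

EVENT_MODULES = {
    "trailer_opening": trailer_opening_event,
    "stopping": stopping_event,
    "velocity_based": velocity_based_event,
}

def detect_triggering_events(state):
    """Return the names of all event modules that fire for this snapshot."""
    return sorted(name for name, check in EVENT_MODULES.items() if check(state))

state = {"speed_mph": 0, "was_moving": True,
         "door_open": True, "door_was_open": False}
print(detect_triggering_events(state))  # ['stopping', 'trailer_opening']
```

Keeping each event as an independent module mirrors the exploded view of FIG. 6: new trigger conditions can be added to the table without touching the capture-and-compare logic.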
- FIG. 7 is a critical path view illustrating a flow based on time in which critical operations of the sensor array of FIG. 1 are established, according to one embodiment.
- the dispatcher 134 affixes a sensor array 106 to a surface 108 of a trailer 102 of a semi-trailer truck 104 .
- each camera of the set of cameras 112 of the sensor array 106 peers into the cargo area 110 of the semi-trailer truck 104 .
- the dispatcher 134 configures a memory 116 and a processor 118 to store at least one baseline image 122 of the cargo area 110 of the trailer 102 when the trailer 102 is in an empty state.
- the dispatcher 134 configures the processor 118 to detect a triggering event 206 .
- the dispatcher 134 configures the processor 118 to illuminate the cargo area 110 of the trailer 102 using at least one light source 114 .
- the sensor array 106 captures a current image 144 of the cargo area 110 of the trailer 102 using at least one of the set of cameras 112 .
- the sensor array 106 compares each current image 144 of the interior cavity with the corresponding baseline image 122 of the cargo cavity.
- the sensor array 106 determines a cargo status 124 based upon a difference between the current image 144 and the baseline image 122 .
- the sensor array 106 sends the cargo status 124 to a dispatcher 134 using a cellular modem 136 , according to one embodiment.
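The critical path above (trigger, capture, compare, report) can be sketched end to end with the cameras, baselines, and modem modeled as simple callables. All component interfaces here are assumptions made for illustration.

```python
# Hedged end-to-end sketch of the sensor array 106 critical path.

class SensorArray:
    def __init__(self, cameras, baseline_images, send, threshold=10.0):
        self.cameras = cameras            # callables returning one frame each
        self.baselines = baseline_images  # one baseline image 122 per camera
        self.send = send                  # cellular-modem transport 136
        self.threshold = threshold

    def on_triggering_event(self):
        """Capture current images 144, compare each to its baseline 122,
        and report the cargo status 124 to the dispatcher 134."""
        occupied = False
        for camera, baseline in zip(self.cameras, self.baselines):
            frame = camera()
            diff = sum(abs(c - b) for c, b in zip(frame, baseline)) / len(frame)
            occupied = occupied or diff > self.threshold
        status = "OCCUPIED" if occupied else "EMPTY"
        self.send(status)
        return status

sent = []
array = SensorArray(
    cameras=[lambda: [200] * 16, lambda: [50] * 16],  # one camera sees cargo
    baseline_images=[[50] * 16, [50] * 16],
    send=sent.append,
)
print(array.on_triggering_event())  # OCCUPIED
```

Distortion in any single camera's projection area is enough to mark the trailer occupied, matching the per-camera comparison described for FIG. 4.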
- FIG. 8 is a process flow diagram of the sensor array 106 of FIG. 1 to determine the cargo status 124 of the trailer 102 of the semi-trailer truck 104 of FIG. 1 , according to one embodiment.
- a sensor array 106 is affixed to a surface 108 of a trailer 102 to automatically determine whether a cargo area 110 of the semi-trailer truck 104 is occupied.
- each camera 112 A-D of the set of cameras 112 of the sensor array 106 peers into the cargo area 110 of the semi-trailer truck 104 .
- a memory 116 and a processor 118 associated with the sensor array 106 are configured to store at least one baseline image 122 of the cargo area 110 of the trailer 102 when the trailer 102 is in an empty state.
- the processor 118 is configured to detect a triggering event 206 .
- the cargo area 110 is illuminated using at least one light source 114 .
- a current image 144 of the cargo area 110 of the trailer 102 is captured using at least one of the set of cameras 112 , according to one embodiment.
- each current image 144 of the interior cavity is compared with the corresponding baseline image 122 of the cargo cavity.
- a cargo status 124 is determined based upon a difference between the current image 144 and the baseline image 122 .
- the cargo status 124 is sent to a dispatcher 134 using a cellular modem 136 , according to one embodiment.
- FIG. 9 is a schematic diagram of a generic computing device 900 that can be used to implement the methods and systems disclosed herein, according to one or more embodiments.
- FIG. 9 is a schematic diagram of a generic computing device 900 and a generic mobile computing device 930 that can be used to perform and/or implement any of the embodiments disclosed herein.
- dispatch server 126 and/or user device 135 of FIG. 1A may be the generic computing device 900 .
- the generic computing device 900 may represent various forms of digital computers, such as laptops, desktops, workstations, personal digital assistants, servers, blade servers, mainframes, and/or other appropriate computers.
- the generic mobile computing device 930 may represent various forms of mobile devices, such as smartphones, camera phones, personal digital assistants, cellular telephones, and other similar mobile devices.
- the components shown here, their connections, couples, and relationships, and their functions, are meant to be exemplary only, and are not meant to limit the embodiments described and/or claimed, according to one embodiment.
- the generic computing device 900 may include a processor 902 , a memory 904 , a storage device 906 , a high speed interface 908 coupled to the memory 904 and a plurality of high speed expansion ports 910 , and a low speed interface 912 coupled to a low speed bus 914 and a storage device 906 .
- each of the components heretofore may be inter-coupled using various buses, and may be mounted on a common motherboard and/or in other manners as appropriate.
- the processor 902 may process instructions for execution in the generic computing device 900 , including instructions stored in the memory 904 and/or on the storage device 906 to display graphical information for a GUI on an external input/output device, such as a display unit 916 coupled to the high speed interface 908 , according to one embodiment.
- multiple processors and/or multiple buses may be used, as appropriate, along with multiple memories and/or types of memory.
- a plurality of computing devices 900 may be coupled together, with each device providing portions of the necessary operations (e.g., as a server bank, a group of blade servers, and/or a multi-processor system).
- the memory 904 may be coupled to the generic computing device 900 .
- the memory 904 may be a volatile memory.
- the memory 904 may be a non-volatile memory.
- the memory 904 may also be another form of computer-readable medium, such as a magnetic and/or an optical disk.
- the storage device 906 may be capable of providing mass storage for the generic computing device 900 .
- the storage device 906 may include a floppy disk device, a hard disk device, an optical disk device, a tape device, a flash memory, and/or other similar solid state memory device.
- the storage device 906 may be a computer-readable medium, such as one of the devices mentioned heretofore, and/or an array of devices, including devices in a storage area network and/or other configurations.
- a computer program may be comprised of instructions that, when executed, perform one or more methods, such as those described above.
- the instructions may be stored in the memory 904 , the storage device 906 , a memory coupled to the processor 902 , and/or a propagated signal.
- the high speed interface 908 may manage bandwidth-intensive operations for the generic computing device 900 , while the low speed interface 912 may manage lower bandwidth-intensive operations. Such allocation of functions is exemplary only.
- the high speed interface 908 may be coupled to the memory 904 , the display unit 916 (e.g., through a graphics processor and/or an accelerator), and to the plurality of high speed expansion ports 910 , which may accept various expansion cards.
- the low speed interface 912 may be coupled to the storage device 906 and the low speed bus 914 .
- the low speed bus 914 may be comprised of a wired and/or wireless communication port (e.g., a Universal Serial Bus (“USB”), a Bluetooth® port, an Ethernet port, and/or a wireless Ethernet port).
- the low speed bus 914 may also be coupled to the scan unit 928 , a printer 926 , a keyboard, a mouse 924 , and a networking device (e.g., a switch and/or a router) through a network adapter.
- the generic computing device 900 may be implemented in a number of different forms, as shown in the figure.
- the computing device 900 may be implemented as a standard server 918 and/or a group of such servers.
- the generic computing device 900 may be implemented as part of a rack server system 922 .
- the generic computing device 900 may be implemented as a general computer 920 such as a laptop or desktop computer.
- a component from the generic computing device 900 may be combined with another component in a generic mobile computing device 930 .
- an entire system may be made up of a plurality of generic computing device 900 and/or a plurality of generic computing device 900 coupled to a plurality of generic mobile computing device 930 .
- the generic mobile computing device 930 may include a mobile compatible processor 932 , a mobile compatible memory 934 , and an input/output device such as a mobile display 946 , a communication interface 952 , and a transceiver 938 , among other components.
- the generic mobile computing device 930 may also be provided with a storage device, such as a microdrive or other device, to provide additional storage.
- the components indicated heretofore are inter-coupled using various buses, and several of the components may be mounted on a common motherboard.
- the mobile compatible processor 932 may execute instructions in the generic mobile computing device 930 , including instructions stored in the mobile compatible memory 934 .
- the mobile compatible processor 932 may be implemented as a chipset of chips that include separate and multiple analog and digital processors.
- the mobile compatible processor 932 may provide, for example, for coordination of the other components of the generic mobile computing device 930 , such as control of user interfaces, applications run by the generic mobile computing device 930 , and wireless communication by the generic mobile computing device 930 .
- the mobile compatible processor 932 may communicate with a user through the control interface 936 and the display interface 944 coupled to a mobile display 946 .
- the mobile display 946 may be a Thin-Film-Transistor Liquid Crystal Display (“TFT LCD”), an Organic Light Emitting Diode (“OLED”) display, and another appropriate display technology.
- the display interface 944 may comprise appropriate circuitry for driving the mobile display 946 to present graphical and other information to a user.
- the control interface 936 may receive commands from a user and convert them for submission to the mobile compatible processor 932 .
- an external interface 942 may be provided in communication with the mobile compatible processor 932 , so as to enable near area communication of the generic mobile computing device 930 with other devices.
- External interface 942 may provide, for example, for wired communication in some embodiments, or for wireless communication in other embodiments, and multiple interfaces may also be used.
- the mobile compatible memory 934 may be coupled to the generic mobile computing device 930 .
- the mobile compatible memory 934 may be implemented as a volatile memory and a non-volatile memory.
- the expansion memory 958 may also be coupled to the generic mobile computing device 930 through the expansion interface 956 , which may comprise, for example, a Single In Line Memory Module (“SIMM”) card interface.
- the expansion memory 958 may provide extra storage space for the generic mobile computing device 930 , or may also store an application or other information for the generic mobile computing device 930 .
- the expansion memory 958 may comprise instructions to carry out the processes described above.
- the expansion memory 958 may also comprise secure information.
- the expansion memory 958 may be provided as a security module for the generic mobile computing device 930 , and may be programmed with instructions that permit secure use of the generic mobile computing device 930 .
- a secure application may be provided on the SIMM card, along with additional information, such as placing identifying information on the SIMM card in a non-hackable manner.
- the mobile compatible memory may include a volatile memory and a non-volatile memory (e.g., a flash memory or a non-volatile random-access memory (“NVRAM”)).
- a computer program comprises a set of instructions that, when executed, perform one or more methods.
- the set of instructions may be stored on the mobile compatible memory 934 , the expansion memory 958 , a memory coupled to the mobile compatible processor 932 , or a propagated signal that may be received, for example, over the transceiver 938 and/or the external interface 942 .
- the generic mobile computing device 930 may communicate wirelessly through the communication interface 952 , which may comprise digital signal processing circuitry.
- the communication interface 952 may provide for communications using various modes and/or protocols, such as a Global System for Mobile Communications (“GSM”) protocol, a Short Message Service (“SMS”) protocol, an Enhanced Messaging System (“EMS”) protocol, a Multimedia Messaging Service (“MMS”) protocol, a Code Division Multiple Access (“CDMA”) protocol, a Time Division Multiple Access (“TDMA”) protocol, a Personal Digital Cellular (“PDC”) protocol, a Wideband Code Division Multiple Access (“WCDMA”) protocol, a CDMA2000 protocol, and/or a General Packet Radio Service (“GPRS”) protocol.
- Such communication may occur, for example, through the transceiver 938 (e.g., radio-frequency transceiver).
- short-range communication may also occur, such as by using a Bluetooth®, Wi-Fi, and/or other such transceiver.
- a GPS (“Global Positioning System”) receiver module 954 may provide additional navigation-related and location-related wireless data to the generic mobile computing device 930 , which may be used as appropriate by a software application running on the generic mobile computing device 930 .
- the generic mobile computing device 930 may also communicate audibly using an audio codec 940 , which may receive spoken information from a user and convert it to usable digital information.
- the audio codec 940 may likewise generate audible sound for a user, such as through a speaker (e.g., in a handset smartphone of the generic mobile computing device 930 ).
- Such a sound may comprise a sound from a voice telephone call, a recorded sound (e.g., a voice message, a music file, etc.), and may also include a sound generated by an application operating on the generic mobile computing device 930 .
- the generic mobile computing device 930 may be implemented in a number of different forms, as shown in the figure.
- the generic mobile computing device 930 may be implemented as a smartphone 948 .
- the generic mobile computing device 930 may be implemented as a personal digital assistant (“PDA”).
- the generic mobile computing device 930 may be implemented as a tablet device 950 .
- the ACME Haulage Corporation may provide cargo transportation services in remote areas of the United States.
- the ACME Haulage Corporation may be compensated based on a type of goods being carried inside a cargo area of its trailer of a transportation vehicle (e.g., a semi-trailer truck 104 ). For this reason, the ACME Haulage Corporation may want to understand the ‘load’ status of their equipment (e.g., a semi-trailer truck 104 ) to optimize the dispatch and routing of their transportation assets.
- the ACME Haulage Corporation may have to rely on field reports.
- the ACME Haulage Corporation may have employed sensors (e.g., weight sensors) in an interior space of its trailers.
- These sensors may not be able to detect patterns or types of cargo and exactly where in the trailer the cargo is located.
- the incorrect and unreliable cargo status provided by these sensors may have resulted in a number of untoward situations. For example, a driver of its semi-trailer truck may have embarked on a long journey when, in fact, its cargo area was filled with the wrong type of cargo or was even empty. This may have led the ACME Haulage Corporation to a loss of invaluable time, fuel, and efficiency, customer dissatisfaction, and/or, ultimately, a loss of revenue for its services.
- the ACME Haulage Corporation may have decided to invest in embodiments described herein (e.g., use of various embodiments of the FIGS. 1-9 ) for optimum utilization of interior spaces of the cargo area of its trailers (e.g., a trailer 102 ).
- the use of technologies described in various embodiments of the FIGS. 1-9 may enable the dispatch managers of ACME Haulage Corporation to remotely monitor and manage its entire fleets of cargo transport equipment (e.g., trailer 102 ) and asset utilization in real-time.
- the use of technologies described in various embodiments of the FIGS. 1-9 may have also enabled the dispatch managers of the ACME Haulage Corporation to know the actual load status of its cargo transport equipment (e.g., a trailer 102 ) through image analysis and to verify the contents of the equipment through a photographic image. Additionally, the image analysis may have enabled the central dispatch (e.g., dispatcher 134 ) of the ACME Haulage Corporation to know what areas and/or zones of the equipment (e.g., trailer 102 ) are actually loaded.
- the use of technologies described in various embodiments of the FIGS. 1-9 may have facilitated the dispatch managers (e.g., dispatcher 134 ) of the ACME Haulage Corporation in utilizing an easy-to-use mobile interface, giving them real-time visibility of the cargo areas of its trailers for their daily operations.
- the dispatch managers (e.g., dispatcher 134 ) of the ACME Haulage Corporation may now be able to automate manual business processes and optimize the performance of its transportation equipment (e.g., trailer 102 ) by using the rich data platform described in various embodiments of the FIGS. 1-9 , maximizing trailer utilization.
- the use of technologies described in various embodiments of the FIGS. 1-9 may have enabled the trailer management system of the ACME Haulage Corporation to instantly connect dispatch managers (e.g., dispatcher 134 ) to a host of powerful, easy-to-use analytics and insights via web-based, highly intuitive trailer tracking dashboards, customizable trailer tracking reports, and exception-based alerts.
- dispatcher 134 may have the ability to automate yard checks; better manage and distribute trailer pools; improve detention billing; increase the efficiencies and productivity of dispatch operations; secure trailers and high-value cargo; deter fraud and unauthorized trailer use; improve driver and customer satisfaction; and maximize trailer utilization for a more profitable fleet.
- the ACME Haulage Corporation may now utilize its cargo areas to their optimum capacity. This may have led the ACME Haulage Corporation to save time and fuel, increase efficiency and customer satisfaction, and/or, ultimately, prevent a loss of revenue for its transportation services, raising its profit.
- Various embodiments of the systems and techniques described here can be realized in digital electronic circuitry, integrated circuitry, specially designed application specific integrated circuits (“ASICs”), computer hardware, firmware, a software application, and/or a combination thereof.
- These various embodiments can include implementation in one or more computer programs that are executable and/or interpretable on a programmable system including at least one programmable processor, which may be special or general purpose, coupled to receive data and instructions from, and to transmit data and instructions to, a storage system, at least one input device, and at least one output device.
- the systems and techniques described here may be implemented on a computing device having a display device (e.g., a cathode ray tube (“CRT”) and/or a liquid crystal display (“LCD”) monitor) for displaying information to the user, and a keyboard and a mouse 924 by which the user can provide input to the computing device.
- Other kinds of devices can be used to provide for interaction with a user as well; for example, feedback provided to the user can be any form of sensory feedback (e.g., visual feedback, auditory feedback, and/or tactile feedback) and input from the user can be received in any form, including acoustic, speech, and/or tactile input.
- the systems and techniques described here may be implemented in a computing system that includes a back end component (e.g., as a data server), a middleware component (e.g., an application server), a front end component (e.g., a client computer having a graphical user interface, and/or a Web browser through which a user can interact with an embodiment of the systems and techniques described here), and a combination thereof.
- the components of the system may also be coupled through a communication network.
- the communication network may include a local area network (“LAN”) and a wide area network (“WAN”) (e.g., the Internet).
- the computing system can include a client and a server. In one embodiment, the client and the server are remote from each other and interact through the communication network.
- the structures and modules in the figures may be shown as distinct and communicating with only a few specific structures and not others.
- the structures may be merged with each other, may perform overlapping functions, and may communicate with other structures not shown to be connected in the figures. Accordingly, the specification and/or drawings may be regarded in an illustrative rather than a restrictive sense.
Abstract
Disclosed are a method, a device and/or a system of utilizing a camera array system to detect a load status of a semi-trailer truck. A sensor array is affixed to a surface of a trailer of the semi-trailer truck to automatically determine whether a cargo area of the semi-trailer truck is occupied. Each camera of a set of cameras of the sensor array is embedded in an individual recess of the sensor array. The cargo area is illuminated using at least one light source of the sensor array. A memory and a processor associated with the sensor array are configured to store one baseline image of the cargo area of the trailer in an empty state. The processor is configured to detect a triggering event and to capture a current image of the cargo area. The processor determines a cargo status based upon a difference between the current image and the baseline image, and sends the cargo status to a dispatcher.
Description
- This disclosure relates generally to automotive technology and, more particularly, to a method, a device and/or a system of utilizing a camera array system to detect a load status of a semi-trailer truck.
- A transportation service provider (e.g., a logistics provider) may be compensated based on a type of goods being carried inside a cargo area of a trailer of a transportation vehicle (e.g., a semi-trailer truck). Therefore, the transportation service provider may seek to maximize a utilization of space inside of a cargo area of the trailer. Sensors (e.g., weight sensors, wave sensors, ultrasound sensors) employed in an interior space of the cargo area may not be able to detect color patterns or types of cargo.
- Further, these sensors may not be able to detect exactly where in the trailer the cargo is located. Moreover, these sensors may not provide a reliable view of what is exactly happening inside of the trailer. As a result, problems may arise: for example, a driver may embark on a long journey when, in fact, the cargo area is filled with the wrong type of cargo (or may even be empty). This may lead to wasted time, fuel, and efficiency, customer dissatisfaction, and/or, ultimately, a loss of revenue for the transportation services provider.
- Disclosed are a method, a device and/or a system of utilizing a camera array system to detect a load status of a semi-trailer truck.
- In one aspect, a trailer of a semi-trailer truck includes a sensor array affixed to a surface of the trailer to automatically determine whether a cargo area of the semi-trailer truck is occupied. The trailer of the semi-trailer truck includes a set of cameras of the sensor array. Each camera of the set of cameras is embedded in an individual recess of the sensor array such that each of the set of cameras does not protrude from the sensor array into the cargo area and/or each of the set of cameras peers into the cargo area of the semi-trailer truck. The trailer of the semi-trailer truck further includes at least one light source to illuminate the cargo area. A memory and a processor associated with the sensor array are configured to store one baseline image of the cargo area of the trailer when the trailer is in an empty state. The processor is configured to detect a triggering event and to illuminate the cargo area of the trailer using the at least one light source. The processor is further configured to capture a current image of the cargo area of the trailer using the set of cameras. The processor is configured to compare each current image of an interior cavity with the corresponding baseline image of a cargo cavity. The processor determines a cargo status based upon a difference between the current image and the baseline image. The processor is also configured to send the cargo status to a dispatcher using a cellular modem.
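The baseline-versus-current comparison described above can be sketched as follows. This is an illustrative example only; the disclosure does not specify an implementation, and the function name `cargo_status`, the per-pixel intensity cutoff, and the changed-pixel threshold are assumptions for the sketch.

```python
import numpy as np

# Assumed fraction of changed pixels above which the trailer counts as loaded;
# the disclosure leaves this tuning value unspecified.
OCCUPIED_THRESHOLD = 0.05

def cargo_status(baseline: np.ndarray, current: np.ndarray,
                 threshold: float = OCCUPIED_THRESHOLD) -> str:
    """Compare a current grayscale frame against the empty-trailer baseline.

    Returns "loaded" when the fraction of significantly changed pixels
    exceeds the threshold, otherwise "empty".
    """
    if baseline.shape != current.shape:
        raise ValueError("images must share dimensions")
    # Per-pixel absolute difference between the current frame and the baseline.
    diff = np.abs(current.astype(np.int16) - baseline.astype(np.int16))
    changed = np.count_nonzero(diff > 25)  # 25/255 intensity change, assumed
    fraction = changed / diff.size
    return "loaded" if fraction > threshold else "empty"

# Example: a synthetic 100x100 empty baseline and a frame where a bright
# 40x40 region of "cargo" now appears.
baseline = np.full((100, 100), 40, dtype=np.uint8)
current = baseline.copy()
current[20:60, 20:60] = 200
print(cargo_status(baseline, current))  # → loaded
```

In practice the comparison would run per camera against that camera's stored baseline, as the aspect above describes.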
- The sensor array may be affixed to an upper corner of the trailer. The sensor array may be affixed to a middle top-section of the trailer, such that the sensor array is placed in a separate housing from the cargo area on an exterior face of the trailer. The light source may be a light-emitting diode that is associated with each camera of the set of cameras. Each camera of the set of cameras may automatically take a photograph of the cargo area in view of each camera upon an occurrence of the triggering event. The triggering event may be a trailer opening event, a trailer closing event, a motion detection event through a global positioning device and a motion sensor in the trailer, a stopping event, a time-based event, a geographic-location based event, and/or a velocity based event.
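The trigger-driven capture flow above can be outlined in code. A minimal sketch, assuming nothing beyond the event types and the illuminate-capture-compare-report order stated in this disclosure; the enum and function names are hypothetical:

```python
from enum import Enum, auto

class TriggerEvent(Enum):
    # The triggering-event types enumerated in the disclosure.
    TRAILER_OPENING = auto()
    TRAILER_CLOSING = auto()
    MOTION_DETECTED = auto()
    STOPPING = auto()
    TIME_BASED = auto()
    GEOGRAPHIC_LOCATION = auto()
    VELOCITY_BASED = auto()

def handle_trigger(event: TriggerEvent) -> list[str]:
    """Return the capture sequence the sensor array would run for a trigger:
    illuminate, photograph, compare against the baseline, report."""
    return [
        f"log trigger: {event.name}",
        "illuminate cargo area (LED light source)",
        "photograph cargo area with each camera",
        "compare current images against baseline images",
        "send cargo status to dispatcher via cellular modem",
    ]

steps = handle_trigger(TriggerEvent.TRAILER_CLOSING)
print(steps[0])  # → log trigger: TRAILER_CLOSING
```

Each camera photographing only the cargo area in its own view, as described above, would simply repeat the photograph step per camera.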
- The sensor array may include a backup camera to observe a rear area of the trailer of the semi-trailer truck. The backup camera may be mounted to the sensor array. The backup camera may view a door of the trailer, a loading area of the trailer, and/or an area behind the trailer. A driver of the trailer may view a video feed from the backup camera using a wired connection and/or a wireless connection between the backup camera and a display in a cabin area of the semi-trailer truck. In the trailer of the semi-trailer truck, the field of view of each of the set of cameras may partially overlap with the field of view of another of the set of cameras. The sensor array may be powered by a battery, the semi-trailer truck, and/or a solar array mounted on the trailer.
- The sensor array may communicatively generate a composite view of the cargo area using the set of cameras. The sensor array may communicate the composite view to the cabin area of the semi-trailer truck and/or a central server communicatively coupled with the semi-trailer truck through an Internet network using the processor and the memory of the semi-trailer truck. The cellular modem may periodically provide a reporting of a location of the semi-trailer truck captured with a geographic positioning receiver to the central server along with the composite view using the processor and the memory.
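One way the overlapping fields of view could be combined into the composite view mentioned above is a simple horizontal stitch. This is a sketch under assumptions: the disclosure does not describe a stitching method, and the `composite_view` function and the fixed 8-column overlap are illustrative only.

```python
import numpy as np

def composite_view(frames: list[np.ndarray], overlap: int = 8) -> np.ndarray:
    """Stitch horizontally adjacent grayscale camera frames whose fields of
    view overlap by `overlap` pixel columns (overlap width is an assumption)."""
    stitched = frames[0]
    for frame in frames[1:]:
        # Average the overlapping columns, then append the rest of the frame.
        blend = (stitched[:, -overlap:].astype(np.uint16)
                 + frame[:, :overlap]) // 2
        stitched = np.hstack([stitched[:, :-overlap],
                              blend.astype(np.uint8),
                              frame[:, overlap:]])
    return stitched

# Three 32x48 frames stitched with two 8-column overlaps: 3*48 - 2*8 = 128 columns.
frames = [np.zeros((32, 48), dtype=np.uint8) for _ in range(3)]
print(composite_view(frames).shape)  # → (32, 128)
```

A real implementation would likely correct for lens distortion before blending, which is what the distorted/undistorted image comparison of FIGS. 4-5 concerns.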
- In another aspect, a trailer of a semi-trailer truck includes a sensor array affixed to a surface of the trailer to automatically determine whether a cargo area of the semi-trailer truck is occupied. The trailer of the semi-trailer truck further includes a set of cameras of the sensor array. Each camera of the set of cameras is recessed relative to an interior region of the cargo area and/or each of the set of cameras peers into the cargo area of the semi-trailer truck. A memory and a processor associated with the sensor array are configured to store one baseline image of the cargo area of the trailer when the trailer is in an empty state. The processor is configured to detect a triggering event and to illuminate the cargo area of the trailer using at least one light source. The processor is further configured to capture a current image of the cargo area of the trailer using the set of cameras. The processor compares each current image of an interior cavity with the corresponding baseline image of a cargo cavity. The processor of the sensor array is configured to determine a cargo status based upon a difference between the current image and the baseline image. Furthermore, the processor is configured to send the cargo status to a dispatcher using a cellular modem.
- In yet another aspect, a trailer of a semi-trailer truck includes a sensor array affixed to a surface of the trailer to automatically determine whether a cargo area of the semi-trailer truck is occupied. The trailer of the semi-trailer truck also includes a set of cameras of the sensor array. Each camera of the set of cameras is embedded in an individual recess of the sensor array such that each of the set of cameras is interior to a flush plane of the surface to prevent cargo from damaging each camera. Each of the set of cameras peers into the cargo area of the semi-trailer truck. The trailer of the semi-trailer truck further includes at least one light source to illuminate the cargo area. A memory and a processor associated with the sensor array are configured to store one baseline image of the cargo area of the trailer when the trailer is in an empty state. The processor is configured to detect a triggering event and/or to illuminate the cargo area of the trailer using at least one light source. The processor is further configured to capture a current image of the cargo area of the trailer using the set of cameras. The processor is also configured to compare each current image of an interior cavity with the corresponding baseline image of a cargo cavity. Furthermore, the processor is configured to determine a cargo status based upon a difference between the current image and the baseline image. Also, the processor is configured to send the cargo status to a dispatcher using a cellular modem. The sensor array includes a backup camera to observe a rear area of the trailer of the semi-trailer truck. The backup camera is mounted to the sensor array such that the backup camera views a door of the trailer, a loading area of the trailer, and/or an area behind the trailer.
A driver of the trailer may view a video feed from the backup camera using a wired connection and/or a wireless connection between the backup camera and a display in a cabin area of the semi-trailer truck.
- The method, apparatus, and system disclosed herein may be implemented in any means for achieving various aspects, and may be executed in a form of a non-transitory machine-readable medium embodying a set of instructions that, when executed by a machine, cause the machine to perform any of the operations disclosed herein. Other features will be apparent from the accompanying drawings and from the detailed description that follows.
- The embodiments of this invention are illustrated by way of example and not limitation in the Figures of the accompanying drawings, in which like references indicate similar elements and in which:
-
FIG. 1A is an upper corner placement view of a sensor array affixed to an upper corner of a trailer of a semi-trailer truck to automatically determine whether a cargo area of the semi-trailer truck is occupied and sending the cargo status to a dispatcher using a cellular modem, according to one embodiment. -
FIG. 1B is a middle top placement view of the sensor array of FIG. 1 illustrating a set of cameras communicatively generating a composite view of the cargo area based on a triggering event, according to at least one embodiment. -
FIG. 2 is a backup camera view of the sensor array of FIG. 1 illustrating a backup camera mounted to the sensor array enabling a driver of the trailer to view a video feed from the backup camera, according to at least one embodiment. -
FIG. 3 is a block diagram representing one embodiment of the sensor array of the trailer of the semi-trailer truck illustrated in FIG. 1 . -
FIG. 4 is a composite view illustrating the overlapping distortion captured by each camera of the set of cameras of the sensor array of FIG. 1 providing the cargo status of the trailer, according to one embodiment. -
FIG. 5 is a table view illustrating the storing of an undistorted baseline image captured at an empty state of the trailer of FIG. 1 and the corresponding distorted image after occurrence of the triggering event for determining the cargo status, according to one embodiment. -
FIG. 6 is an exploded view of the triggering event algorithm of the sensor array of FIG. 1 , according to one embodiment. -
FIG. 7 is a critical path view illustrating a flow based on time in which critical operations of the sensor array of FIG. 1 are established, according to one embodiment. -
FIG. 8 is a process flow diagram of the sensor array of FIG. 1 to determine the cargo status of the trailer of the semi-trailer truck of FIG. 1 , according to one embodiment. -
FIG. 9 is a schematic diagram of exemplary data processing devices that can be used to implement the methods and systems disclosed herein, according to one embodiment. - Other features of the present embodiments will be apparent from the accompanying drawings and from the detailed description that follows.
- Disclosed are a method, a device and/or a system of utilizing a camera array system to detect a load status of a semi-trailer truck.
- In one embodiment, a
trailer 102 of asemi-trailer truck 104 includes asensor array 106 affixed to asurface 108 of thetrailer 102 to automatically determine whether acargo area 110 of thesemi-trailer truck 104 is occupied. Thetrailer 102 of thesemi-trailer truck 104 also includes a set ofcameras 112 of thesensor array 106. Each camera of the set ofcameras 112 is each embedded in individual recess(es) 113 of thesensor array 106 such that each of the set ofcameras 112 does not protrude from thesensor array 106 into thecargo area 110 and/or each of the set ofcameras 112 peers into thecargo area 110 of thesemi-trailer truck 104. Thetrailer 102 of thesemi-trailer truck 104 further includes at least onelight source 114 to illuminate thecargo area 110. Amemory 116 and aprocessor 118 associated with thesensor array 106 are configured to store onebaseline image 122 of thecargo area 110 of thetrailer 102 when thetrailer 102 is in an empty state. - The
processor 118 is configured to detect a triggering event 206 (e.g., using the triggeringevent algorithm 142 of the dispatch server 126) and to illuminate thecargo area 110 of thetrailer 102 using at least onelight source 114. Theprocessor 118 is further configured to capture acurrent image 144 of thecargo area 110 of thetrailer 102 using the set ofcameras 112. Theprocessor 118 is configured to compare (e.g., using thedifference algorithm 148 of the dispatch server 126) eachcurrent image 144 of aninterior cavity 402 with thecorresponding baseline image 122 of a cargo cavity. Theprocessor 118 determines a cargo status 124 (e.g., using the cargo status algorithm 125 of the dispatch server 126) based upon a difference (e.g., using thedifference algorithm 148 of the dispatch server 126) between thecurrent image 144 and thebaseline image 122. Theprocessor 118 is also configured to send thecargo status 124 to adispatcher 134 using acellular modem 136. - The
sensor array 106 may be affixed to an upper corner of thetrailer 102. Thesensor array 106 may be affixed to a middle top-section of thetrailer 102, such that thesensor array 106 is placed in aseparate housing 138 from thecargo area 110 on anexterior face 140 of thetrailer 102. Thelight source 114 may be a light-emitting diode that is associated with each camera of the set ofcameras 112. Each camera of the set ofcameras 112 may automatically take a photograph of thecargo area 110 in view of each camera upon an occurrence of the triggeringevent 206. The triggeringevent 206 may be a trailer opening event, a trailer closing event, a motion detection event through a global positioning device and a motion sensor in the trailer, a stopping event, a time-based event, a geographic-location based event, and/or a velocity based event. - The
sensor array 106 may include abackup camera 202 to observe arear area 204 of thetrailer 102 of thesemi-trailer truck 104. Thebackup camera 202 may be mounted to thesensor array 106. Thebackup camera 202 may view a door of the trailer, a loading area of the trailer, and/or an area behind the trailer. Adriver 208 of thetrailer 102 may view avideo feed 210 from thebackup camera 202 using a wired connection and/or a wireless connection between thebackup camera 202 and adisplay 212 in acabin area 214 of thesemi-trailer truck 104. Thetrailer 102 of thesemi-trailer truck 104 may have a field ofview 404 of each of the set ofcameras 112 to partially overlap with the field ofview 404 of another of the set ofcameras 112. Thesensor array 106 may be powered by a battery, thesemi-trailer truck 104, and/or a solar array mounted on thetrailer 102. - The
sensor array 106 may communicatively generate acomposite view 146 of thecargo area 110 using the set ofcameras 112. Thesensor array 106 may communicate thecomposite view 146 to thecabin area 214 of thesemi-trailer truck 104 and/or a central server communicatively coupled with thesemi-trailer truck 104 through an Internet network using theprocessor 118 and thememory 116 of thesemi-trailer truck 104. Thecellular modem 136 may periodically provide a reporting of a location of thesemi-trailer truck 104 captured with a geographic positioning receiver to the central server along with thecomposite view 146 using theprocessor 118 and thememory 116. - In another embodiment, a
trailer 102 of asemi-trailer truck 104 includes asensor array 106 affixed to asurface 108 of thetrailer 102 to automatically determine whether acargo area 110 of thesemi-trailer truck 104 is occupied. Thetrailer 102 of thesemi-trailer truck 104 further includes a set ofcameras 112 of thesensor array 106. Each camera of the set ofcameras 112 is each recessed relative to an interior region of thecargo area 110 and/or each of the set ofcameras 112 peers into thecargo area 110 of thesemi-trailer truck 104. Amemory 116 and aprocessor 118 associated with thesensor array 106 are configured to store onebaseline image 122 of thecargo area 110 of thetrailer 102 when thetrailer 102 is in an empty state. Theprocessor 118 is configured to detect a triggering event 206 (e.g., using the triggeringevent algorithm 142 of the dispatch server 126) and to illuminate thecargo area 110 of thetrailer 102 using at least onelight source 114. Theprocessor 118 is further configured to capture acurrent image 144 of thecargo area 110 of thetrailer 102 using the set ofcameras 112. Theprocessor 118 compares (e.g., using thedifference algorithm 148 of the dispatch server 126) eachcurrent image 144 of aninterior cavity 402 with thecorresponding baseline image 122 of a cargo cavity. Theprocessor 118 associated with thesensor array 106 is configured to determine a cargo status 124 (e.g., using the cargo status algorithm 125 of the dispatch server 126) based upon a difference 148 (e.g., using thedifference algorithm 148 of the dispatch server 126) between thecurrent image 144 and thebaseline image 122. Furthermore, theprocessor 118 is configured to send thecargo status 124 to adispatcher 134 using acellular modem 136. - In yet another embodiment, a
trailer 102 of asemi-trailer truck 104 includes asensor array 106 affixed to asurface 108 of thetrailer 102 to automatically determine whether acargo area 110 of thesemi-trailer truck 104 is occupied. Thetrailer 102 of thesemi-trailer truck 104 also includes a set ofcameras 112 of thesensor array 106. Each camera of the set ofcameras 112 is each embedded in individual recess(es) 113 of thesensor array 106 such that each of the set ofcameras 112 are interior to a flush plane of thesurface 108 to prevent cargo from damaging each camera. Each of the set ofcameras 112 peers into thecargo area 110 of thesemi-trailer truck 104. Thetrailer 102 of thesemi-trailer truck 104 further includes at least onelight source 114 to illuminate thecargo area 110. - A
memory 116 and aprocessor 118 associated with thesensor array 106 are configured to store onebaseline image 122 of thecargo area 110 of thetrailer 102 when thetrailer 102 is in an empty state. Theprocessor 118 is configured to detect a triggeringevent 206 and/or to illuminate thecargo area 110 of thetrailer 102 using at least onelight source 114. Theprocessor 118 is further configured to capture acurrent image 144 of thecargo area 110 of thetrailer 102 using the set ofcameras 112. Theprocessor 118 is also configured to compare eachcurrent image 144 of aninterior cavity 402 with thecorresponding baseline image 122 of a cargo cavity. Furthermore, theprocessor 118 is configured to determine a cargo status 124 (e.g., using the cargo status algorithm 125 of the dispatch server 126) based upon a difference 148 (e.g., using thedifference algorithm 148 of the dispatch server 126) between thecurrent image 144 and thebaseline image 122. Also, theprocessor 118 is configured to send thecargo status 124 to adispatcher 134 using acellular modem 136. - The
sensor array 106 includes a backup camera 202 to observe a rear area 204 of the trailer 102 of the semi-trailer truck 104. The backup camera 202 is mounted to the sensor array 106 such that the backup camera 202 views (e.g., using the triggering event algorithm 142 of the dispatch server 126) a door of the trailer, a loading area of the trailer, and/or an area behind the trailer. A driver 208 of the trailer 102 may view a video feed 210 from the backup camera 202 using a wired connection and/or a wireless connection between the backup camera 202 and a display 212 in a cabin area 214 of the semi-trailer truck 104. -
FIG. 1A is an upper corner placement view 150A of a sensor array illustrating the sensor array 106 affixed to an upper corner of a trailer 102 of a semi-trailer truck 104 to automatically determine whether a cargo area 110 of the semi-trailer truck 104 is occupied and to send the cargo status 124 to a dispatcher 134 using a cellular modem 136, according to one embodiment. - Particularly,
FIG. 1A illustrates the trailer 102, a network 101, the semi-trailer truck 104, the sensor array 106, the surface 108, the cargo area 110, a set of cameras 112, a recess 113, a light source 114, projection areas 115, a memory 116, a processor 118, a database 120, a baseline image 122, a cargo status 124, a cargo status algorithm 125, a dispatch server 126, a dispatch server memory 128, a dispatch server processor 130, a dispatch server database 132, a dispatcher 134, a user device 135, and a cellular modem 136, according to one embodiment. - The
trailer 102 may be a nonmotorized vehicle designed to be hauled by a motor vehicle (e.g., a truck, a utility vehicle, and/or a tractor). The network 101 may be a group of computing devices (e.g., hardware and software) that are linked together through communication channels to facilitate communication and resource-sharing among a wide range of entities (e.g., the dispatcher 134). The semi-trailer truck 104 may be a large vehicle that consists of a towing engine, known as a tractor and/or a truck, attached to one or more semi-trailers to carry freight, according to one embodiment. - The
sensor array 106 may be a device in the form of a bar and/or a series of bars that may be affixed to a wall and/or upright supports (e.g., a surface 108 of the trailer 102) which detects or measures a physical property (e.g., light, heat, motion, moisture, pressure, or any one of a great number of other environmental phenomena) of the occupancy inside the trailer 102 and records, indicates, and/or otherwise responds to it as an output. The sensor array 106 (e.g., a sensor rail, a sensor housing, etc.) may hold a single camera or may hold multiple cameras. The sensor array 106 may be connected through a wired and/or wireless networking topology. In one embodiment, cameras are positioned in different locations of the trailer 102 individually, and the sensor array 106 provides a housing in which to communicatively couple the sensor array 106 to the trailer without the need of a separate rail. In another embodiment, the sensor array 106 includes multiple cameras on a single sensor rail. The sensor array 106 may include optional temperature, humidity, and/or pressure sensing in addition to visual sensing to determine the general conditions in which cargo is housed inside the trailer 102. - The output may generally be a signal that is converted to a human-readable display at the sensor location or transmitted electronically over the
network 101 for reading or further processing to determine the cargo status 124 of the trailer 102. The surface 108 may be the uppermost layer of the wall or ceiling of the trailer 102 on which the sensor array 106 is affixed. The cargo area 110 may be the space inside the trailer 102 of the semi-trailer truck 104 where the goods are kept for freighting, according to one embodiment. - The set of
cameras 112 may be a group of cameras used for recording visual images of the inside of the trailer 102 in the form of photographs, film, or video signals. The recess 113 may be a small space created by building part of a wall of the trailer 102 further back from the rest so as to affix the set of cameras 112 of the sensor array 106. The light source 114 may be any device serving as a source of illumination to make things visible inside the trailer 102. The projection areas 115 may be the extent or measurement covered by each camera of the set of cameras 112 to capture visual images of the inside of the trailer 102 in the form of photographs, film, or video signals, according to one embodiment. - The
memory 116 may be an electronic holding place for instructions and data that the processor 118 of the sensor array 106 can reach quickly. The processor 118 may be logic circuitry that responds to and processes the basic instructions that drive the sensor array 106 for monitoring the semi-trailer truck 104. The database 120 may be a structured collection of information collected by the set of cameras 112 that is organized to be easily accessed, managed, and/or updated by the dispatcher 134. The baseline image 122 may be a visual representation of the inside of the cargo area 110 of the trailer 102 in an empty state. The cargo status 124 may be the present situation of the cargo area 110 in terms of occupancy of goods in the trailer 102 as captured by the set of cameras 112. The cargo status algorithm 125 may be a process or set of rules to be followed in calculations or other problem-solving operations for identifying the occupancy of goods in the cargo area 110 of the trailer 102. - The
dispatch server 126 may be a computer system that provides local area networking services to multiple users (e.g., the dispatcher 134) to send off the cargo to its respective destination by managing the resources and services of the network 101, while handling requests by the dispatcher 134 from different computers to access said resources, according to one embodiment. - The
dispatch server memory 128 may be an electronic holding place for instructions and data that the dispatch server processor 130 can reach quickly. The dispatch server processor 130 may be logic circuitry that responds to and processes the basic instructions that drive the dispatch server 126 for monitoring the semi-trailer truck 104. The dispatch server database 132 may be a collection of information that is organized to be easily accessed, managed, and/or updated by the dispatcher 134, according to one embodiment. The dispatcher 134 may be the personnel responsible for overseeing the receiving and transmitting of accurate and reliable messages, tracking vehicles and equipment, and recording other important information regarding the cargo status 124 (e.g., using the cargo status algorithm 125 of the dispatch server 126) of the semi-trailer truck 104. The user device 135 may be a computing device that enables the dispatcher 134 to communicate with the dispatch server 126 through the network 101. The cellular modem 136 may be a device that adds wireless 3G or 4G (LTE) connectivity to a laptop or a desktop computer in order to send the cargo status 124 to the dispatcher 134, according to one embodiment. -
FIG. 1A illustrates a sensor array 106 affixed to an upper corner of the trailer 102. The sensor array 106 includes a set of cameras 112. Each camera is embedded in an individual recess 113 of the sensor array 106. At least one light source 114 is coupled with each of the set of cameras 112. The sensor array 106 is communicatively coupled to a dispatch server 126 through the network 101. The dispatch server 126 includes a dispatch server database 132 coupled with a dispatch server processor 130 and a dispatch server memory 128, according to one embodiment. The dispatch server 126 is communicatively coupled to the user device 135 through the network 101. The sensor array 106 is communicatively coupled to the dispatch server 126 through a cellular modem 136, according to one embodiment. - The
cargo status 124 may be automatically determined using the sensor array 106. In circle ‘1’, the sensor array 106 is affixed to the upper corner of the trailer 102. In circle ‘2’, each camera is embedded in an individual recess 113 of the sensor array 106. In circle ‘3’, at least one light source 114 illuminates the cargo area 110 associated with each camera of the set of cameras 112. In circle ‘4’, a baseline image 122 captured by the set of cameras 112 is communicated to the dispatch server 126. In circle ‘5’, the cargo status 124 is communicated to the dispatcher 134 through the cellular modem 136, according to one embodiment. -
FIG. 1B is a middle-top placement view 150B of the sensor array 106 of FIG. 1 illustrating a set of cameras 112 generating a composite view 146 of the cargo area 110 based on a triggering event (e.g., using the triggering event algorithm 142 of the dispatch server 126), according to one embodiment. Particularly, FIG. 1B illustrates a separate housing 138, an exterior face 140, a triggering event algorithm 142, a current image 144, a composite view 146, and a difference algorithm 148, according to one embodiment. - According to at least one embodiment, the
separate housing 138 may be a discrete rigid casing that encloses and protects the various components of the sensor array 106. The exterior face 140 may be the outermost part of the middle-top section of the trailer 102 on which the sensor array 106 is affixed. The triggering event algorithm 142 may be a process or set of rules to be followed in calculations or other problem-solving operations for identifying the occurrence of a trailer opening event, a trailer closing event, a motion detection event (e.g., using a global positioning device and/or a motion sensor), a stopping event, a time-based event, a geographic-location based event, and/or a velocity based event of the trailer 102 of the semi-trailer truck 104, according to one embodiment. - The
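The triggering event algorithm 142 can be sketched as a simple classifier over telemetry samples. The event names match the types enumerated above, but the telemetry field names in the sample dict are hypothetical, since the patent only lists the event types without defining a data format:

```python
def classify_triggering_event(sample):
    """Map a raw telemetry sample (a dict with hypothetical field names)
    to one of the triggering event types of algorithm 142, or None when
    no trigger applies. Checks are ordered; the first match wins."""
    if sample.get("door_opened"):
        return "trailer_opening_event"
    if sample.get("door_closed"):
        return "trailer_closing_event"
    if sample.get("motion_detected"):
        return "motion_detection_event"
    if sample.get("speed_mph") == 0 and sample.get("was_moving"):
        return "stopping_event"
    if sample.get("timer_elapsed"):
        return "time_based_event"
    if sample.get("entered_geofence"):
        return "geographic_location_based_event"
    if sample.get("speed_mph", 0) > sample.get("speed_threshold_mph", float("inf")):
        return "velocity_based_event"
    return None
```

Any non-None result would cause the processor 118 to illuminate the cargo area 110 and capture a current image 144.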
current image 144 may be the present visual representation of the inside of the cargo area 110 of the trailer 102 after occurrence of the triggering event. The composite view 146 may be a combined visual representation of the inside of the cargo area 110 of the trailer 102 captured by the set of cameras 112 after occurrence of the triggering event. The difference algorithm 148 may be a process or set of rules to be followed in calculations or other problem-solving operations for identifying the distinctness or dissimilarity of the composite view 146 of the cargo area 110 after occurrence of the triggering event from the baseline image 122 of the cargo area 110 in an empty state, according to one embodiment. -
FIG. 1B illustrates a sensor array 106 affixed to a middle-top section of the trailer 102 on the exterior face 140. The sensor array 106 is placed in a separate housing 138 from the cargo area, according to one embodiment. - The
cargo status 124 based on a triggering event may be automatically determined using the sensor array 106. In circle ‘6’, a triggering event (e.g., using the triggering event algorithm 142 of the dispatch server 126) is identified by the processor 118 of the sensor array 106. In circle ‘7’, a current image 144 captured by the set of cameras 112 is communicated to the dispatch server 126. In circle ‘8’, the composite view 146 is communicated to the dispatch server 126. In circle ‘9’, the cargo status 124 (e.g., using the cargo status algorithm 125 of the dispatch server 126) is communicated to the dispatch server 126, according to one embodiment. -
FIG. 2 is a backup camera view 250 illustrating a backup camera 202 mounted to the sensor array of FIG. 1 enabling a driver 208 of the trailer 102 to view a video feed 210 from the backup camera 202, according to one embodiment. Particularly, FIG. 2 illustrates a backup camera 202, a rear area 204, a triggering event 206, a driver 208, a video feed 210, a display 212, and a cabin area 214, according to one embodiment. - The
backup camera 202 may be a camera used for recording visual images of the rear area 204 of the trailer 102 in the form of photographs, film, or video signals. The rear area 204 may be the back part of the trailer 102 (e.g., a door of the trailer, a loading area of the trailer, and/or an area behind the trailer). The triggering event 206 may be a situation (e.g., a trailer opening event, a trailer closing event, a motion detection event, a stopping event, a time-based event, a geographic-location based event, and/or a velocity based event) that causes the set of cameras 112 of the sensor array 106 to record the visual images of the inside of the cargo area 110. The driver 208 may be the person driving the semi-trailer truck 104. The video feed 210 may be a sequence of images from the set of cameras processed electronically into an analog or digital format and displayed on a display 212 with sufficient rapidity so as to create the illusion of motion and continuity. The display 212 may be a computer output surface and projecting mechanism that shows the video feed 210 or graphic images to the driver 208, using a cathode ray tube (CRT), liquid crystal display (LCD), light-emitting diode, gas plasma, or other image projection technology. The cabin area 214 may be the private compartment for the driver 208 in the front portion of the semi-trailer truck 104, according to one embodiment. -
FIG. 2 illustrates a backup camera 202 mounted to the sensor array 106 to observe the rear area 204 of the trailer 102 of the semi-trailer truck 104, according to one embodiment. - In circle ‘10’, the triggering event is communicated to the
processor 118. In circle ‘11’, the projection area 115 in the rear area 204 of the trailer 102 is captured by the backup camera 202. In circle ‘12’, the video feed 210 is sent to the driver 208 using a wired connection and/or a wireless connection of the sensor array 106, according to one embodiment. -
FIG. 3 is a block diagram 350 representing one embodiment of the sensor array 106 of the trailer of the semi-trailer truck 104 illustrated in FIG. 1. According to one example embodiment, the sensor array 106 includes a set of cameras 112 associated with a light source 114. The sensor array 106 of the trailer of the semi-trailer truck 104 further includes a processor 118, a database 120, and a memory 116. - The
processor 118 of the sensor array 106 may be configured to capture the baseline image 122 using the set of cameras 112. The light source 114 associated with each of the set of cameras 112 illuminates the inside cavity of the cargo area 110. The processor 118 identifies the triggering event (e.g., using the triggering event algorithm 142 of the dispatch server 126) caused by a trailer opening event, a trailer closing event, a motion detection event, a stopping event, a time-based event, a geographic-location based event, and/or a velocity based event. A current image 144 is captured by each of the set of cameras 112. A composite view 146 is generated based on the current image 144 captured by each of the set of cameras 112. The composite view 146 and the baseline image 122 are compared to conclude the cargo status 124 (e.g., using the cargo status algorithm 125 of the dispatch server 126) of the trailer 102. The cargo status 124 is communicated to the dispatcher 134, according to one embodiment. -
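Generation of the composite view 146 can be sketched, for example, by joining the per-camera frames side by side. This is a minimal sketch under the assumption that each frame is a row-major 2-D list of pixel values and that all frames share the same height; the patent does not specify how the per-camera images are combined:

```python
def composite_view(frames):
    """Join per-camera frames side by side into one composite image.

    frames: list of 2-D pixel lists (rows of pixel values), all with the
    same number of rows. Returns a single 2-D list whose rows are the
    concatenation of the corresponding rows of every frame.
    """
    if not frames:
        return []
    height = len(frames[0])
    if any(len(f) != height for f in frames):
        raise ValueError("all camera frames must have the same height")
    # Row by row, concatenate the matching row from every camera frame.
    return [sum((f[row] for f in frames), []) for row in range(height)]
```

A real implementation would also need to handle the overlapping fields of view between adjoining cameras before comparison against the baseline image 122.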
FIG. 4 is a composite view 450 illustrating the overlapping distortion 406 captured by each camera 112A-D of the set of cameras 112 of the sensor array 106 of FIG. 1 providing the cargo status 124 of the trailer 102, according to one embodiment. - Particularly,
FIG. 4 illustrates an interior cavity 402, a field of view 404, and an overlapping distortion 406. The interior cavity 402 may be an empty space inside the trailer 102 of the semi-trailer truck where the cargo is kept for dispatch. The field of view 404 may be the extent or measurement covered by each camera of the set of cameras 112 to capture visual images of the inside of the trailer 102 in the form of photographs, film, or video signals. The overlapping distortion 406 may be the covering or extension of the field of view 404 of one camera over the field of view 404 of its adjoining camera of the set of cameras 112 of the sensor array 106, according to one embodiment. - Particularly,
composite view 450 illustrates an example embodiment of the sensor array 106 running the length of the trailer 102 with an embedded set of cameras 112, electronics, wiring, an LED light source, and other sensors mounted on the ceiling. Each camera looks for distortion from the reference baseline image 122. No distortion from any of the cameras indicates that the trailer is empty. Overlapping distortion 406 provides information on the extent of quadrant load in each of the projection areas 114A-E. Each quadrant (e.g., projection areas 114A-E) represents 20% of the cargo area 110. If only projection area 114A has distortion, then the trailer is ≤20% full. If projection areas 114A and B have distortion, then the trailer is ≤40% full. If projection areas 114A, B, and C have distortion, then the trailer is ≤60% full, and if projection areas 114A, B, C, and D have distortion, then the trailer is ≤80% full, according to one embodiment.
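The per-area load estimate described above reduces to counting the projection areas whose current image shows distortion from the baseline. A minimal sketch under the stated 20%-per-area assumption:

```python
def load_upper_bound(distorted_areas):
    """Return the upper-bound load percentage of the cargo area.

    distorted_areas: iterable of projection-area labels ("A" through "E")
    whose current image shows distortion from the baseline image. Each
    area represents 20% of the cargo area, so the trailer is at most
    20% times the number of distorted areas full.
    """
    valid = {"A", "B", "C", "D", "E"}
    # Ignore unknown labels; count each distorted area once.
    return 20 * len(set(distorted_areas) & valid)
```

For example, distortion in areas A, B, and C alone yields an upper bound of 60%, matching the FIG. 4 description.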
FIG. 5 is a table view illustrating the storing of the undistorted baseline image 122 captured at the empty state of the trailer 102 of FIG. 1 and the corresponding distorted image after occurrence of the triggering event 206 for determining the cargo status 124, according to one embodiment. Particularly, FIG. 5 is a table view 550 showing the fields associated with the dispatcher 134: a trailer 102 field, a set of cameras 112 field, a baseline image distortion 502 field, a triggering event 206 field, a distortion in current image 504 field, and a cargo status 124 field, according to one embodiment. - Particularly,
FIG. 5 illustrates an example of two records for a dispatcher 134 with two trailers, each having a sensor array with a set of cameras 112 affixed to its trailer 102. The baseline image(s) 122 captured in the empty state of the trailer 102 show no distortion. The triggering event 206 caused by the trailer opening event in trailer 1 depicts a distortion in current image 504 captured by camera 112A of trailer 1. The resulting cargo status 124 of ≤20% full caused by the triggering event 206 is communicated to the dispatcher 134. Similarly, the triggering event 206 caused by the velocity based event in trailer 2 depicts a distortion in current image 504 captured by cameras 112A-C of trailer 2. The resulting cargo status 124 of ≤60% full caused by the triggering event 206 is communicated to the dispatcher 134, according to one embodiment. -
FIG. 6 is an exploded view of the triggering event algorithm 142 of the sensor array 106 of FIG. 1, according to one embodiment. Particularly, FIG. 6 illustrates a trailer opening event module 602, a trailer closing event module 604, a time-based event module 606, a motion detection event module 608, a stopping event module 610, a geographic-location based event module 612, and a velocity based event module 614, according to one embodiment. - The trailer
opening event module 602 may be a part and/or a separate unit of a program of the triggering event algorithm 142 that assists in identifying the occurrence of a trailer opening event in order to activate the set of cameras 112 to capture the current image 144. The trailer closing event module 604 may be a part and/or a separate unit of a program of the triggering event algorithm 142 that assists in identifying the occurrence of a trailer closing event in order to activate the set of cameras 112 to capture the current image 144. The time-based event module 606 may be a part and/or a separate unit of a program of the triggering event algorithm 142 that assists in identifying the occurrence of an event based on time, according to one embodiment. - The motion
detection event module 608 may be a part and/or a separate unit of a program of the triggering event algorithm 142 to detect motion of the semi-trailer truck 104. The stopping event module 610 may be a part and/or a separate unit of a program of the triggering event algorithm 142 that assists in identifying the stopping of the semi-trailer truck 104. The geographic-location based event module 612 may be a part and/or a separate unit of a program of the triggering event algorithm 142 that assists in identifying the occurrence of a situation based on the geographic location of the semi-trailer truck 104. The velocity based event module 614 may be a part and/or a separate unit of a program of the triggering event algorithm 142 that assists in identifying the occurrence of a situation based on the velocity of the semi-trailer truck 104, according to one embodiment. -
FIG. 7 is a critical path view illustrating a flow based on time in which critical operations of the sensor array of FIG. 1 are established, according to one embodiment. - In
operation 702, the dispatcher 134 affixes a sensor array 106 to a surface 108 of a trailer 102 of a semi-trailer truck 104. In operation 704, the sensor array 106 peers each camera of the set of cameras 112 into the cargo area 110 of the semi-trailer truck 104. In operation 706, the dispatcher 134 configures a memory 116 and a processor 118 to store at least one baseline image 122 of the cargo area 110 of the trailer 102 when the trailer is in an empty state. In operation 708, the dispatcher 134 configures the processor 118 to detect a triggering event 206. In operation 710, the dispatcher 134 configures the processor 118 to illuminate the cargo area 110 of the trailer 102 using at least one light source 114. In operation 712, the sensor array 106 captures a current image 144 of the cargo area 110 of the trailer 102 using at least one of the set of cameras 112. In operation 714, the sensor array 106 compares each current image 144 of the interior cavity with the corresponding baseline image 122 of the cargo cavity. In operation 716, the sensor array 106 determines a cargo status 124 based upon a difference between the current image 144 and the baseline image 122. In operation 718, the sensor array 106 sends the cargo status 124 to a dispatcher 134 using a cellular modem 136, according to one embodiment. -
FIG. 8 is a process flow diagram of the sensor array 106 of FIG. 1 to determine the cargo status 124 of the trailer 102 of the semi-trailer truck 104 of FIG. 1, according to one embodiment. - In
operation 802, a sensor array 106 is affixed to a surface 108 of a trailer 102 to automatically determine whether a cargo area 110 of the semi-trailer truck 104 is occupied. In operation 804, each camera 112A-D of a set of cameras 112 of the sensor array 106 peers into the cargo area 110 of the semi-trailer truck 104. In operation 806, a memory 116 and a processor 118 associated with the sensor array 106 are configured to store at least one baseline image 122 of the cargo area 110 of the trailer 102 when the trailer 102 is in an empty state. In operation 808, the processor 118 is configured to detect a triggering event 206. In operation 810, the cargo area 110 is illuminated using at least one light source 114. In operation 812, a current image 144 of the cargo area 110 of the trailer 102 is captured using at least one of the set of cameras 112, according to one embodiment. - In
operation 814, each current image 144 of the interior cavity is compared with the corresponding baseline image 122 of the cargo cavity. In operation 816, a cargo status 124 is determined based upon a difference between the current image 144 and the baseline image 122. In operation 818, the cargo status 124 is sent to a dispatcher 134 using a cellular modem 136, according to one embodiment. -
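Operations 810 through 818 can be tied together in a single pass. This sketch assumes hypothetical `sensor` and `modem` interfaces (illuminate, capture, and send are not defined in the patent) and reuses a simple changed-pixel-count comparison with illustrative thresholds:

```python
def handle_triggering_event(sensor, modem, baseline,
                            pixel_delta=25, changed_fraction=0.05):
    """Run one pass of the FIG. 8 flow after a triggering event is detected:
    illuminate the cargo area, capture a current image, compare it with the
    baseline image, determine the cargo status, and send it via the modem.
    `sensor` and `modem` are hypothetical interfaces for illustration."""
    sensor.illuminate()                          # operation 810
    current = sensor.capture()                   # operation 812
    changed = sum(1 for b, c in zip(baseline, current)
                  if abs(b - c) > pixel_delta)   # operation 814
    status = ("occupied" if changed / len(baseline) > changed_fraction
              else "empty")                      # operation 816
    modem.send(status)                           # operation 818
    return status
```

The sensor-array processor 118 would invoke this pass each time the triggering event algorithm 142 reports an event, so the dispatcher 134 receives a fresh cargo status 124 per event rather than a continuous video stream.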
FIG. 9 is a schematic diagram of a generic computing device 900 and a generic mobile computing device 930 that can be used to perform and/or implement any of the embodiments disclosed herein, according to one or more embodiments. In one or more embodiments, the dispatch server 126 and/or the user device 135 of FIG. 1A may be the generic computing device 900. - The
generic computing device 900 may represent various forms of digital computers, such as laptops, desktops, workstations, personal digital assistants, servers, blade servers, mainframes, and/or other appropriate computers. The generic mobile computing device 930 may represent various forms of mobile devices, such as smartphones, camera phones, personal digital assistants, cellular telephones, and other similar mobile devices. The components shown here, their connections, couples, and relationships, and their functions, are meant to be exemplary only, and are not meant to limit the embodiments described and/or claimed, according to one embodiment. - The
generic computing device 900 may include a processor 902, a memory 904, a storage device 906, a high speed interface 908 coupled to the memory 904 and a plurality of high speed expansion ports 910, and a low speed interface 912 coupled to a low speed bus 914 and a storage device 906. In one embodiment, each of the components heretofore may be inter-coupled using various buses, and may be mounted on a common motherboard and/or in other manners as appropriate. The processor 902 may process instructions for execution in the generic computing device 900, including instructions stored in the memory 904 and/or on the storage device 906 to display graphical information for a GUI on an external input/output device, such as a display unit 916 coupled to the high speed interface 908, according to one embodiment. - In other embodiments, multiple processors and/or multiple buses may be used, as appropriate, along with multiple memories and/or types of memory. Also, a plurality of
computing devices 900 may be coupled together, with each device providing portions of the necessary operations (e.g., as a server bank, a group of blade servers, and/or a multi-processor system). - The
memory 904 may be coupled to the generic computing device 900. In one embodiment, the memory 904 may be a volatile memory. In another embodiment, the memory 904 may be a non-volatile memory. The memory 904 may also be another form of computer-readable medium, such as a magnetic and/or an optical disk. The storage device 906 may be capable of providing mass storage for the generic computing device 900. In one embodiment, the storage device 906 may include a floppy disk device, a hard disk device, an optical disk device, a tape device, a flash memory, and/or another similar solid state memory device. In another embodiment, the storage device 906 may be an array of devices in a computer-readable medium, such as the devices mentioned heretofore, including devices in a storage area network and/or other configurations. - A computer program may be comprised of instructions that, when executed, perform one or more methods, such as those described above. The instructions may be stored in the
memory 904, the storage device 906, a memory coupled to the processor 902, and/or a propagated signal. - The
high speed interface 908 may manage bandwidth-intensive operations for the generic computing device 900, while the low speed interface 912 may manage lower bandwidth-intensive operations. Such allocation of functions is exemplary only. In one embodiment, the high speed interface 908 may be coupled to the memory 904, the display unit 916 (e.g., through a graphics processor and/or an accelerator), and to the plurality of high speed expansion ports 910, which may accept various expansion cards. - In the embodiment, the
low speed interface 912 may be coupled to the storage device 906 and the low speed bus 914. The low speed bus 914 may be comprised of a wired and/or wireless communication port (e.g., a Universal Serial Bus (“USB”), a Bluetooth® port, an Ethernet port, and/or a wireless Ethernet port). The low speed bus 914 may also be coupled to the scan unit 928, a printer 926, a keyboard, a mouse 924, and a networking device (e.g., a switch and/or a router) through a network adapter. - The
generic computing device 900 may be implemented in a number of different forms, as shown in the figure. In one embodiment, the computing device 900 may be implemented as a standard server 918 and/or a group of such servers. In another embodiment, the generic computing device 900 may be implemented as part of a rack server system 922. In yet another embodiment, the generic computing device 900 may be implemented as a general computer 920, such as a laptop or desktop computer. Alternatively, a component from the generic computing device 900 may be combined with another component in a generic mobile computing device 930. In one or more embodiments, an entire system may be made up of a plurality of generic computing devices 900 and/or a plurality of generic computing devices 900 coupled to a plurality of generic mobile computing devices 930. - In one embodiment, the generic
mobile computing device 930 may include a mobile compatible processor 932, a mobile compatible memory 934, and an input/output device such as a mobile display 946, a communication interface 952, and a transceiver 938, among other components. The generic mobile computing device 930 may also be provided with a storage device, such as a microdrive or other device, to provide additional storage. In one embodiment, the components indicated heretofore are inter-coupled using various buses, and several of the components may be mounted on a common motherboard. - The mobile
compatible processor 932 may execute instructions in the generic mobile computing device 930, including instructions stored in the mobile compatible memory 934. The mobile compatible processor 932 may be implemented as a chipset of chips that include separate and multiple analog and digital processors. The mobile compatible processor 932 may provide, for example, for coordination of the other components of the generic mobile computing device 930, such as control of user interfaces, applications run by the generic mobile computing device 930, and wireless communication by the generic mobile computing device 930. - The mobile
compatible processor 932 may communicate with a user through the control interface 936 and the display interface 944 coupled to a mobile display 946. In one embodiment, the mobile display 946 may be a Thin-Film-Transistor Liquid Crystal Display (“TFT LCD”), an Organic Light Emitting Diode (“OLED”) display, or another appropriate display technology. The display interface 944 may comprise appropriate circuitry for driving the mobile display 946 to present graphical and other information to a user. The control interface 936 may receive commands from a user and convert them for submission to the mobile compatible processor 932. - In addition, an
external interface 942 may be provided in communication with the mobile compatible processor 932, so as to enable near area communication of the generic mobile computing device 930 with other devices. The external interface 942 may provide, for example, for wired communication in some embodiments, or for wireless communication in other embodiments, and multiple interfaces may also be used. - The mobile
compatible memory 934 may be coupled to the generic mobile computing device 930. The mobile compatible memory 934 may be implemented as a volatile memory and a non-volatile memory. The expansion memory 958 may also be coupled to the generic mobile computing device 930 through the expansion interface 956, which may comprise, for example, a Single In Line Memory Module (“SIMM”) card interface. The expansion memory 958 may provide extra storage space for the generic mobile computing device 930, or may also store an application or other information for the generic mobile computing device 930. - Specifically, the
expansion memory 958 may comprise instructions to carry out the processes described above. The expansion memory 958 may also comprise secure information. For example, the expansion memory 958 may be provided as a security module for the generic mobile computing device 930, and may be programmed with instructions that permit secure use of the generic mobile computing device 930. In addition, a secure application may be provided on the SIMM card, along with additional information, such as placing identifying information on the SIMM card in a non-hackable manner. - The mobile compatible memory may include a volatile memory (e.g., a flash memory) and a non-volatile memory (e.g., a non-volatile random-access memory (“NVRAM”)). In one embodiment, a computer program comprises a set of instructions that, when executed, perform one or more methods. The set of instructions may be stored on the mobile
compatible memory 934, the expansion memory 958, a memory coupled to the mobile compatible processor 932, or a propagated signal that may be received, for example, over the transceiver 938 and/or the external interface 942. - The generic
mobile computing device 930 may communicate wirelessly through the communication interface 952, which may comprise digital signal processing circuitry. The communication interface 952 may provide for communications using various modes and/or protocols, such as a Global System for Mobile Communications (“GSM”) protocol, a Short Message Service (“SMS”) protocol, an Enhanced Messaging System (“EMS”) protocol, a Multimedia Messaging Service (“MMS”) protocol, a Code Division Multiple Access (“CDMA”) protocol, a Time Division Multiple Access (“TDMA”) protocol, a Personal Digital Cellular (“PDC”) protocol, a Wideband Code Division Multiple Access (“WCDMA”) protocol, a CDMA2000 protocol, and a General Packet Radio Service (“GPRS”) protocol. - Such communication may occur, for example, through the transceiver 938 (e.g., a radio-frequency transceiver). In addition, short-range communication may occur, such as using a Bluetooth®, Wi-Fi, and/or other such transceiver. In addition, a GPS (“Global Positioning System”)
receiver module 954 may provide additional navigation-related and location-related wireless data to the generic mobile computing device 930, which may be used as appropriate by a software application running on the generic mobile computing device 930. - The generic
mobile computing device 930 may also communicate audibly using an audio codec 940, which may receive spoken information from a user and convert it to usable digital information. The audio codec 940 may likewise generate audible sound for a user, such as through a speaker (e.g., in a handset of the generic mobile computing device 930). Such a sound may comprise a sound from a voice telephone call, a recorded sound (e.g., a voice message, a music file, etc.), and may also include a sound generated by an application operating on the generic mobile computing device 930. - The generic
mobile computing device 930 may be implemented in a number of different forms, as shown in the figure. In one embodiment, the generic mobile computing device 930 may be implemented as a smartphone 948. In another embodiment, the generic mobile computing device 930 may be implemented as a personal digital assistant (“PDA”). In yet another embodiment, the generic mobile computing device 930 may be implemented as a tablet device 950. - An example embodiment will now be described. The ACME Haulage Corporation may provide cargo transportation services in remote areas of the United States. The ACME Haulage Corporation may be compensated based on the type of goods carried inside a cargo area of the trailer of a transportation vehicle (e.g., a semi-trailer truck 104). For this reason, the ACME Haulage Corporation may want to understand the ‘load’ status of its equipment (e.g., a semi-trailer truck 104) to optimize the dispatch and routing of its transportation assets. In order to understand the load status of its equipment (e.g., a trailer 102), the ACME Haulage Corporation may have had to rely on field reports. The ACME Haulage Corporation may have employed sensors (e.g., weight sensors, wave sensors, ultrasound sensors) in an interior space of its trailers. These sensors may not be able to detect patterns or types of cargo, or exactly where in the trailer the cargo is located. The incorrect and unreliable cargo status provided by these sensors may have resulted in a number of untoward situations. For example, a driver of its semi-trailer truck may have embarked on a long journey when, in fact, its cargo area was filled with the wrong type of cargo or was even empty. This may have led the ACME Haulage Corporation to lose invaluable time, fuel, and efficiency; to suffer customer dissatisfaction; and, ultimately, to lose revenue for its services.
- To prevent these continuing losses, the ACME Haulage Corporation may have decided to invest in embodiments described herein (e.g., use of various embodiments of the
FIGS. 1-9) for optimum utilization of the interior spaces of the cargo areas of its trailers (e.g., a trailer 102). The use of technologies described in various embodiments of the FIGS. 1-9 may enable the dispatch managers of the ACME Haulage Corporation to remotely monitor and manage its entire fleet of cargo transport equipment (e.g., trailer 102) and asset utilization in real time. The various embodiments of the FIGS. 1-9 may have also enabled the dispatch managers of the ACME Haulage Corporation to know the actual load status of its cargo transport equipment (e.g., a trailer 102) through image analysis and to verify the contents of the equipment through a photographic image. Additionally, the image analysis may have enabled the central dispatch (e.g., dispatcher 134) of the ACME Haulage Corporation to know what areas and/or zones of the equipment (e.g., trailer 102) are actually loaded. - The use of technologies described in various embodiments of the
FIGS. 1-9 may have enabled the dispatch managers (e.g., dispatcher 134) of the ACME Haulage Corporation to use an easy-to-use mobile interface, giving them real-time visibility of the cargo areas of its trailers for their daily operations. The dispatch managers (e.g., dispatcher 134) of the ACME Haulage Corporation may now be able to automate manual business processes and optimize the performance of its transportation equipment (e.g., trailer 102) by using the rich data platform described in various embodiments of the FIGS. 1-9, maximizing trailer utilization. - The use of technologies described in various embodiments of the
FIGS. 1-9 may have enabled the trailer management system of the ACME Haulage Corporation to instantly connect dispatch managers to a host of powerful, easy-to-use analytics and insights via web-based, highly intuitive trailer tracking dashboards, customizable trailer tracking reports, and exception-based alerts. Armed with this intelligence, dispatch managers (e.g., dispatcher 134) of the ACME Haulage Corporation may have the ability to automate yard checks; better manage and distribute trailer pools; improve detention billing; increase the efficiency and productivity of dispatch operations; secure trailers and high-value cargo; deter fraud and unauthorized trailer use; improve driver and customer satisfaction; and maximize trailer utilization for a more profitable fleet. The ACME Haulage Corporation may now utilize its cargo areas to their optimum capacity. This may have led the ACME Haulage Corporation to save time and fuel, increase efficiency and customer satisfaction, and, ultimately, prevent loss of revenue for its transportation services, raising its profit. - Various embodiments of the systems and techniques described here can be realized in digital electronic circuitry, integrated circuitry, specially designed application-specific integrated circuits (“ASICs”), computer hardware, firmware, a software application, and/or a combination thereof. These various embodiments can include embodiment in one or more computer programs that are executable and/or interpretable on a programmable system including at least one programmable processor, which may be special or general purpose, coupled to receive data and instructions from, and to transmit data and instructions to, a storage system, at least one input device, and at least one output device.
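The zone-level image analysis described in the example above (compare a current photograph of the cargo area against a stored empty-trailer baseline, zone by zone) can be sketched as follows. This is a minimal illustration, not the patented implementation: the 2x2 zone grid, the mean-absolute-difference threshold, and the function name are illustrative assumptions, and images are simplified to lists of grayscale pixel rows.

```python
# Minimal sketch: compare a current cargo-area image against an
# empty-trailer baseline, zone by zone, to decide which zones are loaded.
# Images are grayscale, given as lists of rows of 0-255 pixel values.
# The 2x2 zone grid and the difference threshold are illustrative choices.

def zone_load_status(baseline, current, rows=2, cols=2, threshold=10.0):
    """Return a {(zone_row, zone_col): bool} map; True means 'loaded'."""
    h, w = len(baseline), len(baseline[0])
    zh, zw = h // rows, w // cols
    status = {}
    for zr in range(rows):
        for zc in range(cols):
            diff_sum, count = 0, 0
            for y in range(zr * zh, (zr + 1) * zh):
                for x in range(zc * zw, (zc + 1) * zw):
                    diff_sum += abs(current[y][x] - baseline[y][x])
                    count += 1
            status[(zr, zc)] = (diff_sum / count) > threshold
    return status

# A 4x4 'image': the top-left quadrant differs sharply from the baseline.
baseline = [[0] * 4 for _ in range(4)]
current = [row[:] for row in baseline]
current[0][0] = current[0][1] = current[1][0] = current[1][1] = 200

print(zone_load_status(baseline, current))
# {(0, 0): True, (0, 1): False, (1, 0): False, (1, 1): False}
```

A real system would operate on decoded camera frames and calibrate the threshold against lighting variation; a controlled light source, as the disclosure recites, helps keep such a threshold stable.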
- These computer programs (also known as programs, software, software applications, and/or code) comprise machine-readable instructions for a programmable processor, and can be implemented in a high-level procedural and/or object-oriented programming language, and/or in assembly/machine language. As used herein, the terms “machine-readable medium” and/or “computer-readable medium” refer to any computer program product, apparatus, and/or device (e.g., magnetic discs, optical disks, memory, and/or Programmable Logic Devices (“PLDs”)) used to provide machine instructions and/or data to a programmable processor, including a machine-readable medium that receives machine instructions as a machine-readable signal. The term “machine-readable signal” refers to any signal used to provide machine instructions and/or data to a programmable processor.
- To provide for interaction with a user, the systems and techniques described here may be implemented on a computing device having a display device (e.g., a cathode ray tube (“CRT”) and/or liquid crystal (“LCD”) monitor) for displaying information to the user and a keyboard and a mouse 924 by which the user can provide input to the computer. Other kinds of devices can be used to provide for interaction with a user as well; for example, feedback provided to the user can be any form of sensory feedback (e.g., visual feedback, auditory feedback, and/or tactile feedback) and input from the user can be received in any form, including acoustic, speech, and/or tactile input.
- The systems and techniques described here may be implemented in a computing system that includes a back end component (e.g., as a data server), a middleware component (e.g., an application server), a front end component (e.g., a client computer having a graphical user interface, and/or a Web browser through which a user can interact with an embodiment of the systems and techniques described here), and a combination thereof. The components of the system may also be coupled through a communication network.
- The communication network may include a local area network (“LAN”) and a wide area network (“WAN”) (e.g., the Internet). The computing system can include a client and a server. In one embodiment, the client and the server are remote from each other and interact through the communication network.
- A number of embodiments have been described. Nevertheless, it will be understood that various modifications may be made without departing from the spirit and scope of the claimed invention. In addition, the logic flows depicted in the figures do not require the particular order shown, or sequential order, to achieve desirable results. In addition, other steps may be provided, or steps may be eliminated, from the described flows, and other components may be added to, or removed from, the described systems. Accordingly, other embodiments are within the scope of the following claims.
- It may be appreciated that the various systems, methods, and apparatus disclosed herein may be embodied in a machine-readable medium and/or a machine accessible medium compatible with a data processing system (e.g., a computer system), and/or may be performed in any order.
- The structures and modules in the figures may be shown as distinct and communicating with only a few specific structures and not others. The structures may be merged with each other, may perform overlapping functions, and may communicate with other structures not shown to be connected in the figures. Accordingly, the specification and/or drawings may be regarded in an illustrative rather than a restrictive sense.
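Taken together, the disclosed flow is event-driven: detect a triggering event, illuminate and capture, compare against the stored empty-trailer baseline, and send the resulting cargo status over a cellular modem. The sketch below illustrates that control flow with the camera and modem stubbed out as callables; the `CargoMonitor` class, its parameter names, and the single-number "mean pixel value" simplification are illustrative assumptions, not part of the disclosure.

```python
# Illustrative sketch of the event-driven flow: on a triggering event,
# capture, compare with the empty-trailer baseline, classify the cargo
# status, and report it through a (stubbed) cellular modem.

TRIGGERS = {"door_open", "door_close", "motion", "stop", "geo_fence", "velocity"}

class CargoMonitor:
    def __init__(self, baseline_mean, capture, send, threshold=10.0):
        self.baseline_mean = baseline_mean   # mean pixel value, empty trailer
        self.capture = capture               # callable -> current mean pixel value
        self.send = send                     # callable(status_string), e.g. modem
        self.threshold = threshold

    def on_event(self, event):
        if event not in TRIGGERS:
            return None                      # ignore unrecognized events
        current_mean = self.capture()        # light source + cameras, abstracted
        loaded = abs(current_mean - self.baseline_mean) > self.threshold
        status = "loaded" if loaded else "empty"
        self.send(status)                    # report to the dispatcher
        return status

sent = []
monitor = CargoMonitor(baseline_mean=5.0,
                       capture=lambda: 180.0,   # stubbed camera reading
                       send=sent.append)
print(monitor.on_event("door_close"))  # prints "loaded"
```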
Claims (21)
1. A trailer of a semi-trailer truck, comprising:
a sensor array affixed outside of the trailer in a manner to provide optimal utilization of interior spaces to automatically determine whether a cargo area of the semi-trailer truck is occupied,
the sensor array comprising a set of cameras and a backup camera,
wherein each camera of the set of cameras is embedded in an individual recess of the sensor array such that each of the set of cameras does not protrude from the sensor array into the cargo area,
wherein the backup camera to observe a rear area of the trailer, and
wherein each of the set of cameras to peer into the cargo area of the semi-trailer truck, and
wherein the sensor array is placed in a separate housing from the cargo area outside the trailer resting on an exterior face of the trailer such that storage space inside the cargo area is not constrained because the sensor array is placed outside the cargo area;
a light source to illuminate the cargo area;
wherein the sensor array and the light source are powered by a solar array mounted on the trailer,
wherein a memory and a processor associated with the sensor array are configured to store a baseline image of the cargo area of the trailer when the trailer is in an empty state,
wherein the processor is configured:
to detect a triggering event,
to illuminate the cargo area of the trailer using at least one light source,
to capture, when triggered by the triggering event:
a current image of the cargo area of the trailer using at least one of the set of cameras, and
another current image of the rear area of the trailer,
to compare each current image of an interior cavity with the corresponding baseline image of a cargo cavity,
to determine a cargo status based upon a difference between the current image and the baseline image, and
to send the cargo status to a dispatcher using a cellular modem,
wherein the triggering event is at least one of a trailer opening event, a trailer closing event, a motion detection event through at least one of a global positioning device and a motion sensor in the trailer, a stopping event, a geographic-location based event, and a velocity based event, and
wherein the light source is placed in another separate housing from the cargo area outside the trailer resting on another exterior face of the trailer such that storage space inside the cargo area is not constrained because the light source is placed outside the cargo area.
2. The trailer of the semi-trailer truck of claim 1, wherein the sensor array is affixed to an upper corner of the trailer.
3. (canceled)
4. The trailer of the semi-trailer truck of claim 1, wherein at least one light source is a light-emitting diode that is associated with each camera of the set of cameras.
5. (canceled)
6. The trailer of the semi-trailer truck of claim 1:
wherein the backup camera is mounted to the sensor array, wherein the backup camera to view at least one of a door of the trailer, a loading area of the trailer, and an area behind the trailer, and
wherein a driver of the trailer may view a video feed from the backup camera using at least one of a wired connection and a wireless connection between the backup camera and a display in a cabin area of the semi-trailer truck.
7. The trailer of the semi-trailer truck of claim 1, wherein a field of view of each of the set of cameras to at least partially overlap with the field of view of at least another of the set of cameras.
8. (canceled)
9. The trailer of the semi-trailer truck of claim 1:
wherein the sensor array to communicatively generate a composite view of the cargo area using the set of cameras,
wherein the sensor array to communicate the composite view to at least one of the cabin area of the semi-trailer truck and a central server communicatively coupled with the semi-trailer truck through an Internet network using the processor and the memory of the semi-trailer truck, and
wherein the cellular modem to periodically provide a reporting of a location of the semi-trailer truck captured with a geographic positioning receiver to the central server along with the composite view using the processor and the memory.
10. (canceled)
11. (canceled)
12. (canceled)
13. (canceled)
14. (canceled)
15. (canceled)
16. (canceled)
17. (canceled)
18. (canceled)
19. (canceled)
20. (canceled)
21. A trailer of a semi-trailer truck, comprising:
a sensor array affixed outside of the trailer in a manner to provide optimal utilization of interior spaces to automatically determine whether a cargo area of the semi-trailer truck is occupied,
the sensor array comprising a set of cameras and a backup camera,
wherein each camera of the set of cameras is embedded in an individual recess of the sensor array such that each of the set of cameras does not protrude from the sensor array into the cargo area,
wherein the backup camera to observe a rear area of the trailer, and
wherein each of the set of cameras to peer into the cargo area of the semi-trailer truck, and
wherein the sensor array is placed in a separate housing from the cargo area outside the trailer resting on an exterior face of the trailer such that storage space inside the cargo area is not constrained because the sensor array is placed outside the cargo area;
a light source to illuminate the cargo area;
wherein the sensor array and the light source are powered by a solar array mounted on the trailer,
wherein a memory and a processor associated with the sensor array are configured to store a baseline image of the cargo area of the trailer when the trailer is in an empty state,
wherein the processor is configured:
to detect a triggering event,
to illuminate the cargo area of the trailer using at least one light source,
to capture, when triggered by the triggering event:
a current image of the cargo area of the trailer using at least one of the set of cameras, and
another current image of the rear area of the trailer,
to compare each current image of an interior cavity with the corresponding baseline image of a cargo cavity,
to determine a cargo status based upon a difference between the current image and the baseline image, and
to send the cargo status to a dispatcher using a cellular modem,
wherein the triggering event is at least one of a trailer opening event, a trailer closing event, a motion detection event through at least one of a global positioning device and a motion sensor in the trailer, a stopping event, a geographic-location based event, and a velocity based event,
wherein the light source is placed in another separate housing from the cargo area outside the trailer resting on another exterior face of the trailer such that storage space inside the cargo area is not constrained because the light source is placed outside the cargo area,
wherein the sensor array is affixed to an upper corner of the trailer,
wherein at least one light source is a light-emitting diode that is associated with each camera of the set of cameras,
wherein the backup camera is mounted to the sensor array,
wherein the backup camera to view at least one of a door of the trailer, a loading area of the trailer, and an area behind the trailer,
wherein a driver of the trailer may view a video feed from the backup camera using at least one of a wired connection and a wireless connection between the backup camera and a display in a cabin area of the semi-trailer truck,
wherein a field of view of each of the set of cameras to at least partially overlap with the field of view of at least another of the set of cameras,
wherein the sensor array to communicatively generate a composite view of the cargo area using the set of cameras,
wherein the sensor array to communicate the composite view to at least one of the cabin area of the semi-trailer truck and a central server communicatively coupled with the semi-trailer truck through an Internet network using the processor and the memory of the semi-trailer truck, and
wherein the cellular modem to periodically provide a reporting of a location of the semi-trailer truck captured with a geographic positioning receiver to the central server along with the composite view using the processor and the memory.
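Claims 7, 9, and 21 recite camera fields of view that partially overlap and a composite view of the cargo area generated from the set of cameras. The toy sketch below stitches two horizontally overlapping frames, assuming the overlap width in pixels is known from the fixed camera geometry; real systems would typically use feature-based registration, and the function name and averaging blend are illustrative assumptions, not from the disclosure.

```python
# Minimal sketch: stitch two horizontally overlapping camera frames into
# a composite strip. Frames are grayscale row-lists; the overlap width in
# pixels is assumed known from the fixed camera geometry.

def stitch_pair(left, right, overlap):
    """Concatenate two frames, averaging the shared overlap columns."""
    composite = []
    for lrow, rrow in zip(left, right):
        blended = [(a + b) // 2 for a, b in zip(lrow[-overlap:], rrow[:overlap])]
        composite.append(lrow[:-overlap] + blended + rrow[overlap:])
    return composite

left = [[10, 10, 20, 20]]   # one-row 'frames' with a 2-pixel overlap
right = [[20, 20, 30, 30]]
print(stitch_pair(left, right, overlap=2))
# [[10, 10, 20, 20, 30, 30]]
```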
Priority Applications (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US14/682,086 US20160297361A1 (en) | 2015-04-08 | 2015-04-08 | Camera array system and method to detect a load status of a semi- trailer truck |
US15/491,040 US10311315B2 (en) | 2015-04-08 | 2017-04-19 | Camera array system and method to detect a load status of a semi-trailer truck |
Applications Claiming Priority (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US14/682,086 US20160297361A1 (en) | 2015-04-08 | 2015-04-08 | Camera array system and method to detect a load status of a semi- trailer truck |
Related Child Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US15/491,040 Continuation US10311315B2 (en) | 2015-04-08 | 2017-04-19 | Camera array system and method to detect a load status of a semi-trailer truck |
Publications (1)
Publication Number | Publication Date |
---|---|
US20160297361A1 true US20160297361A1 (en) | 2016-10-13 |
Family
ID=57111794
Family Applications (2)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US14/682,086 Abandoned US20160297361A1 (en) | 2015-04-08 | 2015-04-08 | Camera array system and method to detect a load status of a semi- trailer truck |
US15/491,040 Active 2035-11-13 US10311315B2 (en) | 2015-04-08 | 2017-04-19 | Camera array system and method to detect a load status of a semi-trailer truck |
Family Applications After (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US15/491,040 Active 2035-11-13 US10311315B2 (en) | 2015-04-08 | 2017-04-19 | Camera array system and method to detect a load status of a semi-trailer truck |
Country Status (1)
Country | Link |
---|---|
US (2) | US20160297361A1 (en) |
Cited By (46)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20170230620A1 (en) * | 2016-02-04 | 2017-08-10 | Panasonic Intellectual Property Management Co., Ltd. | Container use state determining device |
US20170372484A1 (en) * | 2016-06-24 | 2017-12-28 | Justin W Carlson | Machine vision cargo monitoring in a vehicle |
US20180045823A1 (en) * | 2016-08-09 | 2018-02-15 | Delphi Technologies, Inc. | Trailer dimension estimation with two dimensional radar and camera |
US10118576B2 (en) | 2002-06-11 | 2018-11-06 | Intelligent Technologies International, Inc. | Shipping container information recordation techniques |
US20180365501A1 (en) * | 2017-06-15 | 2018-12-20 | Blackberry Limited | Method & system for rear status detection |
US20190176760A1 (en) * | 2017-12-12 | 2019-06-13 | Toyota Jidosha Kabushiki Kaisha | Vehicle interior monitoring system, storage apparatus, and vehicle |
US20190199999A1 (en) * | 2017-12-22 | 2019-06-27 | Symbol Technologies, Llc | Systems and methods for detecting if package walls are beyond 3d depth camera range in commercial trailer loading |
US20190197716A1 (en) * | 2017-12-22 | 2019-06-27 | Symbol Technologies, Llc | Systems and methods for determining commercial trailer fullness |
US10401502B2 (en) * | 2016-06-07 | 2019-09-03 | Timothy B. Morford | Low energy Wi-Fi device for location |
US10521674B2 (en) * | 2017-12-22 | 2019-12-31 | Symbol Technologies, Llc | Trailer door monitoring and reporting |
EP3597490A1 (en) * | 2018-07-19 | 2020-01-22 | Fahrzeugwerk Bernard Krone GmbH & Co. KG | Method for monitoring the condition of commercial vehicles or interchangeable bodies for commercial vehicles |
US10546384B2 (en) | 2017-07-21 | 2020-01-28 | Blackberry Limited | Method and system for mapping to facilitate dispatching |
US20200066118A1 (en) * | 2018-08-24 | 2020-02-27 | Anthony Wiggins | Apparatus and method for indicating load level of vehicle |
US10643337B2 (en) * | 2017-12-22 | 2020-05-05 | Symbol Technologies, Llc | Systems and methods for segmenting and tracking package walls in commercial trailer loading |
US10670479B2 (en) | 2018-02-27 | 2020-06-02 | Methode Electronics, Inc. | Towing systems and methods using magnetic field sensing |
US10696109B2 (en) | 2017-03-22 | 2020-06-30 | Methode Electronics Malta Ltd. | Magnetolastic based sensor assembly |
CN112193479A (en) * | 2020-08-27 | 2021-01-08 | 浙江双友物流器械股份有限公司 | Intelligent binding system for cargo transportation |
US10955540B2 (en) | 2017-12-01 | 2021-03-23 | Aptiv Technologies Limited | Detection system |
US11014417B2 (en) | 2018-02-27 | 2021-05-25 | Methode Electronics, Inc. | Towing systems and methods using magnetic field sensing |
US11077825B2 (en) | 2019-12-16 | 2021-08-03 | Plusai Limited | System and method for anti-tampering mechanism |
US11084342B2 (en) | 2018-02-27 | 2021-08-10 | Methode Electronics, Inc. | Towing systems and methods using magnetic field sensing |
US11092668B2 (en) | 2019-02-07 | 2021-08-17 | Aptiv Technologies Limited | Trailer detection system and method |
US11097931B2 (en) * | 2018-04-23 | 2021-08-24 | Toyota Material Handling Manufacturing Sweden Ab | Material handling vehicle and a material handling system comprising such a vehicle |
EP3872725A1 (en) * | 2020-02-27 | 2021-09-01 | Continental Automotive GmbH | Method for determining loading state, system and transport management system |
US11125598B1 (en) * | 2020-04-23 | 2021-09-21 | Zebra Technologies Corporation | Three-dimensional (3D) imaging systems and methods for determining vehicle storage areas and vehicle door statuses |
US20210295664A1 (en) * | 2020-03-19 | 2021-09-23 | Logistics and Supply Chain MultiTech R&D Centre Limited | System and device for video-based vehicle surrounding awareness monitoring for air cargo transit security under all-weather driving conditions |
US11135882B2 (en) | 2018-02-27 | 2021-10-05 | Methode Electronics, Inc. | Towing systems and methods using magnetic field sensing |
US20210312643A1 (en) * | 2019-02-28 | 2021-10-07 | Ford Global Technologies, Llc | Method and apparatus for adaptive trailer content monitoring |
US20210319582A1 (en) * | 2018-08-27 | 2021-10-14 | Daimler Ag | Method(s) and System(s) for Vehicular Cargo Management |
US20210337135A1 (en) * | 2019-03-19 | 2021-10-28 | Ricoh Company, Ltd. | Imaging apparatus, vehicle and image capturing method |
US11221262B2 (en) | 2018-02-27 | 2022-01-11 | Methode Electronics, Inc. | Towing systems and methods using magnetic field sensing |
US11272144B2 (en) * | 2020-01-15 | 2022-03-08 | George Gorgees | Large vehicle backup camera apparatus |
US11313704B2 (en) * | 2019-12-16 | 2022-04-26 | Plusai, Inc. | System and method for a sensor protection assembly |
US11408995B2 (en) | 2020-02-24 | 2022-08-09 | Aptiv Technologies Limited | Lateral-bin monitoring for radar target detection |
US11435466B2 (en) | 2018-10-08 | 2022-09-06 | Aptiv Technologies Limited | Detection system and method |
US11470265B2 (en) | 2019-12-16 | 2022-10-11 | Plusai, Inc. | System and method for sensor system against glare and control thereof |
US11491832B2 (en) | 2018-02-27 | 2022-11-08 | Methode Electronics, Inc. | Towing systems and methods using magnetic field sensing |
US20220366556A1 (en) * | 2021-05-14 | 2022-11-17 | Carrier Corporation | Systems and methods for container condition determination in transport refrigiration |
US11650415B2 (en) | 2019-12-16 | 2023-05-16 | Plusai, Inc. | System and method for a sensor protection mechanism |
DE102021130882A1 (en) | 2021-11-25 | 2023-05-25 | Zf Cv Systems Global Gmbh | Driver assistance system for a commercial vehicle with a trailer and method for controlling such a system |
US11724669B2 (en) | 2019-12-16 | 2023-08-15 | Plusai, Inc. | System and method for a sensor protection system |
US11738694B2 (en) | 2019-12-16 | 2023-08-29 | Plusai, Inc. | System and method for anti-tampering sensor assembly |
US11754689B2 (en) | 2019-12-16 | 2023-09-12 | Plusai, Inc. | System and method for detecting sensor adjustment need |
US11772667B1 (en) | 2022-06-08 | 2023-10-03 | Plusai, Inc. | Operating a vehicle in response to detecting a faulty sensor using calibration parameters of the sensor |
WO2024059032A1 (en) * | 2022-09-13 | 2024-03-21 | Stoneridge, Inc. | Trailer change detection system for commercial vehicles |
WO2024076616A1 (en) * | 2022-10-06 | 2024-04-11 | Stoneridge Electronics Ab | Camera monitoring system including trailer monitoring video compression |
Families Citing this family (15)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US10089598B2 (en) | 2009-07-17 | 2018-10-02 | Spireon, Inc. | Methods and apparatus for monitoring and control of electronic devices |
US11210627B1 (en) | 2018-01-17 | 2021-12-28 | Spireon, Inc. | Monitoring vehicle activity and communicating insights from vehicles at an automobile dealership |
US10636280B2 (en) | 2018-03-08 | 2020-04-28 | Spireon, Inc. | Apparatus and method for determining mounting state of a trailer tracking device |
US10605847B1 (en) | 2018-03-28 | 2020-03-31 | Spireon, Inc. | Verification of installation of vehicle starter disable and enable circuit |
US11100194B2 (en) * | 2018-06-01 | 2021-08-24 | Blackberry Limited | Method and system for cargo sensing estimation |
CN108973863A (en) * | 2018-07-29 | 2018-12-11 | 合肥市智信汽车科技有限公司 | A kind of transport vehicle carriage cargo real time monitoring apparatus and its application method |
US11299219B2 (en) | 2018-08-20 | 2022-04-12 | Spireon, Inc. | Distributed volumetric cargo sensor system |
US11023742B2 (en) | 2018-09-07 | 2021-06-01 | Tusimple, Inc. | Rear-facing perception system for vehicles |
WO2020069520A1 (en) * | 2018-09-28 | 2020-04-02 | I.D. Systems, Inc. | Cargo sensors, cargo-sensing units, cargo-sensing systems, and methods of using the same |
US11087485B2 (en) | 2018-09-28 | 2021-08-10 | I.D. Systems, Inc. | Cargo sensors, cargo-sensing units, cargo-sensing systems, and methods of using the same |
US11475680B2 (en) | 2018-12-12 | 2022-10-18 | Spireon, Inc. | Cargo sensor system implemented using neural network |
CN110147838B (en) * | 2019-05-20 | 2021-07-02 | 苏州微创关节医疗科技有限公司 | Product specification inputting and detecting method and system |
WO2022132239A1 (en) * | 2020-12-16 | 2022-06-23 | Motion2Ai | Method, system and apparatus for managing warehouse by detecting damaged cargo |
US11801809B2 (en) | 2021-02-17 | 2023-10-31 | Corey Gautreau | Wheelchair wheel cleaning assembly |
NL2035868B1 (en) * | 2022-09-29 | 2024-08-09 | Continental Automotive Tech Gmbh | Vehicle comprising a cargo space |
Citations (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20070013779A1 (en) * | 2003-08-28 | 2007-01-18 | Jack Gin | Dual surveillance camera system |
US20150172518A1 (en) * | 2013-12-13 | 2015-06-18 | Convoy Technologies, LLC, | Monitoring system and method including selectively mountable wireless camera |
- 2015-04-08: US application US14/682,086, published as US20160297361A1 (not active, Abandoned)
- 2017-04-19: US application US15/491,040, granted as US10311315B2 (Active)
Patent Citations (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20070013779A1 (en) * | 2003-08-28 | 2007-01-18 | Jack Gin | Dual surveillance camera system |
US20150172518A1 (en) * | 2013-12-13 | 2015-06-18 | Convoy Technologies, LLC, | Monitoring system and method including selectively mountable wireless camera |
Cited By (66)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US10118576B2 (en) | 2002-06-11 | 2018-11-06 | Intelligent Technologies International, Inc. | Shipping container information recordation techniques |
US20170230620A1 (en) * | 2016-02-04 | 2017-08-10 | Panasonic Intellectual Property Management Co., Ltd. | Container use state determining device |
US10401502B2 (en) * | 2016-06-07 | 2019-09-03 | Timothy B. Morford | Low energy Wi-Fi device for location |
US20170372484A1 (en) * | 2016-06-24 | 2017-12-28 | Justin W Carlson | Machine vision cargo monitoring in a vehicle |
US10163219B2 (en) * | 2016-06-24 | 2018-12-25 | Fca Us Llc | Machine vision cargo monitoring in a vehicle |
US20180045823A1 (en) * | 2016-08-09 | 2018-02-15 | Delphi Technologies, Inc. | Trailer dimension estimation with two dimensional radar and camera |
US10481255B2 (en) * | 2016-08-09 | 2019-11-19 | Aptiv Technologies Limited | Trailer dimension estimation with two dimensional radar and camera |
US10696109B2 (en) | 2017-03-22 | 2020-06-30 | Methode Electronics Malta Ltd. | Magnetolastic based sensor assembly |
US10940726B2 (en) | 2017-03-22 | 2021-03-09 | Methode Electronics Malta Ltd. | Magnetoelastic based sensor assembly |
US20180365501A1 (en) * | 2017-06-15 | 2018-12-20 | Blackberry Limited | Method & system for rear status detection |
US10339392B2 (en) * | 2017-06-15 | 2019-07-02 | Blackberry Limited | Method and system for rear status detection |
US10949680B2 (en) * | 2017-06-15 | 2021-03-16 | Blackberry Limited | Method and system for rear status detection |
US11689700B2 (en) | 2017-07-21 | 2023-06-27 | Blackberry Limited | Method and system for mapping to facilitate dispatching |
US10546384B2 (en) | 2017-07-21 | 2020-01-28 | Blackberry Limited | Method and system for mapping to facilitate dispatching |
US11474224B2 (en) | 2017-12-01 | 2022-10-18 | Aptiv Technologies Limited | Detection system |
US10955540B2 (en) | 2017-12-01 | 2021-03-23 | Aptiv Technologies Limited | Detection system |
US20190176760A1 (en) * | 2017-12-12 | 2019-06-13 | Toyota Jidosha Kabushiki Kaisha | Vehicle interior monitoring system, storage apparatus, and vehicle |
US10841559B2 (en) * | 2017-12-22 | 2020-11-17 | Symbol Technologies, Llc | Systems and methods for detecting if package walls are beyond 3D depth camera range in commercial trailer loading |
US20190197716A1 (en) * | 2017-12-22 | 2019-06-27 | Symbol Technologies, Llc | Systems and methods for determining commercial trailer fullness |
US10657666B2 (en) * | 2017-12-22 | 2020-05-19 | Symbol Technologies, Llc | Systems and methods for determining commercial trailer fullness |
US20190199999A1 (en) * | 2017-12-22 | 2019-06-27 | Symbol Technologies, Llc | Systems and methods for detecting if package walls are beyond 3d depth camera range in commercial trailer loading |
US10521674B2 (en) * | 2017-12-22 | 2019-12-31 | Symbol Technologies, Llc | Trailer door monitoring and reporting |
US10643337B2 (en) * | 2017-12-22 | 2020-05-05 | Symbol Technologies, Llc | Systems and methods for segmenting and tracking package walls in commercial trailer loading |
US11084342B2 (en) | 2018-02-27 | 2021-08-10 | Methode Electronics, Inc. | Towing systems and methods using magnetic field sensing |
US11014417B2 (en) | 2018-02-27 | 2021-05-25 | Methode Electronics, Inc. | Towing systems and methods using magnetic field sensing |
US10670479B2 (en) | 2018-02-27 | 2020-06-02 | Methode Electronics, Inc. | Towing systems and methods using magnetic field sensing |
US11491832B2 (en) | 2018-02-27 | 2022-11-08 | Methode Electronics, Inc. | Towing systems and methods using magnetic field sensing |
US11221262B2 (en) | 2018-02-27 | 2022-01-11 | Methode Electronics, Inc. | Towing systems and methods using magnetic field sensing |
US11135882B2 (en) | 2018-02-27 | 2021-10-05 | Methode Electronics, Inc. | Towing systems and methods using magnetic field sensing |
US11097931B2 (en) * | 2018-04-23 | 2021-08-24 | Toyota Material Handling Manufacturing Sweden Ab | Material handling vehicle and a material handling system comprising such a vehicle |
EP3597490A1 (en) * | 2018-07-19 | 2020-01-22 | Fahrzeugwerk Bernard Krone GmbH & Co. KG | Method for monitoring the condition of commercial vehicles or interchangeable bodies for commercial vehicles |
US20200066118A1 (en) * | 2018-08-24 | 2020-02-27 | Anthony Wiggins | Apparatus and method for indicating load level of vehicle |
US20210319582A1 (en) * | 2018-08-27 | 2021-10-14 | Daimler Ag | Method(s) and System(s) for Vehicular Cargo Management |
US12080015B2 (en) * | 2018-08-27 | 2024-09-03 | Mercedes-Benz Group AG | Method(s) and system(s) for vehicular cargo management |
US11768284B2 (en) | 2018-10-08 | 2023-09-26 | Aptiv Technologies Limited | Detection system and method |
US11435466B2 (en) | 2018-10-08 | 2022-09-06 | Aptiv Technologies Limited | Detection system and method |
US11092668B2 (en) | 2019-02-07 | 2021-08-17 | Aptiv Technologies Limited | Trailer detection system and method |
US20210312643A1 (en) * | 2019-02-28 | 2021-10-07 | Ford Global Technologies, Llc | Method and apparatus for adaptive trailer content monitoring |
US11922641B2 (en) * | 2019-02-28 | 2024-03-05 | Ford Global Technologies, Llc | Method and apparatus for adaptive trailer content monitoring |
US11546526B2 (en) * | 2019-03-19 | 2023-01-03 | Ricoh Company, Ltd. | Imaging apparatus, vehicle and image capturing method |
US20210337135A1 (en) * | 2019-03-19 | 2021-10-28 | Ricoh Company, Ltd. | Imaging apparatus, vehicle and image capturing method |
US20220417404A1 (en) | 2019-12-16 | 2022-12-29 | Plusai, Inc. | System and method for sensor system against glare and control thereof |
US11722787B2 (en) | 2019-12-16 | 2023-08-08 | Plusai, Inc. | System and method for sensor system against glare and control thereof |
US11077825B2 (en) | 2019-12-16 | 2021-08-03 | Plusai Limited | System and method for anti-tampering mechanism |
US11470265B2 (en) | 2019-12-16 | 2022-10-11 | Plusai, Inc. | System and method for sensor system against glare and control thereof |
US11754689B2 (en) | 2019-12-16 | 2023-09-12 | Plusai, Inc. | System and method for detecting sensor adjustment need |
US11313704B2 (en) * | 2019-12-16 | 2022-04-26 | Plusai, Inc. | System and method for a sensor protection assembly |
US11738694B2 (en) | 2019-12-16 | 2023-08-29 | Plusai, Inc. | System and method for anti-tampering sensor assembly |
US20220252437A1 (en) * | 2019-12-16 | 2022-08-11 | Plusai, Inc. | System and method for a sensor protection assembly |
US11731584B2 (en) | 2019-12-16 | 2023-08-22 | Plusai, Inc. | System and method for anti-tampering mechanism |
US11650415B2 (en) | 2019-12-16 | 2023-05-16 | Plusai, Inc. | System and method for a sensor protection mechanism |
US11724669B2 (en) | 2019-12-16 | 2023-08-15 | Plusai, Inc. | System and method for a sensor protection system |
US11662231B2 (en) * | 2019-12-16 | 2023-05-30 | Plusai, Inc. | System and method for a sensor protection assembly |
US11272144B2 (en) * | 2020-01-15 | 2022-03-08 | George Gorgees | Large vehicle backup camera apparatus |
US11408995B2 (en) | 2020-02-24 | 2022-08-09 | Aptiv Technologies Limited | Lateral-bin monitoring for radar target detection |
US11802961B2 (en) | 2020-02-24 | 2023-10-31 | Aptiv Technologies Limited | Lateral-bin monitoring for radar target detection |
EP3872725A1 (en) * | 2020-02-27 | 2021-09-01 | Continental Automotive GmbH | Method for determining loading state, system and transport management system |
US20210295664A1 (en) * | 2020-03-19 | 2021-09-23 | Logistics and Supply Chain MultiTech R&D Centre Limited | System and device for video-based vehicle surrounding awareness monitoring for air cargo transit security under all-weather driving conditions |
US11410513B2 (en) * | 2020-03-19 | 2022-08-09 | Logistics and Supply Chain MultiTech R&D Centre Limited | System and device for video-based vehicle surrounding awareness monitoring for air cargo transit security under all-weather driving conditions |
US11125598B1 (en) * | 2020-04-23 | 2021-09-21 | Zebra Technologies Corporation | Three-dimensional (3D) imaging systems and methods for determining vehicle storage areas and vehicle door statuses |
CN112193479A (en) * | 2020-08-27 | 2021-01-08 | 浙江双友物流器械股份有限公司 | Intelligent binding system for cargo transportation |
US20220366556A1 (en) * | 2021-05-14 | 2022-11-17 | Carrier Corporation | Systems and methods for container condition determination in transport refrigeration |
DE102021130882A1 (en) | 2021-11-25 | 2023-05-25 | Zf Cv Systems Global Gmbh | Driver assistance system for a commercial vehicle with a trailer and method for controlling such a system |
US11772667B1 (en) | 2022-06-08 | 2023-10-03 | Plusai, Inc. | Operating a vehicle in response to detecting a faulty sensor using calibration parameters of the sensor |
WO2024059032A1 (en) * | 2022-09-13 | 2024-03-21 | Stoneridge, Inc. | Trailer change detection system for commercial vehicles |
WO2024076616A1 (en) * | 2022-10-06 | 2024-04-11 | Stoneridge Electronics Ab | Camera monitoring system including trailer monitoring video compression |
Also Published As
Publication number | Publication date |
---|---|
US10311315B2 (en) | 2019-06-04 |
US20170262717A1 (en) | 2017-09-14 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
US10311315B2 (en) | Camera array system and method to detect a load status of a semi-trailer truck | |
US9551788B2 (en) | Fleet pan to provide measurement and location of a stored transport item while maximizing space in an interior cavity of a trailer | |
US10963978B2 (en) | Enhanced alert/notification system for law enforcement identifying and tracking of stolen vehicles and cargo | |
US8933802B2 (en) | Switch and actuator coupling in a chassis of a container associated with an intermodal freight transport system | |
US10783487B2 (en) | System operated responsive to data bearing records | |
US8717193B2 (en) | Method and system for providing traffic alerts | |
US20140143169A1 (en) | Systems, devices, and methods for carrier verification in a freight transportation network | |
US20140122187A1 (en) | Fleet Vehicle Management Systems and Methods | |
US20140297058A1 (en) | System and Method for Capturing and Preserving Vehicle Event Data | |
US11663890B2 (en) | Systems and methods for artificial intelligence (AI) theft prevention and recovery | |
CN202486835U (en) | Logistics dynamic management system based on Internet of things technology | |
US20150339624A1 (en) | Systems and methods for device-based carrier verification in a freight transportation network | |
US20170053234A1 (en) | Method and systems for tracking assets of shipping transactions in real time | |
US20150039466A1 (en) | System, method, and apparatus for assessing the condition of tangible property that is loaned, rented, leased, borrowed or placed in the trust of another person | |
US11429924B2 (en) | System for parcel transport and tracking operated responsive to data bearing records | |
US20210279680A1 (en) | System for parcel pickup and delivery operated responsive to data bearing records | |
US20200132530A1 (en) | Apparatus and method for verifying a shipping load | |
US9820097B1 (en) | Geofence location detection | |
EP3662426A1 (en) | Tracking system and method for monitoring and ensuring security of shipments | |
US11568696B2 (en) | System operated responsive to data bearing records | |
CA3065993C (en) | System operated responsive to data bearing records | |
US20100017125A1 (en) | Enhanced Information Security System | |
US20120236835A1 (en) | Method and system for recording a geographical location from a mobile communication device | |
WO2019075138A1 (en) | Enhanced alert/notification system for law enforcement identifying and tracking of stolen vehicles and cargo | |
CN114297282A (en) | Rubber-tyred vehicle information uploading method and system and storage medium |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
AS | Assignment |
Owner name: WELLS FARGO BANK, NATIONAL ASSOCIATION, CALIFORNIA
Free format text: PATENT SECURITY AGREEMENT;ASSIGNORS:SPIREON, INC.;INILEX, INC.;REEL/FRAME:040056/0153
Effective date: 20160830
STCB | Information on status: application discontinuation |
Free format text: EXPRESSLY ABANDONED -- DURING EXAMINATION |