WO2022116688A1 - An image projection method and system - Google Patents
An image projection method and system
- Publication number: WO2022116688A1 (application PCT/CN2021/123236)
- Authority: WIPO (PCT)
- Prior art keywords: vehicle, light, environment, projection, intensity
Classifications
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N9/00—Details of colour television systems
- H04N9/12—Picture reproducers
- H04N9/31—Projection devices for colour picture display, e.g. using electronic spatial light modulators [ESLM]
- H04N9/3141—Constructional details thereof
-
- G—PHYSICS
- G09—EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
- G09F—DISPLAYING; ADVERTISING; SIGNS; LABELS OR NAME-PLATES; SEALS
- G09F19/00—Advertising or display means not otherwise provided for
- G09F19/12—Advertising or display means not otherwise provided for using special optical effects
- G09F19/18—Advertising or display means not otherwise provided for using special optical effects involving the use of optical projection means, e.g. projection of images on clouds
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04L—TRANSMISSION OF DIGITAL INFORMATION, e.g. TELEGRAPHIC COMMUNICATION
- H04L67/00—Network arrangements or protocols for supporting network services or applications
- H04L67/01—Protocols
- H04L67/12—Protocols specially adapted for proprietary or special-purpose networking environments, e.g. medical networks, sensor networks, networks in vehicles or remote metering networks
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N9/00—Details of colour television systems
- H04N9/12—Picture reproducers
- H04N9/31—Projection devices for colour picture display, e.g. using electronic spatial light modulators [ESLM]
- H04N9/3141—Constructional details thereof
- H04N9/315—Modulator illumination systems
- H04N9/3155—Modulator illumination systems for controlling the light source
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N9/00—Details of colour television systems
- H04N9/12—Picture reproducers
- H04N9/31—Projection devices for colour picture display, e.g. using electronic spatial light modulators [ESLM]
- H04N9/3141—Constructional details thereof
- H04N9/317—Convergence or focusing systems
Definitions
- the present application relates to the field of information interaction, and in particular, to an image projection method and system.
- the information interaction between pedestrians and vehicles, and between vehicles and vehicles is realized by projecting images.
- the vehicle state is projected near the vehicle through an image projection device, so that the surrounding pedestrians or vehicles can learn the vehicle state.
- Existing image projection devices generally project using only their own built-in light source and cannot adjust to different environments. It is therefore necessary to provide an image projection method and system that adapt to different environments.
- One of the embodiments of the present application provides an image projection system.
- the system includes at least one processor and at least one storage device, where the storage device is configured to store instructions, and when the at least one processor executes the instructions, the method according to any embodiment of the present application is implemented.
- the device includes: an active light source for providing active light; a condensing component for receiving light from the environment in which the vehicle is located and reflecting it to form reflected light; a projection component including a projection film, the projection film being located in the propagation paths of the reflected light and the active light; and a control component for controlling the projection component, according to the condition parameters of the environment in which the vehicle is located, to project an image representing the vehicle state into the environment based on the light provided by the active light source or the condensing component.
- FIG. 3 is an exemplary flowchart of an image projection method according to some embodiments of the present application.
- FIG. 5 is a schematic diagram of an image projection apparatus according to some embodiments of the present application.
- the words "system", "device", "unit" and/or "module" used in this application are one way to distinguish different components, elements, parts, sections, or assemblies at different levels.
- the embodiments of the present application can be applied to different transportation systems, including but not limited to one or a combination of land, sea, aviation, aerospace, and the like.
- for example, the transportation systems may include management and/or distribution systems for taxis, private cars, ride-sharing, buses, chauffeured driving, trains, motor trains, high-speed rail, ships, airplanes, hot air balloons, unmanned vehicles, delivery, and the like.
- the application scenarios of the different embodiments of the present application include, but are not limited to, one or a combination of a web page, a browser plug-in, a client, a customized system, an enterprise internal analysis system, an artificial intelligence robot, and the like. It should be understood that the application scenarios of the system and method of the present application are only some examples or embodiments of the present application. Those of ordinary skill in the art can, without creative effort, apply the present application to other similar scenarios according to these drawings, for example, other similar guided user parking systems.
- "passenger", "client", "user terminal", "customer", "demander", "service demander", "consumer", "service user", etc. described in this application are interchangeable and refer to the party who needs or orders the service, which can be an individual or a tool.
- "driver", "driver end", "provider", "supplier", "service provider", "service party", etc. described in this application are also interchangeable, referring to persons, tools or other entities that provide services or assist in providing services.
- the "user” described in this application may be a party who needs or subscribes to a service, or a party who provides services or assists in providing services.
- FIG. 1 is a schematic diagram of an application scenario of an image projection system according to some embodiments of the present application.
- the image projection system 100 can be used to control an image projection device to project an image including vehicle state information (referred to as vehicle state) into the environment.
- the image projection system 100 may obtain the vehicle state at the current time and detect the condition parameters (eg, light intensity, visibility, etc.) of the environment where the vehicle is located at the current time, and based on the condition parameters ( For example, light intensity), the image projection device is controlled to project an image (eg, a pattern) including vehicle status information into the environment.
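The control flow described above, obtaining the vehicle state, detecting the condition parameters, and projecting accordingly, might be sketched as follows. All names and the 50-lux threshold are illustrative assumptions, not details of the disclosed embodiments.

```python
# Hypothetical sketch of the projection control described above.
# The threshold and all identifiers are illustrative assumptions.

AMBIENT_LIGHT_THRESHOLD_LUX = 50.0  # assumed cutoff between dim and bright


def choose_light_source(light_intensity_lux: float) -> str:
    """Pick the light source for the projection component.

    In a dim environment the active light source is used; in a bright
    environment the condensing component's reflected ambient light may
    suffice.
    """
    if light_intensity_lux < AMBIENT_LIGHT_THRESHOLD_LUX:
        return "active"
    return "reflected"


def project_vehicle_state(vehicle_state: str, light_intensity_lux: float) -> dict:
    """Build a projection command for an image representing the vehicle state."""
    return {
        "image": f"icon:{vehicle_state}",  # e.g. a left-turn pattern
        "source": choose_light_source(light_intensity_lux),
    }
```

In this sketch the choice of light source stands in for the "projection mode" that the system selects from the environment's condition parameters.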
- a vehicle refers to a means of transportation, which can include taxis, special cars, hitchhikers, buses, trains, bullet trains, high-speed rail, ships, airplanes, unmanned vehicles, and the like.
- the image projection system 100 may be an online service platform for providing Internet services.
- the image projection system 100 may be applied to a car-hailing service platform that provides transportation services.
- the online car-hailing service platform can provide transportation services such as taxi calls, express train calls, special car calls, minibus calls, carpooling, bus services, driver hiring, pick-up and drop-off services, and chauffeur-driven services.
- the image projection system 100 may also be applied to service platforms such as express delivery, takeaway, and travel (e.g., tourism).
- the image projection system 100 may be applied to a navigation service platform.
- the image projection system 100 may be applied to an unmanned system.
- the following uses a navigation service platform as an example to describe the application of the image projection system 100 . This is not intended to be limiting, and the image projection system 100 may be applied to any service platform.
- server 110 may be used to process information and/or data related to image projection system 100, eg, to process vehicle state projections.
- server 110 may be a single server or a group of servers. Server groups may be centralized or distributed (eg, server 110 may be a distributed system).
- server 110 may be local or remote.
- the server 110 may access information and/or data stored in the first user terminal 120 , the second user terminal 140 and/or the storage device 130 via the network 150 .
- the server 110 may be directly connected to the first user terminal 120, the second user terminal 140 and/or the storage device 130 to access stored information and/or data.
- server 110 may be implemented on a cloud platform or on-board computer.
- the cloud platform may include a private cloud, a public cloud, a hybrid cloud, a community cloud, a distribution cloud, an internal cloud, a multi-layer cloud, etc., or any combination thereof.
- server 110 may include processing device 112 .
- Processing device 112 may process information and/or data related to image projection system 100 to perform one or more functions described herein. For example, the processing device 112 may obtain the projection request of the vehicle state sent by the first user terminal 120 and/or the second user terminal 140, obtain the vehicle state at the current time, and detect the condition parameters of the environment in which the vehicle is located at the current time (eg, light intensity, visibility), and then based on the condition parameters of the environment in which the vehicle is located, the image projection device is controlled to project an image representing the state of the vehicle into the environment.
- the processing device 112 may also acquire the vehicle state of the vehicle at the next time, obtain the condition parameters of the environment where the vehicle is located at the next time, and then determine, based on those condition parameters, whether to switch the projection mode corresponding to the current time. In some embodiments, the processing device 112 may also adjust the power of the active light source to adjust the light intensity based on the light intensity in the environment in which the vehicle is located. In some embodiments, processing device 112 may include one or more processing engines (eg, a single-chip processing engine or a multi-chip processing engine).
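The power adjustment mentioned above might look like the following sketch. The linear mapping, gain, and power limits are assumptions for illustration; the disclosure does not specify a control law.

```python
# Illustrative sketch of adjusting active-light-source power from ambient
# light intensity. The linear mapping and the limits are assumptions.

MIN_POWER_W = 1.0   # assumed minimum drive power
MAX_POWER_W = 10.0  # assumed maximum drive power


def adjust_source_power(ambient_lux: float, gain: float = 0.02) -> float:
    """Scale the active light source's power with ambient light intensity
    so the projected image stays visible, clamped to the device limits."""
    power = MIN_POWER_W + gain * ambient_lux
    return max(MIN_POWER_W, min(MAX_POWER_W, power))
```

A proportional rule like this keeps the projected image's contrast roughly constant as the environment brightens, at the cost of higher power draw and heat dissipation in bright surroundings.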
- the processing device 112 may include a central processing unit (CPU), an application-specific integrated circuit (ASIC), an application-specific instruction-set processor (ASIP), a graphics processing unit (GPU), a physics processing unit (PPU), a digital signal processor (DSP), a field-programmable gate array (FPGA), a programmable logic device (PLD), a controller, a microcontroller unit, a reduced instruction set computer (RISC), a microprocessor, etc., or any combination thereof.
- Storage device 130 may be used to store data and/or instructions related to image projection requests.
- the storage device 130 may store data obtained/retrieved from the first user terminal 120 and/or the second user terminal 140.
- storage device 130 may store data and/or instructions that server 110 executes or uses to accomplish the example methods described in this application.
- storage device 130 may include mass storage, removable storage, volatile read-write memory, read-only memory (ROM), the like, or any combination thereof.
- Exemplary mass storage may include magnetic disks, optical disks, solid state disks, and the like.
- Exemplary removable storage may include flash drives, floppy disks, optical disks, memory cards, compact disks, magnetic tapes, and the like.
- Exemplary volatile read-write memory may include random access memory (RAM).
- Exemplary RAMs may include dynamic random access memory (DRAM), double data rate synchronous dynamic random access memory (DDR SDRAM), static random access memory (SRAM), thyristor random access memory (T-RAM), zero-capacitor random access memory (Z-RAM), etc.
- Exemplary read-only memory may include mask read-only memory (MROM), programmable read-only memory (PROM), erasable programmable read-only memory (EPROM), electrically erasable programmable read-only memory (EEPROM), compact disc read-only memory (CD-ROM), digital versatile disc read-only memory, etc.
- storage device 130 may be implemented on a cloud platform.
- the cloud platform may include a private cloud, a public cloud, a hybrid cloud, a community cloud, a distribution cloud, an internal cloud, a multi-layer cloud, etc., or any combination thereof.
- storage device 130 may be connected to network 150 to communicate with one or more components of image projection system 100 (eg, server 110, first user terminal 120, second user terminal 140).
- One or more components of image projection system 100 may access data or instructions stored in storage device 130 via network 150 .
- storage device 130 may be directly connected to or in communication with one or more components of image projection system 100 (eg, server 110 , first user terminal 120 , second user terminal 140 ).
- storage device 130 may be part of server 110 .
- storage device 130 may be a separate memory.
- the first user terminal 120 may be a person, tool, or other entity directly related to the image projection request.
- a user may be an image projection requester.
- “user” and “user terminal” can be used interchangeably.
- the first user terminal 120 may include a desktop computer 120-1, a notebook computer 120-2, an in-vehicle device 120-3 in a motor vehicle, and a mobile device 120-4, etc., or any combination thereof.
- the mobile device 120-4 may include a smart home device, a wearable device, a smart mobile device, a virtual reality device, an augmented reality device, the like, or any combination thereof.
- smart home devices may include smart lighting devices, smart appliance control devices, smart monitoring devices, smart TVs, smart cameras, walkie-talkies, etc., or any combination thereof.
- the wearable device may include smart bracelets, smart footwear, smart glasses, smart helmets, smart watches, smart wear, smart backpacks, smart accessories, etc., or any combination thereof.
- an intelligent mobile device may include a smartphone, a personal digital assistant (PDA), a gaming device, a navigation device, a point of sale (POS), the like, or any combination thereof.
- the virtual reality device and/or augmented reality device may include a virtual reality headset, virtual reality glasses, virtual reality eyewear, augmented virtual reality helmet, augmented reality glasses, augmented reality eyewear, etc., or any combination thereof.
- virtual reality devices and/or augmented reality devices may include Google Glass, Oculus Rift, HoloLens, or Gear VR, among others.
- the onboard equipment 120-3 in the motor vehicle may include an onboard computer, an onboard television, and the like.
- the second user terminal 140 may be a similar or the same device as the first user terminal 120 .
- the second user terminal 140 may include a desktop computer 140-1, a laptop computer 140-2, an in-vehicle device 140-3 in a motor vehicle, and a mobile device 140-4, etc., or any combination thereof.
- the first user terminal 120 and/or the second user terminal 140 may be devices with positioning technology. In some embodiments, the first user terminal 120 and/or the second user terminal 140 may communicate with another positioning device to determine the location of the first user terminal 120 and/or the second user terminal 140 . In some embodiments, the first user terminal 120 and/or the second user terminal 140 may send the positioning information to the server 110 . In some embodiments, the first user terminal 120 and/or the second user terminal 140 may include image projection devices. In some embodiments, the first user terminal 120 and/or the second user terminal 140 may be communicatively connected to an image projection device for projecting an image representing the state of the vehicle into the environment through the image projection device.
- the first user terminal 120 and/or the second user terminal 140 may correspond to a vehicle.
- a means of transport may also be referred to as a vehicle. It is envisaged that the vehicle may be an electric vehicle, a fuel cell vehicle, a hybrid vehicle, or a conventional internal combustion engine vehicle, for example, a coupe, a sedan, a pickup truck, a station wagon, a sports utility vehicle (SUV), a minivan, or a modified car.
- the transport can be equipped with various sensors mounted to the body.
- the sensors may be configured to capture data as the vehicle travels along the trajectory.
- the sensors may include a LiDAR scanner configured to scan the surroundings and acquire point clouds and/or a 3-D camera configured to acquire digital images.
- the sensor may be a sensor used in a navigation unit, such as a GPS receiver and one or more IMU sensors.
- GPS is a global navigation satellite system that provides geographic positioning and time information to GPS receivers.
- An IMU is an electronic device that uses various inertial sensors such as accelerometers and gyroscopes, and sometimes magnetometers, to measure and provide specific forces, angular rates, and sometimes magnetic fields around the vehicle.
- by combining the GPS receiver and the IMU sensors, the sensors can provide real-time pose information of the vehicle as it travels, including the position and orientation of the vehicle at each timestamp.
- Network 150 may facilitate the exchange of information and/or data.
- in some embodiments, one or more components of the image projection system 100 (eg, the server 110, the first user terminal 120, the storage device 130, the second user terminal 140) may exchange information and/or data with other components of the image projection system 100 via the network 150. For example, the server 110 may obtain the projection request from the first user terminal 120 via the network 150.
- the network 150 may be a wired network or a wireless network, or the like, or any combination thereof.
- the network 150 may include a cable network, a wired network, a fiber optic network, a telecommunications network, an intranet, the Internet, a local area network (LAN), a wide area network (WAN), a wireless local area network (WLAN), a metropolitan area network (MAN), a Public Switched Telephone Network (PSTN), a Bluetooth network, a ZigBee network, a Near Field Communication (NFC) network, a Global System for Mobile communications (GSM) network, a Code Division Multiple Access (CDMA) network, a Time Division Multiple Access (TDMA) network, a General Packet Radio Service (GPRS) network, an Enhanced Data Rates for GSM Evolution (EDGE) network, a Wideband Code Division Multiple Access (WCDMA) network, a High Speed Downlink Packet Access (HSDPA) network, a Long Term Evolution (LTE) network, a User Datagram Protocol (UDP) network, a Transmission Control Protocol/Internet Protocol (TCP/IP) network, a Short Message Service (SMS) network, a Wireless Application Protocol (WAP) network, an Ultra Wideband (UWB) network, etc., or any combination thereof.
- Information source 160 is one source that provides additional information to image projection system 100 .
- the information source 160 may be used to provide the system with information related to image projection, such as projection time, visibility, weather information, legal and regulatory information, news information, life information, life guide information, and the like.
- the information source 160 may exist in the form of a single central server, in the form of multiple servers connected through a network, or in the form of a large number of personal devices. When the information source 160 exists in the form of a large number of personal devices, these devices can upload user-generated content (eg, text, voice, images, videos) to a cloud server, and the cloud server together with the numerous personal devices connected to it make up the information source 160.
- FIG. 2 is a block diagram of a processing device according to some embodiments of the present application.
- the processing device 112 may include an acquisition module 210 , a detection module 220 and a projection module 230 .
- the obtaining module 210 can obtain the vehicle state at the current time. The vehicle state may include one or more of going straight, turning left, turning right, making a U-turn, changing lanes to the left, changing lanes to the right, stopping to give way, decelerating to give way, yielding to oncoming vehicles, parking, and vehicle failure. For a detailed description of obtaining the vehicle state at the current time, reference may be made to FIG. 3, which will not be repeated here.
- the detection module 220 may acquire condition parameters of the environment where the vehicle is located at the current time, where the condition parameters include light intensity and/or visibility in the environment where the vehicle is located.
- the detection module 220 may obtain condition parameters associated with the environment in which the vehicle is located from an environment perception system of the vehicle. For a detailed description of detecting the condition parameters of the environment in which the vehicle is located at the current time, reference may be made to FIG. 3, which will not be repeated here.
- the system and its modules in the present application can be implemented not only by hardware circuits such as very-large-scale integrated circuits or gate arrays, semiconductors such as logic chips and transistors, or programmable hardware devices such as field-programmable gate arrays and programmable logic devices, but also by software executed by various types of processors, or by a combination of the above hardware circuits and software (eg, firmware).
- the above description of the image projection system and its modules is only for the convenience of description, and does not limit the present application to the scope of the illustrated embodiments. It can be understood that for those skilled in the art, after understanding the principle of the system, various modules may be combined arbitrarily, or a subsystem may be formed to connect with other modules without departing from the principle.
- the acquisition module 210, the detection module 220 and the projection module 230 may be different modules in a system, or a single module may implement the functions of two or more of the above modules.
- each module may share one storage module, and each module may also have its own storage module. Such variations are all within the protection scope of the present application.
- Step 310 Obtain the vehicle status at the current time.
- step 310 may be implemented by acquisition module 210 .
- the vehicle state may be a running state or a stopped state of the vehicle during operation.
- the vehicle state may include one or more of going straight, turning left, turning right, making a U-turn, changing lanes to the left, changing lanes to the right, stopping to give way, decelerating to give way, yielding to oncoming vehicles, parking, and vehicle failure.
- the obtaining module 210 may obtain vehicle status information by communicating with the first user terminal, the second user terminal and/or the storage device 130 .
- the obtaining module 210 may obtain the vehicle state at the current time directly or through the network 150 from the first user terminal and the second user terminal.
- the obtaining module 210 may obtain the vehicle state at the current time corresponding to the planned navigation information stored therein directly or through a network from the storage device 130 .
- the obtaining module 210 may also call the data interface to obtain the vehicle state at the current time from the navigation device or the navigation module of the vehicle.
- Step 320 Detect a condition parameter of the environment where the vehicle is located at the current time, where the condition parameter includes the light intensity in the environment where the vehicle is located.
- step 320 may be implemented by detection module 220 .
- the light intensity of the environment where the vehicle is located at the current time may be detected by a light intensity sensor, and the visibility of the environment where the vehicle is located at the current time may be detected by a visibility detector.
- the light intensity sensor and/or visibility detector can be mounted on the vehicle or on the user terminal.
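Step 320 can be sketched as a small detection routine. The sensor interfaces below are hypothetical stubs standing in for the light intensity sensor and visibility detector; real hardware drivers would replace them.

```python
# Hypothetical sketch of step 320: reading the environment's condition
# parameters. The sensor objects are assumed to expose a read() method;
# this interface is an illustrative assumption.
from dataclasses import dataclass


@dataclass
class ConditionParameters:
    """Condition parameters of the environment in which the vehicle is located."""
    light_intensity_lux: float
    visibility_m: float


def detect_condition_parameters(light_sensor, visibility_detector) -> ConditionParameters:
    """Poll the vehicle- or terminal-mounted sensors for the current environment."""
    return ConditionParameters(
        light_intensity_lux=light_sensor.read(),
        visibility_m=visibility_detector.read(),
    )
```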
- Step 330 based on the condition parameters of the environment where the vehicle is located, control the image projection device to project an image representing the state of the vehicle into the environment.
- step 330 may be implemented by projection module 230 .
- the projection module 230 may determine whether the vehicle state of the vehicle at the current time satisfies the projection conditions. If so, the image projection device is controlled to project the image representing the vehicle state; if not, the next vehicle state continues to be acquired.
- Projection conditions may include reference types of vehicle states for which projection is required. The vehicle state at the current time satisfies the projection conditions when its type matches or is the same as a reference type; it does not satisfy the projection conditions when its type does not match, and is not the same as, any reference type.
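The projection-condition check described above amounts to a membership test against the reference types. The particular set below is an assumed configuration for illustration; the disclosure leaves the reference types open.

```python
# Sketch of the projection-condition check in step 330. The reference
# set is an assumed configuration, not part of the disclosure.

REFERENCE_STATES = {
    "turning_left", "turning_right", "u_turn",
    "changing_lanes_left", "changing_lanes_right",
    "stopping_to_give_way", "vehicle_failure",
}


def meets_projection_condition(vehicle_state: str) -> bool:
    """Return True when the current vehicle state matches a reference
    type of vehicle state for which projection is required."""
    return vehicle_state in REFERENCE_STATES
```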
- the next time is the time at which the next vehicle state of the vehicle needs to be projected.
- the next time may be a time shortly after the current time, or another later time after the current time.
- the obtaining module 210 may obtain the vehicle state of the vehicle at the next time.
- the obtaining module 210 obtains the vehicle state of the vehicle at the next time in a manner similar to that of obtaining the vehicle state at the current time.
- for the manner in which the obtaining module 210 obtains the vehicle state at the current time, please refer to the description of step 310, which will not be repeated here.
- the vehicle state at the next time may include one or more of the following states: going straight, turning left, turning right, making a U-turn, changing lanes to the left, changing lanes to the right, stopping to yield, decelerating to yield, yielding to oncoming traffic, parking, and vehicle failure.
- the vehicle state at the next time and the vehicle state at the current time may be the same or different.
- the environment where the vehicle is located at the next time may be a physical environment where the vehicle is located at the next time, for example, a road, a cell, a parking lot, and the like.
- the environment where the vehicle is located at the next time and the environment where the vehicle is located at the current time may be the same or different.
- the condition parameters of the environment at the next time may include light intensity, visibility, etc., or a combination thereof.
- the status parameter of the environment at the next time may be the same as or different from the status parameter of the current time.
- the detection module 220 may acquire the condition parameters of the environment where the vehicle is located at the next time. The manner in which it does so is similar to the manner of acquiring the condition parameters of the environment at the current time; for the latter, reference may be made to the description of step 320, which will not be repeated here.
- the projection mode corresponding to the current time may be the same as or different from the projection mode corresponding to the next time. Specifically, if the two modes are the same, the projection mode will not be switched at the next time; if they are different, the projection mode needs to be switched at the next time.
- the projection module 230 may determine whether to switch the projection mode corresponding to the current time based on the condition parameters of the environment where the vehicle is located at the next time. Specifically, the projection module 230 can determine the projection mode of the vehicle at the next time in the manner of steps 410 and 420 in FIG. 4. If the projection mode at the next time is the same as the projection mode at the current time, the projection module 230 determines not to switch the projection mode corresponding to the current time; if the two modes are different, the projection module 230 determines to switch the projection mode corresponding to the current time to the projection mode at the next time.
- the projection module 230 determines the projection mode of the vehicle at the next time in the manner of steps 410 and 420 in FIG. 4, which is consistent with the method of determining the projection mode of the vehicle at the current time. For details, please refer to the related descriptions of steps 410 and 420 in FIG. 4, which will not be repeated here.
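The switch decision above can be sketched as follows. Purely for illustration, this assumes the mode is a function of ambient light intensity alone, and that intensities at or below the first threshold select the first (active-source) mode; the source does not state explicitly which side of the threshold maps to which mode:

```python
def projection_mode(intensity: float, first_threshold: float) -> str:
    # Assumption (ours, not the source's): ambient light at or below the first
    # threshold is too weak to reflect, so the active light source (first mode)
    # is used; otherwise the light concentrating component (second mode) is used.
    return "first" if intensity <= first_threshold else "second"

def should_switch(current_intensity: float, next_intensity: float,
                  first_threshold: float) -> bool:
    """Switch only when the mode at the next time differs from the current one."""
    return (projection_mode(current_intensity, first_threshold)
            != projection_mode(next_intensity, first_threshold))
```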
- steps 310 and 320 are not limited to the order in FIG. 3 .
- the processing device 112 may perform step 310 first and then step 320.
- the processing device 112 may perform step 310 and step 320 simultaneously. Such variations are all within the protection scope of the present application.
- Step 410 compare the light intensity in the environment where the vehicle is located with the first intensity threshold.
- step 410 may be implemented by projection module 230 .
- the first intensity threshold is a light intensity value.
- the first intensity threshold may be used to determine whether to select the first mode or the second mode for projection. For more content about the first mode and the second mode, reference may be made to the description of step 420, which will not be repeated here.
- the first intensity threshold may be obtained according to multiple tests, and may also be adjusted according to different situations, which is not limited in this application.
- the first intensity threshold may be determined based on statistical results obtained from multiple experimental investigations.
- different first intensity thresholds may be determined according to the different sensitivities of different groups of people (eg, divided by age, by vision condition, etc.) to light intensity. The sensitivity is the farthest distance at which the human eye can see an object clearly, or the light intensity the human eye needs in order to see an object clearly.
- the comparison result may be obtained by comparing the luminous flux per unit area of the light in the environment where the vehicle is located striking the light intensity sensor with the first intensity threshold.
- the comparison results include: the light intensity in the environment where the vehicle is located is greater than the first intensity threshold, the light intensity in the environment where the vehicle is located is equal to the first intensity threshold, and the light intensity in the environment where the vehicle is located is less than the first intensity threshold.
- Step 420 based on the comparison result, determine the projection mode of the image projection device.
- step 420 may be implemented by projection module 230 .
- the image projection device may include an active light source, a concentrating component, a projection component, and a control component.
- the projection mode may include a first mode and a second mode.
- the first mode is a mode in which the active light source is used for projection
- the second mode is a mode in which the light in the environment acquired by the concentrating component is used for projection.
- the first mode may be a mode in which only active light sources are used for projection.
- the second mode may be a mode in which only light in the environment acquired by the light-concentrating component is used for projection.
- the second mode may be a mode in which the light in the environment is acquired by the light concentrating component, and the active light source is used to supplement part of the light for projection.
- the projection mode of the image projection device is that the image projection device adopts the first mode or the second mode for projection.
- the image projection device includes transparencies.
- the transparencies may be pre-designed combinations of different patterns and colors to display vehicle status; for example, a left turn is represented by a red left-turn arrow, a right turn by a green right-turn arrow, and a vehicle breakdown by a yellow exclamation mark.
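The pattern/color combinations can be modeled as a simple lookup. This is a sketch with hypothetical state and pattern identifiers; only the three pairings above come from the source:

```python
# Pre-designed pattern/colour pairs keyed by vehicle state; identifiers are
# illustrative placeholders, with the three examples taken from the text above.
TRANSPARENCY_PATTERNS = {
    "turn_left": ("left_arrow", "red"),
    "turn_right": ("right_arrow", "green"),
    "vehicle_failure": ("exclamation_mark", "yellow"),
}

def pattern_for(state: str):
    """Return the (pattern, colour) pair for a state, or None if undefined."""
    return TRANSPARENCY_PATTERNS.get(state)
```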
- the transparencies may be one or more of coated glass, acrylic plastic mats, or other materials (eg, polycarbonate materials, fiber-reinforced composite materials) that appear in different colors.
- when only the active light source is used for projection, its power may be adjusted according to the light intensity in the environment. For example, if the vehicle is on a brightly lit road at night and the ambient light intensity is not zero, the power of the active light source can be adjusted to a larger value (for example, 30W) so that the light intensity emitted by the active light source is greater than a certain threshold (for example, a third intensity threshold), making the image of the vehicle state easy to observe.
- if the vehicle is in complete darkness and the ambient light intensity is zero, the power of the active light source can be adjusted to a smaller value (for example, 15W); even though the light intensity emitted by the active light source is lower than that threshold (for example, the third intensity threshold), the image of the vehicle state can still be easily observed.
- for the third intensity threshold, reference may be made to the descriptions of other embodiments of this specification, and details are not repeated here.
- the light-condensing component may be an array comprising one or more convex lenses having the function of concentrating light, or other structures having the same function.
- in the second mode, in which the light in the environment acquired by the light-concentrating component is used for projection, the image projection device can acquire the light in the environment, reflect the acquired light, and pass the reflected light through the transparencies for projection. Through the first mode or the second mode, after the light passes through the projection film, a projected image that is clearly distinguishable from the environment in which the vehicle is located can be produced.
- when the light in the environment obtained by the light concentrating component is used for projection, the manner of projection may also be determined according to the intensity of the light in the environment. Specifically, whether to adjust the intensity of the light received by the light concentrating component can be determined according to the light intensity in the environment where the vehicle is located; the light concentrating component then reflects the received light to form reflected light, and the reflected light is used for projection.
- if the light intensity in the environment where the vehicle is located is greater than the second intensity threshold, the light in the environment is too strong; in this case, the amount of light received from the environment should be reduced to lower the light intensity of the reflected light. If the light intensity in the environment is less than the third intensity threshold and greater than the first intensity threshold, the light in the environment is weak; in this case, the active light source is controlled to turn on supplementary light to increase the light intensity of the reflected light. If the light intensity in the environment is between the third intensity threshold and the second intensity threshold, the light in the environment is moderate, and the light concentrating component is not adjusted.
- to reduce the amount of light received, the angles of one or more of the convex lenses may be adjusted so that they neither receive nor reflect light, so that only part of the convex lenses in the condensing array receive and reflect the light.
- controlling the active light source to turn on the supplementary light may mean adjusting the power of the active light source, based on the light intensity in the environment where the vehicle is located, so that the total light intensity is greater than or equal to the third intensity threshold and less than or equal to the second intensity threshold.
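Putting the threshold logic together (reduction, supplement, or no adjustment in the second mode, under the stated ordering first < third < second threshold), a sketch in which all function and parameter names are hypothetical; the assumption that ambient light at or below the first threshold selects the first (active-source) mode is ours, since the source does not state which side maps to which mode:

```python
def choose_projection(intensity: float, t1: float, t2: float, t3: float):
    """Return a (mode, adjustment) pair for an ambient intensity; t1 < t3 < t2."""
    assert t1 < t3 < t2
    if intensity <= t1:
        # Assumed: too little ambient light to reflect, use the active source.
        return "first_mode", "active_source_only"
    if intensity > t2:
        # Too strong: tilt some convex lenses away to receive less light.
        return "second_mode", "reduce_received_light"
    if intensity < t3:
        # Weak: turn on the active source to supplement the reflected light.
        return "second_mode", "supplement_with_active_source"
    # Between the third and second thresholds: moderate, no adjustment.
    return "second_mode", "no_adjustment"

def supplement_intensity(ambient: float, t3: float) -> float:
    """Minimum supplementary intensity so the total reaches the third threshold."""
    return max(0.0, t3 - ambient)
```

Keeping the supplemented total between the third and second thresholds matches the power-adjustment rule described above.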
- the projection module 230 may determine the projection mode of the image projection device based on the comparison result obtained in step 410 .
- Step 430 Project the image into the environment based on the determined projection method.
- step 430 may be implemented by projection module 230 .
- the projection module 230 may control the image projection device to project the image into the environment. Specifically, based on the determined projection mode, the projection module 230 may control the image projection device to project the image representing the vehicle state on the ground in front, rear, left and/or right of the vehicle or on the outer contour of the vehicle.
- the power value of the active light source may not be limited to the form in step 420, and may also be other power values (eg, the smaller value is 5W, 10W, 20W, and the larger value is 50W, 70W, 100W).
- FIG. 5 is a schematic diagram of an image projection apparatus according to some embodiments of the present application.
- the image projection apparatus 500 may include an active light source 510, a light concentrating assembly 520, a projection assembly 530, and a control assembly (not shown in FIG. 5).
- active light source 510 may be used to generate and emit active light.
- the image projection device may use active light to perform image projection in the first mode.
- the active light source 510 can be used to generate and emit supplementary light to increase the light intensity of the light reflected by the light concentrating component 520;
- the image projection device may use supplementary light to perform image projection in the second mode.
- the light concentrating component 520 may be used to receive and reflect light from the environment in which the vehicle is located to form reflected light.
- the light condensing assembly 520 may include an array of one or more convex lenses with light concentrating function, or other structures with the same function.
- the projection component 530 is used for projecting an image representing the state of the vehicle into the environment where the vehicle is located under the action of active light or reflected light to form a projected image 540 .
- the projection assembly 530 may include a transparency film positioned in the propagation path of the reflected light as well as the active light.
- control assembly may be configured to control the projection assembly 530 to project an image representing the state of the vehicle into the environment based on the light provided by the active light source 510 or the light focusing assembly 520 according to the condition parameters of the environment in which the vehicle is located.
- control component may include at least one processing device (eg, processing device 112) and a storage device. The structure of the processing device may be the same as or similar to the processing device 112 described in FIG. 2 .
- as an example of the projection manner, the control component (eg, the acquisition module 210 of the processing device 112) acquires the vehicle state, and the control component (eg, the detection module 220 of the processing device 112) acquires the condition parameters of the environment in which the vehicle is located.
- the control component (eg, the projection module 230 of the processing device 112) may control the image projection device 500 to project an image representing the state of the vehicle into the environment based on the light intensity in the environment in which the vehicle is located; for example, the control component controls the transparencies in assembly 530 to display the pattern corresponding to the vehicle state. Meanwhile, the control component can determine one of several ways to project: (1) if it determines that the first mode is adopted, it controls the image projection device 500 to turn on the active light source 510 to emit light; (2)-(4) if it determines that the second mode is adopted, it controls part or all of the convex lenses in the light concentrating assembly 520 to reflect the light in the environment, optionally turning on the active light source 510 to supplement part of the light.
- the possible beneficial effects of the embodiments of the present application include, but are not limited to: (1) Projecting the vehicle state by reflecting the light in the environment where the vehicle is located can solve the problem of heat dissipation of the image projection device during the day and save power; (2) Different projection modes are selected according to the light intensity in the environment, which can be applied to different application scenarios and facilitate the observation of the patterns of the vehicle state. It should be noted that different embodiments may have different beneficial effects, and in different embodiments, the possible beneficial effects may be any one or a combination of the above, or any other possible beneficial effects.
Abstract
The embodiments of the present application disclose an image projection method and system. The image projection method includes: acquiring the vehicle state at the current time; detecting a condition parameter of the environment where the vehicle is located at the current time, where the condition parameter includes the light intensity in that environment; and, based on the light intensity in the environment where the vehicle is located, controlling an image projection device to project an image representing the vehicle state into the environment. By projecting the vehicle state through reflection of the light in the environment where the vehicle is located, and by selecting different projection modes according to the light intensity in the environment, the image projection method provided by this application can solve the heat dissipation problem of the image projection device during the day, save electric energy, and be applied to different application scenarios.
Description
Cross-Reference
This application claims priority to Chinese Application No. 202011383007.8, filed on December 1, 2020, the entire contents of which are incorporated herein by reference.
This application relates to the field of information interaction, and in particular, to an image projection method and system.
At present, for the sake of traffic safety, information interaction between pedestrians and vehicles, and between vehicles, is realized by projecting images. For example, an image projection device projects the vehicle state near the vehicle so that surrounding pedestrians or vehicles can learn the vehicle state. However, existing image projection devices generally project mainly using their own built-in light sources and cannot be adjusted for different environments. Therefore, it is necessary to provide an image projection method and system that can adapt to different environments.
Summary of the Invention
One of the embodiments of the present application provides an image projection method. The image projection method includes: acquiring the vehicle state at the current time; detecting a condition parameter of the environment where the vehicle is located at the current time, where the condition parameter includes the light intensity in the environment where the vehicle is located; and, based on the light intensity in the environment where the vehicle is located, controlling an image projection device to project an image representing the vehicle state into the environment.
One of the embodiments of the present application provides an image projection system. The image projection system includes: an acquisition module configured to acquire the vehicle state at the current time; a detection module configured to detect a condition parameter of the environment where the vehicle is located at the current time, where the condition parameter includes the light intensity in the environment where the vehicle is located; and a projection module configured to control, based on the light intensity in the environment where the vehicle is located, an image projection device to project an image representing the vehicle state into the environment.
One of the embodiments of the present application provides an image projection system. The system includes at least one processor and at least one storage device, where the storage device is configured to store instructions, and when the at least one processor executes the instructions, the method described in any embodiment of the present application is implemented.
One of the embodiments of the present application provides a computer-readable storage medium. The storage medium stores computer instructions, and after a computer reads the computer instructions in the storage medium, the computer executes the method described in any embodiment of the present application.
One of the embodiments of the present application provides an image projection device. The device includes: an active light source configured to provide active light; a light concentrating component configured to receive and reflect light in the environment where the vehicle is located to form reflected light; a projection component including a projection film located on the propagation path of the reflected light and the active light; and a control component configured to control, according to the condition parameters of the environment where the vehicle is located, the projection component to project an image representing the vehicle state into the environment based on the light provided by the active light source or the light concentrating component.
The present application will be further described by way of exemplary embodiments, which will be described in detail with reference to the accompanying drawings. These embodiments are not restrictive, and in these embodiments, the same reference numerals denote the same structures, wherein:
FIG. 1 is a schematic diagram of an application scenario of an image projection system according to some embodiments of the present application;
FIG. 2 is a block diagram of an image projection system according to some embodiments of the present application;
FIG. 3 is an exemplary flowchart of an image projection method according to some embodiments of the present application;
FIG. 4 is an exemplary flowchart of an image projection method according to still other embodiments of the present application; and
FIG. 5 is a schematic diagram of an image projection device according to some embodiments of the present application.
In order to more clearly illustrate the technical solutions of the embodiments of the present application, the accompanying drawings used in the description of the embodiments are briefly introduced below. Obviously, the drawings described below are only some examples or embodiments of the present application. Those of ordinary skill in the art can apply the present application to other similar scenarios based on these drawings without creative effort. Unless obvious from the context or otherwise stated, the same reference numerals in the drawings represent the same structures or operations.
It should be understood that "system", "apparatus", "unit" and/or "module" as used herein is a way to distinguish different components, elements, parts, sections or assemblies at different levels. However, these words may be replaced by other expressions if they can achieve the same purpose.
As used in this application and the claims, unless the context clearly indicates otherwise, the words "a", "an", "one" and/or "the" do not specifically refer to the singular and may also include the plural. Generally speaking, the terms "comprise" and "include" only indicate the inclusion of explicitly identified steps and elements, and these steps and elements do not constitute an exclusive list; a method or device may also include other steps or elements.
Flowcharts are used in this application to illustrate the operations performed by the system according to the embodiments of the present application. It should be understood that the preceding or following operations are not necessarily performed exactly in order. Instead, the steps may be processed in reverse order or simultaneously. Meanwhile, other operations may be added to these processes, or one or more steps may be removed from them.
The embodiments of the present application may be applied to different transportation systems, including but not limited to one or a combination of land, sea, aviation, aerospace, and the like. For example, transportation systems that apply management and/or distribution, such as taxis, chauffeured cars, carpooling, buses, designated driving, trains, bullet trains, high-speed rail, ships, aircraft, hot-air balloons, driverless vehicles, and express delivery. The application scenarios of different embodiments of the present application include but are not limited to one or a combination of web pages, browser plug-ins, clients, customized systems, enterprise internal analysis systems, artificial intelligence robots, and the like. It should be understood that the application scenarios of the system and method of the present application are only some examples or embodiments; those of ordinary skill in the art can apply the present application to other similar scenarios without creative effort, for example, other similar systems for guiding users to park.
The "passenger", "passenger terminal", "user terminal", "customer", "demander", "service demander", "consumer", "consuming party", "user of the service", etc. described in this application are interchangeable and refer to the party that needs or orders a service, which may be an individual or a tool. Similarly, the "driver", "driver terminal", "provider", "supplier", "service provider", "server", "service party", etc. described in this application are also interchangeable and refer to an individual, tool or other entity that provides a service or assists in providing a service. In addition, the "user" described in this application may be a party that needs or orders a service, or a party that provides or assists in providing a service.
FIG. 1 is a schematic diagram of an application scenario of an image projection system according to some embodiments of the present application.
The image projection system 100 may be used to control an image projection device to project an image including vehicle state information (referred to as vehicle state) into the environment. In some embodiments, the image projection system 100 may acquire the vehicle state at the current time, detect condition parameters of the environment where the vehicle is located at the current time (e.g., light intensity, visibility, etc.), and, based on the condition parameters (e.g., light intensity) of the environment where the vehicle is located, control the image projection device to project an image (e.g., a pattern) including vehicle state information into the environment. As used in this application, a vehicle refers to a means of transportation, which may include a taxi, chauffeured car, carpool vehicle, bus, train, bullet train, high-speed rail, ship, aircraft, driverless vehicle, and the like.
In some embodiments, the image projection system 100 may be an online service platform for providing Internet services. For example, the image projection system 100 may be applied to an online ride-hailing platform that provides transportation services. The ride-hailing platform may provide transportation services such as taxi hailing, express hailing, chauffeured-car hailing, minibus hailing, carpooling, bus services, driver hiring, pick-up services, and designated driving. As another example, the image projection system 100 may also be applied to service platforms such as express delivery, food delivery, and travel (e.g., tourism). As another example, the image projection system 100 may be applied to a navigation service platform. In some embodiments, the image projection system 100 may be applied to a driverless system. For convenience of description, the application of the image projection system 100 is described below taking a navigation service platform as an example. This is not intended to be limiting; the image projection system 100 may be applied to any service platform.
As shown in FIG. 1, the image projection system 100 may include a server 110, a first user terminal 120, a storage device 130, a second user terminal 140, a network 150, and an information source 160.
In some embodiments, the server 110 may be used to process information and/or data related to the image projection system 100, for example, to process vehicle state projection. In some embodiments, the server 110 may be a single server or a server group. The server group may be centralized or distributed (e.g., the server 110 may be a distributed system). In some embodiments, the server 110 may be local or remote. For example, the server 110 may access information and/or data stored in the first user terminal 120, the second user terminal 140, and/or the storage device 130 via the network 150. As another example, the server 110 may be directly connected to the first user terminal 120, the second user terminal 140, and/or the storage device 130 to access stored information and/or data. In some embodiments, the server 110 may be implemented on a cloud platform or an on-board computer. Merely by way of example, the cloud platform may include a private cloud, a public cloud, a hybrid cloud, a community cloud, a distributed cloud, an internal cloud, a multi-tier cloud, or the like, or any combination thereof.
In some embodiments, the server 110 may include a processing device 112. The processing device 112 may process information and/or data related to the image projection system 100 to perform one or more functions described in this application. For example, the processing device 112 may obtain a vehicle-state projection request sent by the first user terminal 120 and/or the second user terminal 140, acquire the vehicle state at the current time, detect condition parameters of the environment where the vehicle is located at the current time (e.g., light intensity, visibility), and then, based on the condition parameters of the environment where the vehicle is located, control the image projection device to project an image representing the vehicle state into the environment. In some embodiments, the processing device 112 may further acquire the vehicle state of the vehicle at the next time, acquire the condition parameters of the environment where the vehicle is located at the next time, and then determine, based on those condition parameters, whether to switch the projection mode corresponding to the current time. In some embodiments, the processing device 112 may also adjust the power of the active light source to adjust the light intensity based on the light intensity in the environment where the vehicle is located. In some embodiments, the processing device 112 may include one or more processing engines (e.g., a single-chip processing engine or a multi-chip processing engine). Merely by way of example, the processing device 112 may include a central processing unit (CPU), an application-specific integrated circuit (ASIC), an application-specific instruction-set processor (ASIP), a graphics processing unit (GPU), a physics processing unit (PPU), a digital signal processor (DSP), a field-programmable gate array (FPGA), a programmable logic device (PLD), a controller, a microcontroller unit, a reduced instruction set computer (RISC), a microprocessor, or the like, or any combination thereof.
The storage device 130 may be used to store data and/or instructions related to image projection requests. In some embodiments, the storage device 130 may store data obtained/acquired from the first user terminal 120 and/or the second user terminal 140. In some embodiments, the storage device 130 may store data and/or instructions used or executed by the server 110 to complete the exemplary methods described in this application. In some embodiments, the storage device 130 may include mass storage, removable storage, volatile read-write memory, read-only memory (ROM), or the like, or any combination thereof. Exemplary mass storage may include magnetic disks, optical disks, solid-state disks, etc. Exemplary removable storage may include flash drives, floppy disks, optical disks, memory cards, compact disks, magnetic tapes, etc. Exemplary volatile read-write memory may include random access memory (RAM). Exemplary RAM may include dynamic random access memory (DRAM), double data rate synchronous dynamic random access memory (DDR SDRAM), static random access memory (SRAM), thyristor random access memory (T-RAM), zero-capacitor random access memory (Z-RAM), etc. Exemplary read-only memory may include mask read-only memory (MROM), programmable read-only memory (PROM), erasable programmable read-only memory (EPROM), electrically erasable programmable read-only memory (EEPROM), compact disc read-only memory (CD-ROM), digital versatile disc read-only memory, etc. In some embodiments, the storage device 130 may be implemented on a cloud platform. Merely by way of example, the cloud platform may include a private cloud, a public cloud, a hybrid cloud, a community cloud, a distributed cloud, an internal cloud, a multi-tier cloud, or the like, or any combination thereof. In some embodiments, the storage device 130 may be connected to the network 150 to communicate with one or more components of the image projection system 100 (e.g., the server 110, the first user terminal 120, the second user terminal 140). One or more components of the image projection system 100 may access data or instructions stored in the storage device 130 via the network 150. In some embodiments, the storage device 130 may be directly connected to or communicate with one or more components of the image projection system 100 (e.g., the server 110, the first user terminal 120, the second user terminal 140). In some embodiments, the storage device 130 may be part of the server 110. In some embodiments, the storage device 130 may be a separate memory.
In some embodiments, the first user terminal 120 may be an individual, tool, or other entity directly related to the image projection request. The user may be the image projection requester. In this application, "user" and "user terminal" may be used interchangeably. In some embodiments, the first user terminal 120 may include a desktop computer 120-1, a laptop computer 120-2, an in-vehicle device 120-3 in a motor vehicle, a mobile device 120-4, or the like, or any combination thereof. In some embodiments, the mobile device 120-4 may include a smart home device, a wearable device, a smart mobile device, a virtual reality device, an augmented reality device, or the like, or any combination thereof. In some embodiments, the smart home device may include a smart lighting device, a smart appliance control device, a smart monitoring device, a smart TV, a smart camera, an intercom, or the like, or any combination thereof. In some embodiments, the wearable device may include a smart bracelet, smart footwear, smart glasses, a smart helmet, a smart watch, smart clothing, a smart backpack, a smart accessory, or the like, or any combination thereof. In some embodiments, the smart mobile device may include a smartphone, a personal digital assistant (PDA), a gaming device, a navigation device, a point of sale (POS), or the like, or any combination thereof. In some embodiments, the virtual reality device and/or augmented reality device may include a virtual reality helmet, virtual reality glasses, a virtual reality eyeshade, an augmented virtual reality helmet, augmented reality glasses, an augmented reality eyeshade, or the like, or any combination thereof. For example, the virtual reality device and/or augmented reality device may include Google Glass, Oculus Rift, HoloLens, Gear VR, etc. In some embodiments, the in-vehicle device 120-3 in a motor vehicle may include an on-board computer, an on-board TV, and the like.
In some embodiments, the second user terminal 140 may be a device similar to or the same as the first user terminal 120. In some embodiments, the second user terminal 140 may include a desktop computer 140-1, a laptop computer 140-2, an in-vehicle device 140-3 in a motor vehicle, a mobile device 140-4, or the like, or any combination thereof.
In some embodiments, the first user terminal 120 and/or the second user terminal 140 may be devices with positioning technology. In some embodiments, the first user terminal 120 and/or the second user terminal 140 may communicate with another positioning device to determine the position of the first user terminal 120 and/or the second user terminal 140. In some embodiments, the first user terminal 120 and/or the second user terminal 140 may send positioning information to the server 110. In some embodiments, the first user terminal 120 and/or the second user terminal 140 may include an image projection device. In some embodiments, the first user terminal 120 and/or the second user terminal 140 may be communicatively connected to an image projection device so as to project an image representing the vehicle state into the environment through the image projection device.
In some embodiments, the first user terminal 120 and/or the second user terminal 140 may correspond to a means of transportation, also referred to as a vehicle. It is contemplated that the vehicle may be an electric vehicle, a fuel cell vehicle, a hybrid vehicle, or a conventional internal combustion engine vehicle, for example, a coupe, a sedan, a pickup truck, a station wagon, a sports utility vehicle (SUV), a minivan, or a converted vehicle.
The vehicle may be equipped with various sensors mounted on the body. The sensors may be configured to capture data as the vehicle travels along a trajectory. For example, the sensors may be a combination of a LiDAR scanner configured to scan the surroundings and acquire point clouds and/or a 3-D camera configured to acquire digital images. As another example, the sensors may be sensors used in a navigation unit, such as a GPS receiver and one or more IMU sensors. GPS is a global navigation satellite system that provides geographic positioning and time information to a GPS receiver. An IMU is an electronic device that uses various inertial sensors, such as accelerometers and gyroscopes, and sometimes magnetometers, to measure and provide a vehicle's specific force, angular rate, and sometimes the magnetic field around the vehicle. The GPS receiver and IMU sensors provide real-time pose information of the vehicle as it travels, including the position and orientation of the vehicle at each timestamp.
The network 150 may facilitate the exchange of information and/or data. In some embodiments, one or more components of the image projection system 100 (e.g., the server 110, the first user terminal 120, the storage device 130, the second user terminal 140) may send information and/or data to other components of the image projection system 100 via the network 150. For example, the server 110 may obtain/acquire a projection request from the first user terminal 120 via the network 150. In some embodiments, the network 150 may be a wired network, a wireless network, or the like, or any combination thereof. Merely by way of example, the network 150 may include a cable network, a wired network, an optical fiber network, a telecommunications network, an intranet, the Internet, a local area network (LAN), a wide area network (WAN), a wireless local area network (WLAN), a metropolitan area network (MAN), a public switched telephone network (PSTN), a Bluetooth network, a ZigBee network, a near field communication (NFC) network, a global system for mobile communications (GSM) network, a code division multiple access (CDMA) network, a time division multiple access (TDMA) network, a general packet radio service (GPRS) network, an enhanced data rates for GSM evolution (EDGE) network, a wideband code division multiple access (WCDMA) network, a high speed downlink packet access (HSDPA) network, a long term evolution (LTE) network, a user datagram protocol (UDP) network, a transmission control protocol/Internet protocol (TCP/IP) network, a short message service (SMS) network, a wireless application protocol (WAP) network, an ultra-wideband (UWB) network, infrared, or the like, or any combination thereof. In some embodiments, the image projection system 100 may include one or more network access points, for example, base stations and/or wireless access points 150-1, 150-2, ..., through which one or more components of the image projection system 100 may connect to the network 150 to exchange data and/or information.
The information source 160 is a source that provides other information for the image projection system 100. The information source 160 may be used to provide the system with information related to image projection, for example, projection time, visibility, weather information, information on laws and regulations, news, and lifestyle information and guides. The information source 160 may exist in the form of a single central server, in the form of multiple servers connected via a network, or in the form of a large number of personal devices. When the information source 160 exists in the form of a large number of personal devices, these devices may upload text, voice, images, videos, etc. to a cloud server in a user-generated-content manner, so that the cloud server, together with the numerous personal devices connected to it, constitutes the information source 160.
It should be noted that the image projection system 100 is provided for illustrative purposes only and is not intended to limit the scope of the present application. For those of ordinary skill in the art, various modifications or changes may be made based on the description of the present application. For example, the image projection system 100 may also include a database. As another example, the image projection system 100 may implement similar or different functions on other devices. However, these changes and modifications do not depart from the scope of the present application.
FIG. 2 is a block diagram of a processing device according to some embodiments of the present application.
As shown in FIG. 2, the processing device 112 may include an acquisition module 210, a detection module 220, and a projection module 230.
The acquisition module 210 may acquire the vehicle state at the current time, where the vehicle state includes one or more of the following types: going straight, turning left, turning right, making a U-turn, changing lanes to the left, changing lanes to the right, stopping to yield, decelerating to yield, yielding to oncoming traffic, parking, and vehicle failure. For a detailed description of acquiring the vehicle state at the current time, reference may be made to FIG. 3, which will not be repeated here.
The detection module 220 may acquire the condition parameters of the environment where the vehicle is located at the current time, where the condition parameters include the light intensity and/or visibility in the environment where the vehicle is located. The detection module 220 may acquire the condition parameters associated with the environment where the vehicle is located from an environment perception system related to the vehicle. For a detailed description of detecting the condition parameters of the environment at the current time, reference may be made to FIG. 3, which will not be repeated here.
The projection module 230 may control, based on the light intensity in the environment where the vehicle is located, the image projection device to project an image representing the vehicle state into the environment. The image projection device includes an active light source, a light concentrating component, a projection component, a control component, and a light intensity sensor. For a detailed description of controlling the image projection device to project an image representing the vehicle state into the environment, reference may be made to FIG. 3, which will not be repeated here.
It should be understood that the system and its modules shown in FIG. 2 may be implemented in various ways. For example, in some embodiments, the system and its modules may be implemented by hardware, software, or a combination of software and hardware. The hardware portion may be implemented using dedicated logic; the software portion may be stored in a memory and executed by an appropriate instruction execution system, such as a microprocessor or specially designed hardware. Those skilled in the art will appreciate that the methods and systems described above may be implemented using computer-executable instructions and/or embodied in processor control code, for example provided on a carrier medium such as a magnetic disk, CD or DVD-ROM, on a programmable memory such as read-only memory (firmware), or on a data carrier such as an optical or electronic signal carrier. The system and its modules of the present application may be implemented not only by hardware circuits such as very large scale integrated circuits or gate arrays, semiconductors such as logic chips and transistors, or programmable hardware devices such as field-programmable gate arrays and programmable logic devices, but also by software executed by various types of processors, or by a combination of the above hardware circuits and software (e.g., firmware).
It should be noted that the above description of the image projection system and its modules is for convenience of description only and does not limit the present application to the scope of the illustrated embodiments. It will be appreciated that, after understanding the principle of the system, those skilled in the art may arbitrarily combine the modules, or form subsystems connected to other modules, without departing from this principle. For example, in some embodiments, the acquisition module 210, the detection module 220, and the projection module 230 may be different modules in one system, or one module may implement the functions of two or more of the above modules. As another example, the modules may share one storage module, or each module may have its own storage module. All such variations are within the protection scope of the present application.
FIG. 3 is an exemplary flowchart of an image projection method according to some embodiments of the present application. The process 300 may be executed by the image projection system 100. For example, the process 300 may be executed by the processing device 112 described in FIG. 2.
Step 310: acquire the vehicle state at the current time. In some embodiments, step 310 may be implemented by the acquisition module 210.
In some embodiments, the current time may be a moment at which the vehicle state needs to be projected, and each moment has a corresponding vehicle state.
In some embodiments, the vehicle state may be a driving state or a stopped state of the vehicle during operation. In some embodiments, the vehicle state may include one or more of the following types: going straight, turning left, turning right, making a U-turn, changing lanes to the left, changing lanes to the right, stopping to yield, decelerating to yield, yielding to oncoming traffic, parking, and vehicle failure.
In some embodiments, the acquisition module 210 may acquire the vehicle state information by communicating with the first user terminal, the second user terminal, and/or the storage device 130. For example, the acquisition module 210 may acquire the vehicle state at the current time from the first user terminal or the second user terminal directly or via the network 150. As another example, the acquisition module 210 may acquire, directly or via the network, the vehicle state at the current time corresponding to the planned navigation information stored in the storage device 130. In some embodiments, the acquisition module 210 may also call a data interface to acquire the vehicle state at the current time from the navigation device or navigation module of the vehicle.
For example, the vehicle state may be derived from data acquired by various sensors and/or detectors mounted on the vehicle. The sensors may include sensors used in a navigation unit, such as a GPS receiver and/or one or more IMU sensors. GPS is a global navigation satellite system that provides geographic positioning and time information to a GPS receiver. An IMU is an electronic device that uses various inertial sensors, such as accelerometers and gyroscopes, and sometimes magnetometers, to measure and provide a vehicle's specific force, angular rate, and sometimes the magnetic field around the vehicle. With the GPS receiver and IMU sensors, the sensors can provide real-time pose information of the vehicle as it travels, including the position and orientation of the vehicle at each timestamp.
Step 320: detect the condition parameters of the environment where the vehicle is located at the current time, where the condition parameters include the light intensity in the environment where the vehicle is located. In some embodiments, step 320 may be implemented by the detection module 220.
In some embodiments, the environment where the vehicle is located may be the physical environment where the vehicle is located at the current time, for example, a highway, a residential area, a parking lot, a road, etc. In some embodiments, the condition parameters may include lighting parameters of the physical environment where the vehicle is located at the current time; for example, the condition parameters may include light intensity, visibility, light color, etc., or a combination thereof. In some embodiments, the light may include natural light and/or artificial light in the environment where the vehicle is located. For example, during the day, the light may be the natural light in the environment; at night, the light may be the artificial light in the environment; at dusk, the light may be the total of the natural light and artificial light in the environment. In some embodiments, the light intensity is the luminous flux received per unit area with the natural light and/or artificial light as the light source. Visibility refers to the maximum distance at which a person with normal vision can identify a target from the environment where the vehicle is located. In some embodiments, light intensity and visibility have a certain correlation and can be converted into each other. For example, the light intensity at the eyes of a person with normal vision and the maximum distance at which the person can identify a target from the environment under that light intensity can be detected at the same time; through multiple measurements, a series of correspondences between light intensity and visibility can be obtained and used for mutual conversion.
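The intensity-to-visibility conversion described above (a table of measured correspondences used for mutual conversion) can be sketched as a piecewise-linear interpolation over the measurement pairs. The pairs below are invented placeholders, not values from the source:

```python
import bisect

# Hypothetical measured pairs (light intensity, visibility), sorted by
# intensity, as obtained from repeated simultaneous measurements.
PAIRS = [(1.0, 30.0), (10.0, 120.0), (100.0, 400.0), (1000.0, 2000.0)]

def visibility_from_intensity(intensity: float) -> float:
    """Linearly interpolate visibility from the correspondence table."""
    xs = [p[0] for p in PAIRS]
    if intensity <= xs[0]:
        return PAIRS[0][1]          # clamp below the measured range
    if intensity >= xs[-1]:
        return PAIRS[-1][1]         # clamp above the measured range
    i = bisect.bisect_right(xs, intensity)
    (x0, y0), (x1, y1) = PAIRS[i - 1], PAIRS[i]
    return y0 + (y1 - y0) * (intensity - x0) / (x1 - x0)
```

The inverse conversion (intensity from visibility) would interpolate the same table with the columns swapped.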
In some embodiments, the light intensity of the environment where the vehicle is located at the current time may be detected by a light intensity sensor, and the visibility of the environment at the current time may be detected by a visibility detector. The light intensity sensor and/or the visibility detector may be mounted on the vehicle or on the user terminal.
In some embodiments, the detection module 220 may send a detection instruction to the light intensity sensor and/or the visibility detector to make it detect the light intensity or visibility of the environment where the vehicle is located at the current time. The light intensity sensor or visibility detector may send the detection result of the light intensity or visibility directly to the detection module 220, or may store the detection result in the storage device 130, from which the detection module 220 acquires it.
Step 330: based on the condition parameters of the environment where the vehicle is located, control the image projection device to project an image representing the vehicle state into the environment. In some embodiments, step 330 may be implemented by the projection module 230.
In some embodiments, the image representing the vehicle state may be an image that represents the current state of the vehicle through a pattern or text, for observation by people related to the vehicle (e.g., pedestrians near the vehicle, other drivers). In some embodiments, the image representing the vehicle state may correspond to the vehicle state. Specifically, the vehicle state includes one or more of going straight, turning left, turning right, making a U-turn, changing lanes to the left, changing lanes to the right, stopping to yield, decelerating to yield, yielding to oncoming traffic, parking, and vehicle failure. Correspondingly, the image representing the vehicle state may include images such as a left-turn arrow, a right-turn arrow, a U-turn arrow, a left lane-change arrow, a right lane-change arrow, a stop-and-yield sign, a decelerate-and-yield sign, a yield-to-oncoming-traffic sign, a parking sign, and a vehicle failure sign. Different vehicle states may correspond to different images. The projection module 230 may determine the corresponding image based on the vehicle state and control the image projection device to project the image corresponding to the vehicle state into the environment.
In some embodiments, the image projection device may be used to project the image representing the vehicle state into the environment where the vehicle is located. The image projection device may include an active light source, a light concentrating component, a projection component, and a control component. For more details about the image projection device, reference may be made to FIG. 5 and its related description, which will not be repeated here.
In some embodiments, the projection module 230 may control the image projection device to project the image representing the vehicle state onto the ground in one or more directions of the front, rear, left, and right of the vehicle, or onto the outer contour of the vehicle in one or more of those directions. The ground in one or more directions of the front, rear, left, and right of the vehicle, or the outer contour of the vehicle, may be a position clearly visible to pedestrians or drivers of other vehicles.
In some embodiments, the projection module 230 may control, based on the light intensity in the environment where the vehicle is located, the image projection device to project the image representing the vehicle state into the environment. Specifically, the projection module 230 may compare the light intensity in the environment with a first intensity threshold, determine the projection mode of the image projection device based on the comparison result, and project the image into the environment based on the determined projection mode. In some embodiments, the projection mode may include a first mode in which the active light source is used for projection, or a second mode in which the light in the environment acquired by the light concentrating component is used for projection. For more details about controlling the image projection device to project the image representing the vehicle state into the environment, reference may be made to FIG. 4 and its related description, which will not be repeated here.
In some embodiments, the projection module 230 may control the image projection device to project the image representing the vehicle state into the environment based on the visibility of the environment where the vehicle is located. Specifically, the projection module 230 may compare the visibility of the environment with one or more visibility thresholds, and apply a color coating to the image representing the vehicle state based on the comparison result. For example, if the visibility of the environment is lower than a first visibility threshold, a red coating is applied to the image; if the visibility is greater than the first visibility threshold and less than a second visibility threshold, a yellow coating is applied; if the visibility is greater than the second visibility threshold and less than a third visibility threshold, a green coating is applied; if the visibility is greater than the third visibility threshold, no coating is applied. In some embodiments, the projection module 230 may determine, based on the vehicle state, whether to project the vehicle state. For example, the projection module 230 may determine whether the vehicle state at the current time satisfies the projection condition. If satisfied, it controls the image projection device to project the image representing the vehicle state; if not, it continues to acquire the next vehicle state. The projection condition may include reference types of vehicle states that need to be projected. The vehicle state at the current time satisfies the projection condition when its type matches or is the same as one of the reference types; the vehicle state does not satisfy the projection condition when its type does not match or is not the same as any of the reference types. The projection condition may be set by the user or by system default. For example, it may be set according to the type of road on which the vehicle is traveling. Further, if the road type is a highway, the reference types of vehicle states that need to be projected may be set to include changing lanes to the left, changing lanes to the right, decelerating to yield, yielding to oncoming traffic, parking, and vehicle failure; if the road type is an urban road, the reference types may be set to include one or more of going straight, turning left, turning right, making a U-turn, changing lanes to the left, changing lanes to the right, stopping to yield, decelerating to yield, yielding to oncoming traffic, parking, and vehicle failure.
In some embodiments, the next time is a moment at which the next vehicle state of the vehicle needs to be projected. The next time may be the next moment with a short interval from the current time, or another moment after the current time. In some embodiments, the acquisition module 210 may acquire the vehicle state of the vehicle at the next time in a manner similar to that of acquiring the vehicle state at the current time; for the latter, reference may be made to the description of step 310, which will not be repeated here.
In some embodiments, the vehicle state at the next time may include one or more of going straight, turning left, turning right, making a U-turn, changing lanes to the left, changing lanes to the right, stopping to yield, decelerating to yield, yielding to oncoming traffic, parking, and vehicle failure. The vehicle state at the next time may be the same as or different from the vehicle state at the current time.
In some embodiments, the environment where the vehicle is located at the next time may be the physical environment where the vehicle is located at the next time, for example, a highway, a residential area, a parking lot, etc. The environment where the vehicle is located at the next time may be the same as or different from the environment where the vehicle is located at the current time.
In some embodiments, the condition parameters of the environment at the next time may include light intensity, visibility, etc., or a combination thereof. The condition parameters of the environment at the next time may be the same as or different from those at the current time. In some embodiments, the detection module 220 may acquire the condition parameters of the environment where the vehicle is located at the next time in a manner similar to that of acquiring the condition parameters of the environment at the current time; for the latter, reference may be made to the description of step 320, which will not be repeated here.
For more details about the projection mode, reference may be made to the description of step 420, which will not be repeated here. In some embodiments, the projection mode corresponding to the current time may be the same as or different from the projection mode corresponding to the next time. Specifically, if the two modes are the same, the projection mode is not switched at the next time; if they are different, the projection mode needs to be switched at the next time.
In some embodiments, the projection module 230 may determine, based on the condition parameters of the environment where the vehicle is located at the next time, whether to switch the projection mode corresponding to the current time. Specifically, the projection module 230 may determine the projection mode of the vehicle at the next time in the manner of steps 410 and 420 in FIG. 4. If the projection mode at the next time is the same as that at the current time, the projection module 230 determines not to switch the projection mode corresponding to the current time; if they are different, the projection module 230 determines to switch the projection mode corresponding to the current time to the projection mode at the next time. It should be noted that determining the projection mode of the vehicle at the next time in the manner of steps 410 and 420 in FIG. 4 is consistent with the method of determining the projection mode at the current time; for details, reference may be made to the related descriptions of steps 410 and 420 in FIG. 4, which will not be repeated here.
It should be noted that the above description of the processes is for example and illustration only and does not limit the scope of application of this specification. For those skilled in the art, various modifications and changes may be made to the processes under the guidance of this specification. However, these modifications and changes are still within the scope of this specification. In some embodiments, the execution order of steps 310 and 320 is not limited to the order in FIG. 3. For example, the processing device 112 may perform step 310 first and then step 320. As another example, the processing device 112 may perform steps 310 and 320 simultaneously. All such variations are within the protection scope of the present application.
FIG. 4 is an exemplary flowchart of an image projection method according to still other embodiments of the present application. The process 400 may be executed by the image projection system 100. For example, the process 400 may be executed by the processing device 112 described in FIG. 2.
Step 410: compare the light intensity in the environment where the vehicle is located with the first intensity threshold. In some embodiments, step 410 may be implemented by the projection module 230.
The first intensity threshold is a light intensity value. In some embodiments, the first intensity threshold may be used to determine whether to select the first mode or the second mode for projection. For more details about the first mode and the second mode, reference may be made to the description of step 420, which will not be repeated here.
In some embodiments, the first intensity threshold may be obtained through multiple tests and may also be adjusted according to different situations, which is not limited in this application. For example, the first intensity threshold may be determined based on statistical results obtained from multiple experimental investigations. As another example, different first intensity thresholds may be determined according to the different sensitivities of different groups of people (e.g., divided by age or by vision condition) to light intensity. The sensitivity is the farthest distance at which the human eye can see an object clearly, or the light intensity the human eye needs in order to see an object clearly.
For the light intensity in the environment where the vehicle is located, reference may be made to the description of FIG. 3, which will not be repeated here. In some embodiments, the comparison result may be obtained by comparing the luminous flux per unit area of the light in the environment striking the light intensity sensor with the first intensity threshold. The comparison results include: the light intensity in the environment where the vehicle is located is greater than the first intensity threshold, equal to the first intensity threshold, or less than the first intensity threshold.
Step 420: based on the comparison result, determine the projection mode of the image projection device. In some embodiments, step 420 may be implemented by the projection module 230.
The image projection device may include an active light source, a light concentrating component, a projection component, and a control component.
In some embodiments, the projection mode may include a first mode and a second mode. The first mode is a mode in which the active light source is used for projection, and the second mode is a mode in which the light in the environment acquired by the light concentrating component is used for projection. In some embodiments, the first mode may be a mode in which only the active light source is used for projection. In some embodiments, the second mode may be a mode in which only the light in the environment acquired by the light concentrating component is used for projection. In some embodiments, the second mode may be a mode in which the light in the environment is acquired by the light concentrating component and the active light source supplements part of the light for projection. The projection mode of the image projection device is that the image projection device adopts the first mode or the second mode for projection.
In some embodiments, the image projection device includes a projection film. In some embodiments, the projection film may be a pre-designed combination of different patterns and colors for displaying the vehicle state; for example, a left turn is represented by a red left-turn arrow, a right turn by a green right-turn arrow, and a vehicle failure by a yellow exclamation mark. In some embodiments, the projection film may be one or more of coated glass, acrylic plastic sheets, or other materials (e.g., polycarbonate materials, fiber-reinforced composite materials) presenting different colors.
In some embodiments, the active light source may refer to a light source provided by the image projection device itself. In some embodiments, in the first mode in which the active light source is used for projection, the image projection device may be controlled to turn on the active light source, generate active light, and pass the active light through the projection film for projection. In some embodiments, when only the active light source is used for projection, the power of the active light source may be adjusted according to the light intensity in the environment. For example, if the vehicle is on a brightly lit road at night and the light intensity in the environment is not zero, the power of the active light source may be adjusted to a larger value (e.g., 30 W) so that the light intensity emitted by the active light source is greater than a certain threshold (e.g., the third intensity threshold), making the image of the vehicle state easy to observe. As another example, if the vehicle is in complete darkness at night and the light intensity in the environment is zero, the power of the active light source may be adjusted to a smaller value (e.g., 15 W); even though the light intensity emitted by the active light source is lower than a certain threshold (e.g., the third intensity threshold), the image of the vehicle state can still be easily observed. For the third intensity threshold, reference may be made to the descriptions of other embodiments of this specification, which will not be repeated here.
In some embodiments, the light concentrating component may be an array composed of one or more convex lenses with a light concentrating function, or another structure with the same function. In some embodiments, in the second mode in which the light in the environment acquired by the light concentrating component is used for projection, the image projection device may acquire the light in the environment, reflect the acquired light, and pass the reflected light through the projection film for projection. Through the first mode or the second mode, after the light passes through the projection film, a projected image clearly distinguishable from the environment where the vehicle is located can be produced.
In some embodiments, when the light in the environment acquired by the light concentrating component is used for projection, the manner of projection may also be determined according to the light intensity in the environment. Specifically, whether to adjust the intensity of the light received by the light concentrating component may be determined according to the light intensity in the environment where the vehicle is located; the light concentrating component reflects the received light to form reflected light, and the reflected light is used for projection.
In some embodiments, whether to adjust the intensity of the light received by the light concentrating component may be determined according to the relationship between the light intensity in the environment where the vehicle is located and the first, second, or third intensity threshold, where the second intensity threshold is greater than the third intensity threshold, and the third intensity threshold is greater than the first intensity threshold. Specifically, if the light intensity in the environment is greater than the second intensity threshold, the light in the environment is too strong; in this case, the amount of light received from the environment should be reduced to lower the intensity of the reflected light. If the light intensity in the environment is less than the third intensity threshold and greater than the first intensity threshold, the light in the environment is weak; in this case, the active light source is controlled to turn on supplementary light to increase the intensity of the reflected light. If the light intensity in the environment is between the third intensity threshold and the second intensity threshold, the light in the environment is moderate, and the intensity of the light received by the light concentrating component is not adjusted. The first, second, and third intensity thresholds are all light intensity values. In some embodiments, the second and third intensity thresholds may be obtained through multiple tests and may also be adjusted according to different situations, which is not limited in this application. For example, the second and third intensity thresholds may be determined based on statistical results obtained from multiple experimental investigations. As another example, different second and third intensity thresholds may be determined according to the different sensitivities of different groups of people (e.g., divided by age or by vision condition) to light intensity. In some embodiments, if the light intensity in the environment is greater than the second intensity threshold, in order to reduce the amount of light received from the environment, the angles of one or more of the convex lenses may be adjusted so that they neither receive nor reflect light, so that only part of the convex lenses in the concentrating array receive and reflect the light. In some embodiments, controlling the active light source to turn on supplementary light may mean adjusting the power of the active light source according to the light intensity in the environment so that the total light intensity is greater than or equal to the third intensity threshold and less than or equal to the second intensity threshold.
In some embodiments, the projection module 230 may determine the projection mode of the image projection device based on the comparison result obtained in step 410.
Step 430: based on the determined projection mode, project the image into the environment. In some embodiments, step 430 may be implemented by the projection module 230.
In some embodiments, based on the projection mode determined in step 420, the projection module 230 may control the image projection device to project the image into the environment. Specifically, based on the determined projection mode, the projection module 230 may control the image projection device to project the image representing the vehicle status onto the ground in front of, behind, to the left of, and/or to the right of the vehicle, or onto the outer contour of the vehicle.
It should be noted that the above descriptions of the processes are merely examples and illustrations and do not limit the scope of this specification. Those skilled in the art may make various modifications and changes to the processes under the guidance of this specification; such modifications and changes remain within its scope. For example, the power of the active light source is not limited to the values given in step 420 and may take other values (e.g., smaller values of 5 W, 10 W, or 20 W, or larger values of 50 W, 70 W, or 100 W).
FIG. 5 is a schematic diagram of an image projection device according to some embodiments of this application.
As shown in FIG. 5, the image projection device 500 may include an active light source 510, a light-concentrating assembly 520, a projection assembly 530, and a control assembly (not shown in FIG. 5).
In some embodiments, the active light source 510 may be used to generate and emit active light, which the image projection device uses for projection in the first mode. In some embodiments, when the ambient light intensity is less than the third intensity threshold and greater than the first intensity threshold, the active light source 510 may generate and emit supplemental light to increase the intensity of the light reflected by the light-concentrating assembly 520; the image projection device uses this supplemental light for projection in the second mode. For more details on projecting with supplemental light or active light, reference may be made to the related content of FIG. 4, which is not repeated here.
In some embodiments, the light-concentrating assembly 520 may be used to receive light from the environment where the vehicle is located and reflect it to form reflected light. In some embodiments, the light-concentrating assembly 520 may include an array of one or more convex lenses with a light-concentrating function, or another structure with the same function.
The projection assembly 530 is used to project, under the action of the active light or the reflected light, an image representing the vehicle status into the environment where the vehicle is located, forming a projected image 540. In some embodiments, the projection assembly 530 may include a projection film located on the propagation path of the reflected light and the active light.
In some embodiments, the control assembly may be used to control, according to the condition parameters of the environment where the vehicle is located, the projection assembly 530 to project the image representing the vehicle status into the environment based on the light provided by the active light source 510 or the light-concentrating assembly 520. In some embodiments, the control assembly may include at least one processing device (e.g., the processing device 112) and a storage device. The structure of the processing device may be the same as or similar to that of the processing device 112 described in FIG. 2.
In some embodiments, the image projection device may further include a light intensity sensor (not shown in FIG. 5) for detecting the light intensity in the environment.
The projection modes of the image projection device are described below through an embodiment. The control assembly acquires the vehicle status (e.g., via the acquisition module 210 of the processing device 112) and the condition parameters of the environment where the vehicle is located (e.g., via the detection module 220 of the processing device 112). Based on the ambient light intensity, the control assembly (e.g., the projection module 230 of the processing device 112) may control the image projection device 500 to project an image representing the vehicle status into the environment; for example, the control assembly controls the projection film in the projection assembly 530 to display the pattern corresponding to the vehicle status. At the same time, the control assembly (e.g., the projection module 230 of the processing device 112) may select one of the following projection options: (1) if the first mode is selected, the control assembly controls the image projection device 500 to turn on the active light source 510; after passing through the projection film, the emitted light displays the image 540 corresponding to the vehicle status (e.g., a left-turn pattern) in the environment; (2) if the second mode is selected, the control assembly controls the image projection device 500 so that part of the convex lenses in the light-concentrating assembly 520 reflect ambient light, which passes through the projection film to display the image 540 corresponding to the vehicle status (e.g., a left-turn pattern); (3) if the second mode is selected, the control assembly controls the image projection device 500 so that all of the convex lenses in the light-concentrating assembly 520 reflect ambient light, which passes through the projection film to display the image 540 corresponding to the vehicle status (e.g., a left-turn pattern); (4) if the second mode is selected, the control assembly controls the image projection device 500 so that all of the convex lenses reflect ambient light while the active light source 510 is turned on to supplement part of the light; the combined light passes through the projection film to display the image 540 corresponding to the vehicle status (e.g., a left-turn pattern) in the environment.
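The four options above can be orchestrated by combining the mode selection with the threshold-based adjustment. This sketch is an illustrative combination under the stated threshold ordering (second > third > first); the function name, the dictionary keys, and the mapping of ambient ranges to options are assumptions, not a disclosed control algorithm.

```python
def project(ambient: float, first_t: float, second_t: float, third_t: float) -> dict:
    """Choose among the four projection options of the embodiment.
    Requires second_t > third_t > first_t."""
    assert second_t > third_t > first_t
    if ambient < first_t:
        # Option (1): first mode, active source 510 only, no lenses used.
        return {"mode": "first", "active_source": True, "lenses": "none"}
    if ambient > second_t:
        # Option (2): second mode; only part of the lens array in 520
        # reflects, to keep the reflected light from being too strong.
        return {"mode": "second", "active_source": False, "lenses": "partial"}
    if ambient < third_t:
        # Option (4): second mode; all lenses reflect and the active
        # source 510 supplements the weak ambient light.
        return {"mode": "second", "active_source": True, "lenses": "all"}
    # Option (3): second mode; moderate ambient light, all lenses
    # reflect, no supplement needed.
    return {"mode": "second", "active_source": False, "lenses": "all"}
```

With illustrative thresholds `first_t=1`, `third_t=20`, `second_t=50`: an ambient reading of 0.5 yields option (1), 60 yields option (2), 30 yields option (3), and 10 yields option (4).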
Possible beneficial effects of the embodiments of this application include, but are not limited to: (1) projecting the vehicle status by reflecting ambient light can solve the heat-dissipation problem of the image projection device during daytime and saves electrical energy; (2) selecting different projection modes according to the ambient light intensity suits different application scenarios and makes the vehicle-status pattern easy to observe. It should be noted that different embodiments may produce different beneficial effects; the effects produced in a given embodiment may be any one or a combination of the above, or any other beneficial effect that may be obtained.
The basic concepts have been described above. It is apparent to those skilled in the art that the above detailed disclosure is merely an example and does not constitute a limitation on this application. Although not explicitly stated here, those skilled in the art may make various modifications, improvements, and corrections to this application. Such modifications, improvements, and corrections are suggested in this application and therefore remain within the spirit and scope of its exemplary embodiments.
Meanwhile, this application uses specific terms to describe its embodiments. Terms such as "one embodiment," "an embodiment," and/or "some embodiments" refer to a feature, structure, or characteristic related to at least one embodiment of this application. It should therefore be emphasized that two or more references to "an embodiment," "one embodiment," or "an alternative embodiment" at different places in this specification do not necessarily refer to the same embodiment. In addition, certain features, structures, or characteristics of one or more embodiments of this application may be combined as appropriate.
Furthermore, those skilled in the art will appreciate that aspects of this application may be illustrated and described in terms of several patentable classes or contexts, including any new and useful process, machine, manufacture, or composition of matter, or any new and useful improvement thereof. Accordingly, aspects of this application may be implemented entirely in hardware, entirely in software (including firmware, resident software, microcode, etc.), or by a combination of hardware and software. The above hardware or software may be referred to as a "data block," "module," "engine," "unit," "component," or "system." In addition, aspects of this application may take the form of a computer program product embodied in one or more computer-readable media, the product including computer-readable program code.
A computer storage medium may contain a propagated data signal with computer program code embodied therein, for example, in baseband or as part of a carrier wave. The propagated signal may take a variety of forms, including electromagnetic or optical forms, or a suitable combination thereof. A computer storage medium may be any computer-readable medium other than a computer-readable storage medium that can communicate, propagate, or transmit a program for use by connection to an instruction execution system, apparatus, or device. Program code on a computer storage medium may be propagated through any suitable medium, including radio, cable, fiber-optic cable, RF, or the like, or any combination thereof.
Computer program code required for the operation of various parts of this application may be written in any one or more programming languages, including object-oriented languages such as Java, Scala, Smalltalk, Eiffel, JADE, Emerald, C++, C#, VB.NET, and Python; conventional procedural languages such as C, Visual Basic, Fortran 2003, Perl, COBOL 2002, PHP, and ABAP; dynamic languages such as Python, Ruby, and Groovy; or other languages. The program code may run entirely on the user's computer, as a stand-alone software package on the user's computer, partly on the user's computer and partly on a remote computer, or entirely on a remote computer or server. In the latter case, the remote computer may be connected to the user's computer through any form of network, such as a local area network (LAN) or a wide area network (WAN), or connected to an external computer (e.g., through the Internet), or used in a cloud computing environment, or as a service such as Software as a Service (SaaS).
In addition, unless explicitly stated in the claims, the order of the processing elements and sequences, the use of numbers and letters, or the use of other names described in this application is not intended to limit the order of its processes and methods. Although the above disclosure discusses, through various examples, some embodiments of the invention currently considered useful, it should be understood that such details serve illustrative purposes only and that the appended claims are not limited to the disclosed embodiments; on the contrary, the claims are intended to cover all modifications and equivalent combinations that conform to the substance and scope of the embodiments of this application. For example, although the system components described above may be implemented by hardware devices, they may also be implemented by a software-only solution, such as installing the described system on an existing server or mobile device.
Likewise, it should be noted that, to simplify the presentation of this disclosure and thereby aid understanding of one or more embodiments of the invention, the foregoing description of the embodiments sometimes groups multiple features into a single embodiment, figure, or description thereof. This method of disclosure, however, does not imply that the subject matter of this application requires more features than are recited in the claims. Indeed, an embodiment may have fewer than all the features of a single embodiment disclosed above.
Some embodiments use numbers to describe quantities of components and properties. It should be understood that such numbers used in describing embodiments are, in some examples, qualified by the modifiers "about," "approximately," or "substantially." Unless otherwise stated, "about," "approximately," or "substantially" indicates that a variation of ±20% in the stated number is allowed. Accordingly, in some embodiments, the numerical parameters used in the specification and claims are approximations that may change depending on the desired characteristics of individual embodiments. In some embodiments, numerical parameters should take into account the specified significant digits and adopt a general method of digit retention. Although the numerical ranges and parameters used to confirm the breadth of ranges in some embodiments of this application are approximations, in specific embodiments such values are set as precisely as practicable.
Each patent, patent application, patent application publication, and other material cited in this application, such as articles, books, specifications, publications, and documents, is hereby incorporated by reference in its entirety, except for application history files that are inconsistent with or conflict with the content of this application, and except for files (currently or later appended to this application) that limit the broadest scope of its claims. It should be noted that if there is any inconsistency or conflict between the descriptions, definitions, and/or use of terms in the materials accompanying this application and the content described herein, the descriptions, definitions, and/or use of terms in this application shall prevail.
Finally, it should be understood that the embodiments described in this application are merely illustrative of the principles of its embodiments. Other variations may also fall within the scope of this application. Accordingly, by way of example and not limitation, alternative configurations of the embodiments of this application may be regarded as consistent with its teachings. Correspondingly, the embodiments of this application are not limited to those explicitly introduced and described herein.
Claims (16)
- An image projection method, wherein the method comprises: acquiring a vehicle status at a current time; detecting a condition parameter of an environment where the vehicle is located at the current time, the condition parameter including a light intensity in the environment where the vehicle is located; and, based on the light intensity in the environment where the vehicle is located, controlling an image projection device to project an image representing the vehicle status into the environment.
- The method of claim 1, wherein controlling the image projection device to project the image representing the vehicle status into the environment based on the light intensity in the environment where the vehicle is located comprises: comparing the light intensity in the environment with a first intensity threshold; determining a projection mode of the image projection device based on the comparison result; and projecting the image into the environment based on the determined projection mode, wherein the projection mode includes a first mode projecting with an active light source or a second mode projecting with ambient light captured by a light-concentrating assembly, and determining the projection mode based on the comparison result comprises: if the light intensity in the environment is greater than the first intensity threshold, determining that the projection mode includes the second mode; or, if the light intensity in the environment is less than the first intensity threshold, determining that the projection mode includes the first mode.
- The method of claim 1, wherein the vehicle status includes one or more of: going straight, turning left, turning right, making a U-turn, changing lanes to the left, changing lanes to the right, stopping to yield, slowing down to yield, yielding to oncoming traffic, parking, and vehicle malfunction.
- The method of claim 1, wherein the light in the environment where the vehicle is located includes natural light and/or artificial light in that environment.
- The method of claim 1, wherein controlling the image projection device to project the image representing the vehicle status into the environment comprises: projecting the image representing the vehicle status onto the ground in front of, behind, to the left of, and/or to the right of the vehicle, or onto the outer contour of the vehicle.
- The method of claim 2, wherein projecting with the ambient light captured by the light-concentrating assembly comprises: determining, according to the light intensity in the environment where the vehicle is located, whether to adjust the intensity of the light received by the light-concentrating assembly; reflecting the received light with the light-concentrating assembly to form reflected light; and projecting with the reflected light.
- The method of claim 6, wherein determining, according to the light intensity in the environment where the vehicle is located, whether to adjust the intensity of the light received by the light-concentrating assembly comprises: if the light intensity in the environment is greater than a second intensity threshold, reducing the amount of ambient light received to lower the intensity of the reflected light; if the light intensity in the environment is less than a third intensity threshold and greater than the first intensity threshold, controlling the active light source to turn on and supplement light to increase the intensity of the reflected light; or, if the light intensity in the environment is between the second intensity threshold and the third intensity threshold, not adjusting the intensity of the light received by the light-concentrating assembly; wherein the second intensity threshold is greater than the third intensity threshold, and the third intensity threshold is greater than the first intensity threshold.
- The method of claim 2, wherein the image projection device includes a projection film; projecting with the active light source comprises turning on the active light source, generating active light, and passing the active light through the projection film for projection; and projecting with the ambient light captured by the light-concentrating assembly comprises reflecting the captured light and passing the reflected light through the projection film for projection.
- The method of claim 2, further comprising: acquiring the vehicle status of the vehicle at a next time; acquiring the condition parameters of the environment where the vehicle is located at the next time; and determining, based on the condition parameters of the environment at the next time, whether to switch the projection mode corresponding to the current time.
- The method of claim 2, further comprising: based on the light intensity in the environment where the vehicle is located, adjusting the power of the active light source to adjust the intensity of its light.
- An image projection system, wherein the system comprises: an acquisition module configured to acquire a vehicle status at a current time; a detection module configured to detect a condition parameter of an environment where the vehicle is located at the current time, the condition parameter including a light intensity in the environment; and a projection module configured to control, based on the light intensity in the environment, an image projection device to project an image representing the vehicle status into the environment.
- An image projection system, wherein the system comprises at least one processor and at least one storage device, the storage device storing instructions that, when executed by the at least one processor, implement the method of any one of claims 1 to 10.
- A computer-readable storage medium storing computer instructions that, when read by a computer, cause the computer to perform the method of any one of claims 1 to 10.
- An image projection device, wherein the device comprises: an active light source configured to provide active light; a light-concentrating assembly configured to receive light from an environment where a vehicle is located and reflect it to form reflected light; a projection assembly including a projection film located on the propagation path of the reflected light and the active light; and a control assembly configured to control, according to condition parameters of the environment where the vehicle is located, the projection assembly to project an image representing the vehicle status into the environment based on the light provided by the active light source or the light-concentrating assembly.
- The device of claim 14, further comprising: a light intensity sensor configured to detect the light intensity in the environment.
- The device of claim 15, wherein the active light source is configured to: supplement light to increase the intensity of the reflected light when the light intensity in the environment is less than a third intensity threshold and greater than a first intensity threshold; or, serve as an independent light source for projection when the light intensity in the environment is less than the first intensity threshold.
Applications Claiming Priority (2)

| Application Number | Priority Date | Filing Date | Title |
|---|---|---|---|
| CN202011383007.8 | 2020-12-01 | | |
| CN202011383007.8A (CN112565724B) | 2020-12-01 | 2020-12-01 | Image projection method and system |
Publications (1)

| Publication Number | Publication Date |
|---|---|
| WO2022116688A1 | 2022-06-09 |
Family
ID=75045840
Family Applications (1)

| Application Number | Title | Priority Date | Filing Date |
|---|---|---|---|
| PCT/CN2021/123236 (WO2022116688A1) | Image projection method and system | 2020-12-01 | 2021-10-12 |

Country Status (2)

| Country | Link |
|---|---|
| CN (1) | CN112565724B |
| WO (1) | WO2022116688A1 |
Families Citing this family (1)

| Publication Number | Priority Date | Publication Date | Assignee | Title |
|---|---|---|---|---|
| CN112565724B | 2020-12-01 | 2022-05-17 | 北京航迹科技有限公司 | Image projection method and system |
Citations (9)

| Publication Number | Priority Date | Publication Date |
|---|---|---|
| US20170124927A1 (Omri KRIEZMAN) | 2012-05-21 | 2017-05-04 |
| CN203472670U | 2013-07-31 | 2014-03-12 |
| CN206031192U | 2016-07-14 | 2017-03-22 |
| CN107139832A | 2017-05-08 | 2017-09-08 |
| CN107554409A | 2017-09-13 | 2018-01-09 |
| CN109572535A | 2018-10-19 | 2019-04-05 |
| CN111169370A | 2020-01-14 | 2020-05-19 |
| CN111923858A | 2020-07-10 | 2020-11-13 |
| CN112565724A | 2020-12-01 | 2021-03-26 |
Family Cites Families (5)

| Publication Number | Priority Date | Publication Date |
|---|---|---|
| DE102016209526A1 | 2015-06-12 | 2016-12-15 |
| KR102673293B1 | 2018-11-08 | 2024-06-11 |
| CN110017846A | 2019-03-19 | 2019-07-16 |
| CN210554474U | 2019-05-23 | 2020-05-19 |
| CN111251977A | 2020-03-12 | 2020-06-09 |

- 2020-12-01: CN application CN202011383007.8A granted as patent CN112565724B (active)
- 2021-10-12: PCT application PCT/CN2021/123236 filed as WO2022116688A1 (application filing)
Also Published As

| Publication Number | Publication Date |
|---|---|
| CN112565724B | 2022-05-17 |
| CN112565724A | 2021-03-26 |
Legal Events

| Code | Title | Description |
|---|---|---|
| 121 | EP: the EPO has been informed by WIPO that EP was designated in this application | Ref document number: 21899722; Country of ref document: EP; Kind code of ref document: A1 |
| NENP | Non-entry into the national phase | Ref country code: DE |
| 122 | EP: PCT application non-entry in European phase | Ref document number: 21899722; Country of ref document: EP; Kind code of ref document: A1 |