WO2024078740A1 - Celestial observation system for vehicle and celestial observation method for vehicle - Google Patents
Celestial observation system for vehicle and celestial observation method for vehicle
- Publication number
- WO2024078740A1 (PCT/EP2023/025427)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- vehicle
- celestial
- output
- camera
- celestial observation
- Prior art date
Links
- 238000000034 method Methods 0.000 title claims abstract description 17
- 230000008859 change Effects 0.000 claims description 26
- 230000000694 effects Effects 0.000 claims description 18
- 230000001960 triggered effect Effects 0.000 claims description 9
- 238000004891 communication Methods 0.000 claims description 7
- 230000002123 temporal effect Effects 0.000 claims description 4
- 238000013519 translation Methods 0.000 claims description 3
- 238000004590 computer program Methods 0.000 claims description 2
- 230000006978 adaptation Effects 0.000 abstract description 3
- 230000006870 function Effects 0.000 description 21
- 238000010586 diagram Methods 0.000 description 10
- 230000003993 interaction Effects 0.000 description 4
- 238000012544 monitoring process Methods 0.000 description 4
- 230000004044 response Effects 0.000 description 4
- 239000005441 aurora Substances 0.000 description 3
- 238000012545 processing Methods 0.000 description 3
- 238000005516 engineering process Methods 0.000 description 2
- 230000002452 interceptive effect Effects 0.000 description 2
- 238000009877 rendering Methods 0.000 description 2
- 230000001953 sensory effect Effects 0.000 description 2
- 230000001755 vocal effect Effects 0.000 description 2
- 230000004913 activation Effects 0.000 description 1
- 230000004075 alteration Effects 0.000 description 1
- 230000009286 beneficial effect Effects 0.000 description 1
- 230000003247 decreasing effect Effects 0.000 description 1
- 230000007613 environmental effect Effects 0.000 description 1
- 239000011521 glass Substances 0.000 description 1
- 230000000670 limiting effect Effects 0.000 description 1
- 238000012986 modification Methods 0.000 description 1
- 230000004048 modification Effects 0.000 description 1
- 230000036651 mood Effects 0.000 description 1
- 230000036961 partial effect Effects 0.000 description 1
- 230000008447 perception Effects 0.000 description 1
- 238000010587 phase diagram Methods 0.000 description 1
- 230000008569 process Effects 0.000 description 1
- 238000012216 screening Methods 0.000 description 1
- 230000003068 static effect Effects 0.000 description 1
- 238000006467 substitution reaction Methods 0.000 description 1
- 239000000725 suspension Substances 0.000 description 1
Classifications
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01C—MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
- G01C21/00—Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00
- G01C21/02—Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 by astronomical means
- G01C21/025—Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 by astronomical means with the use of startrackers
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B60—VEHICLES IN GENERAL
- B60R—VEHICLES, VEHICLE FITTINGS, OR VEHICLE PARTS, NOT OTHERWISE PROVIDED FOR
- B60R1/00—Optical viewing arrangements; Real-time viewing arrangements for drivers or passengers using optical image capturing systems, e.g. cameras or video systems specially adapted for use in or on vehicles
- B60R1/20—Real-time viewing arrangements for drivers or passengers using optical image capturing systems, e.g. cameras or video systems specially adapted for use in or on vehicles
- B60R1/22—Real-time viewing arrangements for drivers or passengers using optical image capturing systems, e.g. cameras or video systems specially adapted for use in or on vehicles for viewing an area outside the vehicle, e.g. the exterior of the vehicle
- B60R1/28—Real-time viewing arrangements for drivers or passengers using optical image capturing systems, e.g. cameras or video systems specially adapted for use in or on vehicles for viewing an area outside the vehicle, e.g. the exterior of the vehicle with an adjustable field of view
-
- G—PHYSICS
- G03—PHOTOGRAPHY; CINEMATOGRAPHY; ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ELECTROGRAPHY; HOLOGRAPHY
- G03B—APPARATUS OR ARRANGEMENTS FOR TAKING PHOTOGRAPHS OR FOR PROJECTING OR VIEWING THEM; APPARATUS OR ARRANGEMENTS EMPLOYING ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ACCESSORIES THEREFOR
- G03B17/00—Details of cameras or camera bodies; Accessories therefor
- G03B17/56—Accessories
- G03B17/561—Support related camera accessories
-
- G—PHYSICS
- G03—PHOTOGRAPHY; CINEMATOGRAPHY; ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ELECTROGRAPHY; HOLOGRAPHY
- G03B—APPARATUS OR ARRANGEMENTS FOR TAKING PHOTOGRAPHS OR FOR PROJECTING OR VIEWING THEM; APPARATUS OR ARRANGEMENTS EMPLOYING ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ACCESSORIES THEREFOR
- G03B21/00—Projectors or projection-type viewers; Accessories therefor
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/048—Interaction techniques based on graphical user interfaces [GUI]
- G06F3/0484—Interaction techniques based on graphical user interfaces [GUI] for the control of specific functions or operations, e.g. selecting or manipulating an object, an image or a displayed text element, setting a parameter value or selecting a range
- G06F3/04845—Interaction techniques based on graphical user interfaces [GUI] for the control of specific functions or operations, e.g. selecting or manipulating an object, an image or a displayed text element, setting a parameter value or selecting a range for image manipulation, e.g. dragging, rotation, expansion or change of colour
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/048—Interaction techniques based on graphical user interfaces [GUI]
- G06F3/0487—Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
- G06F3/0488—Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures
- G06F3/04883—Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures for inputting data by handwriting, e.g. gesture or text
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V20/00—Scenes; Scene-specific elements
- G06V20/50—Context or environment of the image
- G06V20/56—Context or environment of the image exterior to a vehicle by using sensors mounted on the vehicle
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V20/00—Scenes; Scene-specific elements
- G06V20/50—Context or environment of the image
- G06V20/59—Context or environment of the image inside of a vehicle, e.g. relating to seat occupancy, driver state or inner lighting conditions
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N23/00—Cameras or camera modules comprising electronic image sensors; Control thereof
- H04N23/60—Control of cameras or camera modules
- H04N23/695—Control of camera direction for changing a field of view, e.g. pan, tilt or based on tracking of objects
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V40/00—Recognition of biometric, human-related or animal-related patterns in image or video data
- G06V40/10—Human or animal bodies, e.g. vehicle occupants or pedestrians; Body parts, e.g. hands
- G06V40/16—Human faces, e.g. facial parts, sketches or expressions
- G06V40/174—Facial expression recognition
Definitions
- the present invention relates to a celestial observation system for a vehicle, and the present invention also relates to a celestial observation method for a vehicle and a machine readable storage medium.
- An objective of the present invention is to provide a celestial observation system for a vehicle, comprising: a camera, movably mounted on the vehicle and configured to capture a sky image outside the vehicle; a vehicle positioning module, configured to acquire positional information of the vehicle; a control module, configured to determine, according to the positional information of the vehicle, a positional relationship of at least one celestial object to be observed relative to the vehicle, and adjust an orientation of the camera relative to the vehicle according to the positional relationship, so that the at least one celestial object is within a viewing angle range of the camera; and an output module, configured to output a celestial observation view in the vehicle at least on the basis of the sky image captured via the camera.
- the present invention particularly comprises the following technical idea: the relative orientation of the camera is adjusted to perform dynamic adaptation on a celestial observation angle, thereby capturing a target celestial object precisely and reliably regardless of the vehicle position.
- a condition of real celestial observation is created for a passenger in the vehicle, as if the passenger were out in nature, thereby allowing a user to observe the current shape and changes of the celestial object in real time, expanding the user's knowledge of space, and enriching the user's sensory experience.
- the celestial observation system further comprises a vehicle motion determination module configured to acquire a travel direction and/or a travel speed of the vehicle; wherein the control module is further configured to determine a dynamic change in the positional relationship of the at least one celestial object relative to the vehicle according to the travel direction and/or the travel speed of the vehicle, and additionally adjust the orientation of the camera relative to the vehicle according to the dynamic change in the positional relationship.
- the celestial observation system further comprises an image recognition module configured to identify the at least one celestial object to be observed in the sky image captured via the camera; wherein the control module is further configured to adjust the orientation of the camera relative to the vehicle according to an identification result of the image recognition module so as to cause the camera to track and capture the identified at least one celestial object.
- the celestial observation system further comprises an image recognition module configured to identify the at least one celestial object to be observed in the sky image captured via the camera; wherein the control module is further configured to: re-adjust the orientation of the camera relative to the vehicle in the case that the at least one celestial object to be observed is not identified, and/or output a celestial observation view in the vehicle on the basis of another source image in the case that the at least one celestial object to be observed is not identified after a predetermined number of attempts, the other source image being different from the sky image captured via the camera in real time.
- the control module is further configured to acquire a local segment from the sky image captured via the camera, so that the local segment comprises only the at least one celestial object to be observed, and/or an image ratio of the at least one celestial object to be observed in the local segment exceeds a preset value; and/or the output module is further configured to output a celestial observation view on the basis of a local segment acquired from the sky image.
- the control module is further configured to control the output module to output the celestial observation view according to a determined output effect, wherein outputting the celestial observation view according to the determined output effect comprises: directly outputting, as the celestial observation view, the sky image captured via the camera; annotating, with respect to the shape and/or the category, the at least one celestial object in the sky image captured via the camera, and outputting the annotated sky image as the celestial observation view; and/or outputting introductory information in the form of a text and/or a speech about the at least one celestial object to be observed in synchronization with the output of the sky image.
- the control module is further configured to control, at least in a live mode and a non-live mode, the output of the celestial observation view in the vehicle, wherein in the live mode, the output module is caused to output the celestial observation view in the vehicle on the basis of the sky image captured via the camera in real time, and in the non-live mode, the output module is caused to output the celestial observation view in the vehicle on the basis of another source image, the other source image being different from the sky image captured via the camera in real time.
- the output of the celestial observation view is controlled in the live mode and the non-live mode, so that free switching between celestial observation sources can be performed for different traveling statuses, user requirements, and weather conditions.
- the passenger can be provided with real-time immersive observation experience.
- an atmosphere of being on a sea of stars can be created in the vehicle cabin, and the passenger is provided with the opportunity to learn popular aerospace science knowledge.
- the other source image comprises: a sky image captured or recorded via the camera in advance; and/or a sky image, a starry sky atmosphere image, and/or a celestial popular science image stored locally on the vehicle and/or received from the outside of the vehicle.
- the vehicle entertainment and teaching functions are newly added.
- the user experience in the vehicle cabin is enriched, and the sense of science and technology of the vehicle is enhanced.
- control module is further configured to control, according to a motion status of the vehicle, the output of the celestial observation view by the output module, wherein the control module is configured to prohibit the celestial observation view from being output in the vehicle or allow the celestial observation view to be output only in the non-live mode when the travel speed of the vehicle is greater than a threshold.
- Flexible switching between celestial observation sources is achieved, and traveling safety can be fully ensured.
- control module is further configured to control, according to a weather condition, the output of the celestial observation view by the output module, wherein the control module is configured to allow the celestial observation view to be output only in the non-live mode when the weather condition does not satisfy a preset requirement.
- the control module is further configured to recommend an output mode of the celestial observation view, and an output position and/or an output effect of the celestial observation view in the vehicle, in a personalized manner according to identity information of a vehicle user.
- control module is further configured to control the output module and at least one vehicle cabin component of the vehicle in a coordinated manner, so that a status change of the at least one vehicle cabin component is triggered in temporal association with the output of the celestial observation view, the at least one vehicle cabin component comprising a seat, ambient lighting, an audio system, and/or an air conditioner of the vehicle.
- coordinated control allows the user to observe the celestial object in the most comfortable posture and angle, and multi-modal interaction enriches the sensory experience of the user.
- the celestial observation system further comprises a user input module configured to receive: a first specifying input of the vehicle user for the output mode of the celestial observation view, a second specifying input of the vehicle user for the output position of the celestial observation view in the vehicle, and/or a third specifying input of the vehicle user for a celestial object category to be observed;
- wherein the control module is further configured to: select, according to the first specifying input, the live mode or the non-live mode to output the celestial observation view; control the output module according to the second specifying input, so as to output the celestial observation view in the output position specified by the vehicle user; and/or control predetermined motion of the camera and/or the output of the output module according to the third specifying input, so that the celestial observation view output in the vehicle includes the celestial object category specified by the vehicle user.
- the celestial observation system further comprises a user input module configured to receive a gesture input of the vehicle user for the celestial observation view already output in the vehicle, wherein the control module is further configured to:
- the celestial observation system further comprises a communication interface configured to upload the sky image captured via the camera to the cloud, transmit the same to a mobile terminal device of the vehicle user, and/or share the same with another vehicle.
- a celestial observation method for a vehicle comprising the following steps: acquiring positional information of the vehicle; determining, according to the positional information of the vehicle, a positional relationship of at least one celestial object to be observed relative to the vehicle, and adjusting an orientation of a camera relative to the vehicle according to the positional relationship, so that the at least one celestial object is within a viewing angle range of the camera; and capturing a sky image outside the vehicle via the camera; and outputting a celestial observation view in the vehicle at least on the basis of the sky image captured via the camera.
- the celestial observation method further comprises the following steps: controlling, at least in a live mode and a non-live mode, the output of the celestial observation view in the vehicle, wherein in the live mode, the output module is caused to output the celestial observation view in the vehicle on the basis of the sky image captured via the camera in real time, and in the non-live mode, the output module is caused to output the celestial observation view in the vehicle on the basis of another source image, the other source image being different from the sky image captured via the camera in real time.
- a machine readable storage medium storing a computer program used to, when run on a computer, perform the celestial observation method according to the first aspect of the present invention.
- FIG. 1 shows a block diagram of a celestial observation system for a vehicle according to an exemplary embodiment of the present invention
- FIGs. 2a-2c show schematic diagrams of outputting a celestial observation view in a vehicle in exemplary scenarios
- FIGs. 3a-3f show schematic diagrams of an interface of a celestial observation view output in a vehicle in a live mode and a non-live mode;
- FIGs. 4a-4d show schematic diagrams showing that a user changes an output effect of a celestial observation view via gesture interaction
- FIG. 5 shows a schematic diagram of an interface of a user input module according to an exemplary embodiment of the present invention.
- FIG. 6 shows a flowchart of a celestial observation method for a vehicle according to an exemplary embodiment of the present invention.
- FIG. 1 shows a block diagram of a celestial observation system for a vehicle according to an exemplary embodiment of the present invention.
- the celestial observation system 1 includes a camera 2, an output module 3, a vehicle positioning module 12, and a control module 4, and these modules are connected to each other via a communication technique.
- the camera 2 is movably mounted on the vehicle and configured to capture a sky image outside the vehicle.
- the camera 2 is a wide-angle camera, and is arranged on a vehicle roof so that an initial viewing angle thereof is pointed at a portion of the sky above the vehicle.
- the camera 2 may also be an environmental perception camera provided on the vehicle and originally used to support the driving assistance function or the autonomous driving function, and in this case the initial viewing angle of the camera is pointed to, for example, a vehicle travel direction.
- a plurality of cameras may also be arranged at different positions on a vehicle body, so as to synthesize images captured individually thereby, and then provide a celestial observation view.
- the output module 3 is configured to output the celestial observation view in the vehicle.
- the output module 3 is configured to be a projection apparatus, and can project, on a specified projection region in a vehicle cabin, content to be projected, or display the same in the vehicle by using a holographic projection technique.
- a projection region may be, for example, a vehicle ceiling, an inner wall of the vehicle cabin, a skylight, or a vehicle window.
- the output module 3 may also be configured to be a display of the vehicle.
- Such a display includes, for example, a head unit screen, an intelligent instrument screen, a front/rear row multifunctional smart tablet, etc.
- the celestial observation view may particularly be in the form of an image and a graphic.
- the celestial observation view may be, for example, a photograph or a celestial pictogram. It is also possible that the celestial observation view includes a text or graphical annotation or includes a vocal commentary. Therefore, the celestial observation view is not necessarily static, but may vary dynamically over time. In this way, the celestial observation view may also include a video sequence consisting of a plurality of single-frame images.
- a vehicle user may be, for example, a driver, a front passenger, or another passenger of the vehicle.
- a specified observer may also be located outside the vehicle, and observe from outside the celestial observation view projected to the vehicle window.
- the vehicle positioning module 12 includes, for example, an on-board GPS sensor, an inertial navigation apparatus, etc., and is configured to acquire positional information of the vehicle.
- positional information includes, for example, the geographical coordinates, i.e., latitude and longitude information, of the location of the vehicle.
- the positional information of the vehicle further includes, for example, attitude information (pitch, yaw, etc.) of the vehicle.
- the positional information may also carry a timestamp.
- the control module 4 is configured to determine, according to the positional information of the vehicle, a positional relationship of at least one celestial object to be observed relative to the vehicle, and adjust an orientation of the camera relative to the vehicle according to the positional relationship, so that the at least one celestial object is within a viewing angle range of the camera.
- the control module 4, for example, can access a database storing celestial position data of all celestial objects, and determine, with reference to the vehicle position and an internal time of the system, all candidate celestial object information that can be observed at the current location of the vehicle.
- the control module may directly determine that all candidate celestial objects are celestial objects to be observed.
- control module may determine, according to a default configuration of the system, that a specific celestial object (e.g., the moon) of a preset category is a celestial object to be observed.
- a celestial object of interest may also be selected, on the basis of a user input, from all candidate celestial objects that may be observed, and the celestial object of interest is used as a celestial object to be observed.
- control module 4 may be further divided into a computing unit and a motion execution unit.
- the computing unit calculates, according to the positional information of the vehicle, a positional relationship of the celestial object to be observed relative to the vehicle (e.g., a certain celestial object is located on the left, right, front, or rear side of the vehicle), thereby generating a control signal for controlling motion of the camera 2.
- a control signal is provided to the motion execution unit coupled to the camera 2.
- the motion execution unit is configured to be, for example, an electrically driven rotary joint, and can, for example, drive the camera 2 to rotate horizontally by 360 degrees and rotate vertically by 180 degrees.
- the motion execution unit is controlled to adjust the location and/or angle of the camera 2 relative to the vehicle body, so as to cause the viewing angle range thereof to be aimed at the celestial object to be observed.
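- As a non-limiting illustration of the computation described above, the following Python sketch converts a celestial object's catalogue position (right ascension/declination), the vehicle's GPS position, and its heading into pan/tilt commands for the roof camera. The simplified sidereal-time formula, the function names, and the example coordinates are assumptions for illustration, not the disclosed implementation.
```python
import math
from datetime import datetime, timezone

def days_since_j2000(t: datetime) -> float:
    j2000 = datetime(2000, 1, 1, 12, tzinfo=timezone.utc)
    return (t - j2000).total_seconds() / 86400.0

def alt_az(ra_deg, dec_deg, lat_deg, lon_deg, t):
    """Approximate altitude and azimuth (degrees) of a celestial object."""
    d = days_since_j2000(t)
    gmst = (280.46061837 + 360.98564736629 * d) % 360.0   # Greenwich mean sidereal time
    lst = (gmst + lon_deg) % 360.0                        # local sidereal time
    ha = math.radians((lst - ra_deg) % 360.0)             # hour angle
    lat, dec = math.radians(lat_deg), math.radians(dec_deg)
    sin_alt = math.sin(dec) * math.sin(lat) + math.cos(dec) * math.cos(lat) * math.cos(ha)
    alt = math.asin(sin_alt)
    cos_az = (math.sin(dec) - sin_alt * math.sin(lat)) / (math.cos(alt) * math.cos(lat))
    az = math.degrees(math.acos(max(-1.0, min(1.0, cos_az))))
    if math.sin(ha) > 0:                                  # object west of the meridian
        az = 360.0 - az
    return math.degrees(alt), az

def camera_pan_tilt(ra_deg, dec_deg, lat_deg, lon_deg, heading_deg, t):
    """Pan is measured from the vehicle's longitudinal axis, tilt from the horizon."""
    alt, az = alt_az(ra_deg, dec_deg, lat_deg, lon_deg, t)
    return (az - heading_deg) % 360.0, alt

# Example: aim at Polaris (RA ~37.95 deg, Dec ~89.26 deg) from roughly Stuttgart,
# with the vehicle heading east.
pan, tilt = camera_pan_tilt(37.95, 89.26, 48.78, 9.18, heading_deg=90.0,
                            t=datetime.now(timezone.utc))
print(f"pan {pan:.1f} deg, tilt {tilt:.1f} deg")
```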
- a motion status of the vehicle may further be taken into account during control of a change in the viewing angle of the camera.
- the celestial observation system 1 further optionally includes a vehicle motion determination module 13.
- the vehicle motion determination module 13 includes, for example, a wheel speed sensor of the vehicle and a gyroscope, so that a travel speed and a travel direction of the vehicle can be determined.
- the control module 4 may determine a change in the positional relationship of the celestial object to be observed relative to the vehicle over time, and additionally adjust the orientation of the camera relative to the vehicle according to such a dynamic change.
- the location of the vehicle on the earth changes relatively slowly, so that the position of the celestial object to be observed relative to the geographic region where the vehicle is located does not change suddenly.
- however, as the vehicle moves or turns, the celestial object to be observed may change from one side of the vehicle to another side, and in this case, if the shooting angle of the camera is not adjusted in a timely manner, the target celestial object may disappear from the projected image in the vehicle. Therefore, the camera needs to be adjusted dynamically according to the motion status of the vehicle so as to adapt the observation angle of the camera to the motion of the vehicle.
- a pre-planned travel route of the vehicle may be taken into account during the control of the change in the viewing angle of the camera.
- the celestial observation system 1 further optionally includes a navigation module 14.
- the control module 4 may, for example, read the pre-planned travel route of the vehicle from the navigation module 14, and estimate a change trend of the positional relationship of at least one celestial object relative to the vehicle in a determined time period. Then, the control module 4 may generate an orientation adjustment scheme of the camera 2 according to the change trend, and then fine-tune the angle and location of the camera according to the positional relationship while following the orientation adjustment scheme.
- the orientation adjustment scheme may be, for example, an adjustment step sequence and an adjustment parameter sequence (e.g., rotating leftwards by 15° - rotating rightwards by 20° - translating to the left side of the vehicle by 3 cm) of the camera predicted for a determined road segment or time period. If the vehicle travels according to the pre-planned travel route, adjustment to the camera 2 does not substantially deviate from such a preliminary scheme. On that basis, the preliminary scheme only needs to be fine-tuned with reference to the specific position and the motion status of the vehicle, so that the celestial object to be observed can be tracked more accurately, and before a significant change in the direction or speed of the vehicle occurs, the camera is readied for such an upcoming sudden angle change.
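- A non-limiting sketch of how such an orientation adjustment scheme could be pre-computed from a planned route; the waypoint format and the assumption of a roughly constant target azimuth over the short planning horizon are illustrative simplifications.
```python
from typing import List, Tuple

def adjustment_scheme(waypoints: List[Tuple[float, float]],
                      target_azimuth_deg: float) -> List[Tuple[float, float]]:
    """Return (time_s, pan_delta_deg) steps the camera should apply.

    waypoints: list of (time_s, predicted_heading_deg) along the planned route.
    """
    steps = []
    prev_pan = None
    for time_s, heading in waypoints:
        pan = (target_azimuth_deg - heading) % 360.0          # pan relative to vehicle axis
        if prev_pan is not None:
            delta = (pan - prev_pan + 180.0) % 360.0 - 180.0  # shortest rotation
            steps.append((time_s, delta))
        prev_pan = pan
    return steps

# Example: the route turns right by 90 degrees over two waypoints, so the camera
# must pan left by the same amount to keep the target in view.
print(adjustment_scheme([(0.0, 0.0), (30.0, 45.0), (60.0, 90.0)], target_azimuth_deg=120.0))
```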
- the celestial observation system 1 further optionally includes an image recognition module 21.
- the image recognition module 21 may, for example, be integrated in the camera 2 or the control module 4.
- the image recognition module 21 is configured to identify at least one celestial object to be observed in the sky image captured via the camera 2.
- the control module 4 is, for example, further configured to control the motion of the camera 2 so as to cause the same to track and capture the identified at least one celestial object.
- the control module 4 may be further configured to re-adjust the orientation of the camera 2 relative to the vehicle in the case that the at least one celestial object to be observed is not identified, or to output a celestial observation view in the vehicle on the basis of another source image in the case that the at least one celestial object to be observed is still not identified after a predetermined number of attempts.
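- A non-limiting sketch of the re-adjust/fall-back behaviour described above; the callables and the default of three attempts are assumptions standing in for the camera, image recognition, control, and output modules.
```python
def track_or_fallback(target, capture_image, identify, readjust_camera,
                      output_live, output_non_live, max_attempts=3):
    """Try to identify `target` in the live sky image; fall back after max_attempts."""
    for _ in range(max_attempts):
        image = capture_image()
        region = identify(image, target)          # bounding box or None
        if region is not None:
            readjust_camera(region)               # keep the identified object in view
            output_live(image)                    # live-mode celestial observation view
            return True
        readjust_camera(None)                     # not identified: re-adjust and retry
    output_non_live(target)                       # predetermined attempts exhausted
    return False
```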
- the control module 4 controls, at least in a live mode and a non-live mode, the output of the celestial observation view in the vehicle.
- in the live mode, the control module 4 causes the output module 3 to output a celestial observation view in the vehicle at least on the basis of the sky image captured via the camera 2 in real time.
- in the non-live mode, the control module 4 causes the output module 3 to output a celestial observation view in the vehicle on the basis of another source image, the other source image being different from the sky image captured via the camera in real time.
- the other source image may be, for example, a sky image captured or recorded via the camera 2 in advance (e.g., a few hours or a few days ago), and in this case, the other source image is not real-time.
- the other source image may also be image or video data received from the outside of the vehicle via a communication interface 5, and may include, for example, a sky image, a starry sky atmosphere image, and/or a celestial popular science image.
- taking a variety of factors into account, the control module 4 may control enabling and disabling of a particular output mode and switching between different output modes.
- the celestial observation system 1 may further include, for example, a user input module 11, a weather acquisition module 15, a person monitoring module 16, and a vehicle function linkage module 17.
- the control module 4 is respectively connected to these modules, so as to receive a condition that may affect the output of the celestial observation view.
- the user input module 11 is configured to be, for example, a standalone interactive touch interface, or may be configured to be integrated in an interactive interface originally provided in the vehicle.
- the user input module 11 is configured to receive: a first specifying input of the user for the output mode of the celestial observation view, a second specifying input of the user for the output position of the celestial observation view in the vehicle, and a third specifying input of the user for a celestial object category to be observed. Then, the control module 4 controls the output of the celestial observation view on the basis of a user input detected by the user input module 11.
- control module 4 is configured to, for example, select, according to the first specifying input of the user, the live mode or the non-live mode to output the celestial observation view, and control the output module 3 according to the second specifying input of the user, so as to output the celestial observation view in the output position specified by the user.
- control module 4 may further control predetermined motion of the camera 2 and/or output of the output module 3 according to the third specifying input of the user, so that the celestial observation view output in the vehicle includes the celestial object category specified by the user.
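- A non-limiting sketch of how the three specifying inputs could be applied to an output configuration; the field names and option strings are assumptions drawn from the examples in this description.
```python
def apply_user_inputs(config: dict, first=None, second=None, third=None) -> dict:
    """first: output mode, second: output position, third: celestial object category."""
    if first in ("live", "non-live"):
        config["mode"] = first
    if second in ("vehicle ceiling", "HUD", "left vehicle glass"):
        config["output_position"] = second
    if third in ("moon", "starry sky", "constellation", "aurora"):
        config["target_category"] = third
    return config

print(apply_user_inputs({}, first="live", second="vehicle ceiling", third="moon"))
```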
- the celestial object category to be observed includes the moon, the starry sky, constellations, stars (satellites, comets, fixed stars, or planets), aurora, or one or a combination thereof.
- the control module 4 may also acquire the motion status of the vehicle from the vehicle motion determination module 13, and thereby control enabling permissions of different output modes or recommend a suitable output mode.
- the control module 4 is configured to prohibit the celestial observation view from being output in the vehicle or allow the celestial observation view to be output only in the non-live mode when the travel speed of the vehicle is greater than 30 km/h.
- the weather acquisition module 15 is configured to acquire weather information of the location where the vehicle is located, and the weather information includes not only weather conditions (e.g., clear, rainy, cloudy, etc.) but also visibility information (e.g., a pollution level).
- having learned the candidate celestial object information that can be observed at the current location of the vehicle, the control module 4 may determine, in combination with the weather information, whether these celestial objects can actually be observed from the current location of the vehicle. For example, although some celestial objects are theoretically visible to an observer in the region where the vehicle is located, these celestial objects cannot be identified in the sky image captured in real time if the weather conditions do not satisfy a preset requirement. Therefore, for example, the celestial observation view is allowed to be output only in the non-live mode.
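- A non-limiting sketch of the speed and weather gating described above, using the 30 km/h threshold mentioned as an example; the weather categories and the visibility limit are assumptions.
```python
SPEED_THRESHOLD_KMH = 30.0
OBSERVABLE_WEATHER = {"clear", "mostly clear"}

def allowed_output_mode(speed_kmh: float, weather: str, visibility_km: float) -> str:
    """Return 'live' or 'non-live' for the celestial observation view."""
    if speed_kmh > SPEED_THRESHOLD_KMH:
        # A stricter policy could prohibit any output here instead.
        return "non-live"
    if weather not in OBSERVABLE_WEATHER or visibility_km < 5.0:
        # Target objects cannot be identified reliably in the real-time sky image.
        return "non-live"
    return "live"

print(allowed_output_mode(20.0, "clear", 10.0))   # -> live
```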
- the person monitoring module 16 includes, for example, one or more in-vehicle cameras arranged in the vehicle cabin, and is configured to monitor physical information of passengers, including, for example, age information, gender information, and mood information of persons. Moreover, the person monitoring module 16 may further include a seat occupancy status sensor to acquire the distribution of passengers in the vehicle. By learning this information, the control module 4 can recommend an output mode of the celestial observation view, and an output position and/or an output effect of the celestial observation view in the vehicle, in a personalized manner. For example, if the vehicle passengers include a child, a vocal commentary of the celestial observation view is enabled automatically, and rendering processing is performed on the real sky image to be output, so as to enrich the animation effect. If the vehicle is traveling and, apart from the driver, the only passenger is in the rear row, the celestial observation view may be projected only to the rear row of the vehicle.
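- A non-limiting sketch of such a personalized recommendation; the age limit and field names are assumptions chosen to mirror the examples above.
```python
def recommend_output(passengers: list, vehicle_moving: bool) -> dict:
    rec = {"vocal_commentary": False, "rendering": False, "position": "vehicle ceiling"}
    if any(p.get("age", 99) < 12 for p in passengers):
        rec["vocal_commentary"] = True          # enable commentary for children
        rec["rendering"] = True                 # enrich the animation effect
    non_drivers = [p for p in passengers if not p.get("driver", False)]
    if vehicle_moving and non_drivers and all(p.get("row") == "rear" for p in non_drivers):
        rec["position"] = "rear-row ceiling"    # project only to the rear row
    return rec

print(recommend_output([{"driver": True, "row": "front", "age": 35},
                        {"row": "rear", "age": 8}], vehicle_moving=True))
```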
- the vehicle function linkage module 17 is configured to acquire an activated state of at least one predefined function of the vehicle, and such an activated state is also provided to the control module 4, so that the control module 4 can control, in association with the predefined function, the output of the celestial observation view. Specifically, the control module 4 may automatically trigger the output of the celestial observation view in response to activation of a predefined function of the vehicle.
- for example, a plurality of atmosphere functions (e.g., a proposal function, an anniversary reminder function, and a festival atmosphere function) are preset in the vehicle as predefined functions.
- the control module 4 may automatically project the celestial observation view in the vehicle cabin in order to further create a romantic atmosphere, so that the passenger feels immersed in a sea of stars or under the moonlight.
- the control module 4 may also use the communication interface 5 to upload the sky image captured via the camera, together with additional information such as geographical coordinates, time, date, weather, etc., to the cloud, transmit the same to a mobile terminal device of the vehicle user, and/or share the same with another vehicle.
- the control module 4 may also receive a celestial observation sharing request from the outside of the vehicle via the communication interface 5, and generate, in response to reception of such a sharing request, a prompt for enabling the non-live mode or a prompt for switching from the live mode to the non-live mode.
- a friend of the vehicle user initiates real-time sharing via social media, for example, so as to share a starry sky image captured by the friend while traveling abroad.
- the vehicle user is viewing the local night sky in the live mode, so the system pushes a prompt to the user: “XX has initiated a request to view the moon in real time. Switch to the non-live mode?”. If a positive response to the request is received from the vehicle user, the moon image shared by the friend in real time may be output in the vehicle.
- the control module 4 may further control the output module 3 and at least one vehicle cabin component 31, 32, 33 of the vehicle in a coordinated manner, so that a status change of the at least one vehicle cabin component is triggered in temporal association with the output of the celestial observation view.
- the at least one vehicle cabin component includes a seat 31, an audio system 32, ambient lighting 33, and/or an air conditioner of the vehicle.
- the control module 4 may control the vehicle seat to move backwards/forwards as a whole, and cause the backrest of the vehicle seat to pivot backwards to a preset position, so that the passenger enjoys celestial observation in a more comfortable state.
- the immersive feel in celestial observation may also be improved by decreasing brightness of the ambient lighting in the vehicle cabin or changing the hue thereof, and by changing the direction and air volume of output air of the air conditioner, and controlling the audio system to play soothing music.
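- A non-limiting sketch of triggering cabin component changes in temporal association with the output of the celestial observation view; the component interfaces and set-points are assumptions.
```python
import time

def start_observation_scene(output_view, seat, ambient_light, audio, climate,
                            driving: bool) -> None:
    ambient_light.set(brightness=0.2, hue="deep blue")   # dim the cabin lighting
    audio.play("soothing_playlist")                      # soft background music
    climate.set(airflow="gentle", direction="indirect")  # unobtrusive air output
    if not driving:
        seat.recline(angle_deg=35)                       # comfortable viewing posture
    time.sleep(1.0)                                      # let the scene settle briefly
    output_view.show()                                   # then output the celestial view
```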
- FIGs. 2a-2c show schematic diagrams of outputting a celestial observation view in a vehicle in exemplary scenarios.
- the camera 2 is movably mounted on the vehicle roof of the vehicle 100. Driven by the motion execution unit, the camera 2 not only can move vertically in a height direction of the vehicle and horizontally, but also can, for example, rotate, within a preset angle, about each of x, y, and z axes exemplarily shown in FIG. 2a.
- a celestial observation function has not been triggered in the vehicle 100.
- the camera 2 is, for example, held in an initial orientation relative to the vehicle 100, and correspondingly, the direction of a viewing angle range 210 of the camera 2 in the initial orientation is also shown.
- the camera 2 is, for example, in a standby or dormant state, and the position and the backrest of the seat 31 of the vehicle user 51 are both in a state originally set by the user.
- the celestial observation function is triggered when the vehicle 100 is in a stopped state. It should be noted that the triggering of the celestial observation function may be initiated by the vehicle user 51, or may be triggered by the system (e.g., in a linked manner on the basis of the activated state of another predefined function of the vehicle), or may be triggered automatically when a vehicle configuration satisfies a preset requirement.
- the control module calculates, with reference to the positional information of the vehicle, a positional relationship of the moon 300 relative to the vehicle, and thereby controls the camera 2 to perform adjustment to change from the initial orientation shown in FIG. 2a to a target orientation shown in FIG. 2b.
- in the initial orientation, the viewing angle range 210 of the camera remains substantially parallel to the vehicle roof, and in this case, the moon 300 to be observed does not fall within the viewing angle range 210.
- in the target orientation, the viewing angle range 210 of the camera is aimed at the region where the moon 300 is located, so that the moon 300 can be captured.
- the control module is configured to keep the camera 2 turned off while it has not yet reached the target orientation, turn on the camera 2 only after it reaches the target orientation, and then control it to perform capturing. After a sky image is captured via the camera 2, the sky image may be projected to a specific region 301 of the vehicle ceiling via the output module 3.
- the seat 31 of the vehicle 100 may also be controlled in a coordinated manner, to cause the backrest of the seat 31 to pivot backwards by a preset angle, so that the vehicle user 51 can be held in a comfortable posture to view the shape of the moon.
- the celestial observation function is triggered when the vehicle 100 is in a moving state.
- the travel direction changes constantly when the vehicle is traveling, so that the positional relationship of the moon 300 relative to the vehicle 100 changes accordingly.
- Such a change in the relative positional relationship is shown in FIG. 2c via, for example, a dotted line 300'.
- the control module additionally adjusts the orientation of the camera 2 according to the travel speed, the travel direction, and the pre-planned travel route of the vehicle, so as to dynamically adapt the viewing angle range 210 thereof to the change in the positional relationship of the celestial object 300 to be observed relative to the vehicle 100.
- the vehicle 100 is in motion, so that for the sake of safety, the backrest of the seat 31 of the vehicle user 51 (e.g., the driver) in the front row is not lowered, and instead, only an angle of inclination of a backrest of a seat 31' of a vehicle user 52 in the rear row is adjusted.
- the projection region 302 of the celestial observation view in the vehicle 100 is further adjusted via the output module 30, so that a projection position is as close as possible to a rear portion of the vehicle ceiling, and safe driving of the vehicle driver 51 is therefore not affected.
- FIGs. 3a-3f show schematic diagrams of an interface of a celestial observation view output in a vehicle in the live mode and the non-live mode.
- a sky image captured by the camera in real time is directly output as the celestial observation view.
- the vehicle user specifies that the shape of the moon 301 is to be observed in the vehicle, so that the control module controls the camera to capture the moon 301 in the night sky, and a sky image including the moon 301 is displayed in an interface 41.
- the vehicle user specifies that stars 302 constituting the Cancer constellation are to be observed in the vehicle, so that a real-time sky image including the stars 302 of the Cancer constellation is shown in the interface 41.
- the control module acquires a local segment from a sky image captured by the camera, and then only the local segment is displayed on the interface 41.
- the celestial objects 301, 302 to be observed are, for example, located in the center of the image, and an image ratio thereof exceeds a preset value.
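- A non-limiting sketch of acquiring such a local segment around the detected celestial object so that its image ratio reaches a preset value; the default ratio of 0.2 is an assumption.
```python
def local_segment(img_w: float, img_h: float, box, min_ratio: float = 0.2):
    """box = (x, y, w, h) of the detected object; returns a crop (x0, y0, x1, y1)."""
    x, y, w, h = box
    # Square crop side chosen so that object area / crop area is about min_ratio.
    side = max((w * h / min_ratio) ** 0.5, w, h)
    cx, cy = x + w / 2.0, y + h / 2.0
    x0 = max(0.0, min(cx - side / 2.0, img_w - side))
    y0 = max(0.0, min(cy - side / 2.0, img_h - side))
    return x0, y0, min(x0 + side, img_w), min(y0 + side, img_h)

# Example: a moon of 200x200 px detected in a 4000x3000 px sky image.
print(local_segment(4000, 3000, (1900, 1400, 200, 200)))
```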
- the celestial observation view is also output in the live mode.
- the originally captured sky image is not output directly, and instead, the sky image is preprocessed, and is then output.
- in FIG. 3c, a geometric figure is superimposed on the moon 301 in the original image, so that the shape of the moon can be identified more clearly.
- in FIG. 3d, in addition to outputting the original sky image including the stars 302 of the Cancer constellation on the interface 41, the individual stars are connected one by one by means of connecting lines 303 so as to draw the outline of the constellation, and the corresponding constellation name is marked nearby via text 304.
- the control module may annotate the celestial object in the celestial observation view in the following aspects:
- a celestial object category, e.g., the moon, the starry sky, constellations, stars (satellites, comets, fixed stars, or planets), and aurora;
- a celestial object name, e.g., the moon, Polaris, etc.
- in FIG. 3e and FIG. 3f, a celestial observation view is shown in the non-live mode.
- the target celestial object can no longer be observed in the live mode due to occlusion by cloud and fog, so that switching to the non-live mode is performed.
- a moon phase diagram 305 acquired according to the lunar calendar is shown in the interface 41, and is accompanied by an audio commentary.
- the shapes and popular science introductions of a plurality of constellations 306 are shown in the interface 41.
- FIGs. 4a-4d show schematic diagrams showing that a user changes an output effect of a celestial observation view via gesture interaction.
- the user performs a gesture 61 on the celestial observation view shown in the interface 41, so as to select a portion of the sky that the user wants to learn more about.
- the user performs the gesture 61 by, for example, pointing the index finger at a specific region.
- the user input module detects the gesture 61 of the user, and then the control module interprets a corresponding intent.
- the control module changes the output effect of the celestial observation view, so that a celestial object corresponding to the selected region is highlighted and annotated with introductory information. This is correspondingly shown in FIG. 4b.
- the user performs a gesture 62 on the celestial observation view shown in the interface 41.
- the user performs the gesture 62 by pinching two fingers together.
- the control module interprets the user's intent to zoom out the image, and therefore controls motion or focal length adjustment of the camera, so as to perform a zooming operation on the image captured in real time. This is correspondingly shown in FIG. 4d.
- a rotation, translation, or zooming operation corresponding to an image change intent of the user may also be performed on the celestial observation view to be output via appropriate control performed on an output unit.
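- A non-limiting sketch of mapping recognized gestures to output-effect commands, following FIGs. 4a-4d; the gesture labels and the command format are assumptions.
```python
def gesture_to_command(kind: str, x: float = 0.0, y: float = 0.0) -> dict:
    """Map a gesture on the celestial observation view to a view command."""
    if kind == "point":            # index finger aimed at a region (gesture 61)
        return {"action": "highlight_and_annotate", "at": (x, y)}
    if kind == "pinch_in":         # two fingers pinched together (gesture 62)
        return {"action": "zoom", "factor": 0.5}   # zoom out
    if kind == "pinch_out":
        return {"action": "zoom", "factor": 2.0}   # zoom in
    if kind == "swipe":
        return {"action": "pan", "at": (x, y)}     # rotate/translate the view
    return {"action": "ignore"}

print(gesture_to_command("pinch_in"))
```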
- FIG. 5 shows a schematic diagram of an interface of the user input module according to an exemplary embodiment of the present invention.
- a plurality of input options in the form of virtual keys 111, 112, 113, 114, and 115 are shown in an interface 110 of the user input module 11.
- the vehicle user can activate the celestial observation function in the vehicle, and customize an output effect liked thereby.
- Operating the key 111 can perform selection between the “non-live mode” and the “live mode”, thereby determining an image source of the celestial observation view output in the vehicle.
- Operating the key 112 can select a celestial object category to be observed from options “observe the moon”, “observe the starry sky”, “observe a constellation”, and “observe aurora”.
- if the user selects the “observe the starry sky” option, for example, all stars that can be observed at the current location of the vehicle are selected as celestial objects to be observed.
- if the user selects the “observe a constellation” option, a plurality of star clusters constituting a specific complete constellation are selected as celestial objects to be observed.
- Operating the key 113 can select an expected output effect from “scene viewing”, “atmosphere”, and “popular science” options. For example, if the user selects the “scene viewing” option, only an originally captured sky image may be output. If the user selects the “atmosphere” option, rendering or virtualization processing may be performed on an originally captured sky image. Alternatively, if the user has selected the “non-live mode” before, a starry sky atmosphere image may be downloaded via the communication interface. If the user selects the “popular science” option, then regardless of the output mode, annotation processing can be performed on several celestial objects in the celestial observation view, and an audio or text introduction of the celestial object to be observed can be provided.
- Operating the key 114 can select, from “vehicle ceiling”, “HUD”, and “left vehicle glass” options, an expected projection position of the celestial observation view output in the vehicle.
- Operating the key 115 can control enabling, suspension, and exiting of the celestial observation function in the vehicle. If the user does not select the key 111, 112, 113, or 114 to perform personalization, but directly presses an “enable” option in the key 115 instead, then the output mode, the celestial object to be observed, the output effect, and the output position may be selected according to a preset configuration of the system. In addition, as described in detail with reference to FIG. 1, the output mode, the output effect, and the output position may also be intelligently recommended to the user according to a variety of factors such as a weather condition, a vehicle motion status, an image recognition result, a person monitoring result, etc.
- FIG. 6 shows a flowchart of a celestial observation method for a vehicle according to an exemplary embodiment of the present invention.
- the method includes, for example, steps S01-S60, and may be implemented, for example, in the case that the celestial observation system 1 shown in FIG. 1 is used.
- step S01 acquiring an output mode for outputting a celestial observation view in a vehicle.
- an output effect and an output position of the celestial observation view, and a category of a celestial object to be observed may also be acquired according to a user input.
- steps S10 to S60 controlling, at least in a live mode and a non-live mode, output of the celestial observation view in the vehicle.
- in the live mode, an output module is caused to output the celestial observation view in the vehicle on the basis of a sky image captured via a camera in real time
- in the non-live mode, the output module is caused to output the celestial observation view in the vehicle on the basis of another source image, the other source image being different from the sky image captured via the camera in real time.
- the live mode is used as an example to introduce a process of outputting the celestial observation view in the vehicle.
- step S10 acquiring positional information of the vehicle.
- a travel speed, a travel direction, and a pre-planned travel route of the vehicle may also be acquired in this step.
- step S20 determining, according to the positional information of the vehicle, a positional relationship of at least one celestial object to be observed relative to the vehicle.
- step S30 adjusting an orientation of a camera relative to the vehicle according to the positional relationship, so that the at least one celestial object is within a viewing angle range of the camera.
- a rotation angle of the camera may be calculated, and the camera is controlled to rotate by the angle to cause the viewing angle range to be aimed at a celestial object to be observed.
- step S40 capturing a sky image outside the vehicle via the camera. For example, the camera is turned on and the sky image is captured only after the camera has been set in the target orientation relative to the vehicle. It is also possible that the camera is always in an activated state and captures sky images continuously.
- step S50 checking whether the at least one celestial object to be observed can be identified in the sky image captured via the camera. In the case that the at least one celestial object to be observed is not identified, steps S20 to S40 may be repeated so as to re-adjust the orientation of the camera relative to the vehicle according to the positional information of the vehicle.
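- A non-limiting sketch of the live-mode flow of steps S10 to S60; the injected callables are assumptions standing in for the positioning, control, camera, recognition, and output modules.
```python
def celestial_observation_cycle(get_position, locate_target, aim_camera,
                                capture, identify, output, max_retries=3):
    position = get_position()                       # S10: positional information
    relation = locate_target(position)              # S20: relative position of target
    for _ in range(max_retries):
        aim_camera(relation)                        # S30: adjust camera orientation
        image = capture()                           # S40: capture sky image
        if identify(image):                         # S50: target identified?
            output(image)                           # S60: output celestial observation view
            return True
        relation = locate_target(get_position())    # repeat S20 to S40 with fresh data
    return False
```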
- step S60 outputting a celestial observation view in the vehicle on the basis of the sky image captured via the camera.
- the celestial observation view may be output to a region specified by a user.
Landscapes
- Engineering & Computer Science (AREA)
- Physics & Mathematics (AREA)
- General Physics & Mathematics (AREA)
- Theoretical Computer Science (AREA)
- Multimedia (AREA)
- General Engineering & Computer Science (AREA)
- Radar, Positioning & Navigation (AREA)
- Remote Sensing (AREA)
- Human Computer Interaction (AREA)
- Astronomy & Astrophysics (AREA)
- Automation & Control Theory (AREA)
- Mechanical Engineering (AREA)
- Signal Processing (AREA)
- Image Analysis (AREA)
- Fittings On The Vehicle Exterior For Carrying Loads, And Devices For Holding Or Mounting Articles (AREA)
Abstract
The present invention relates to the field of intelligent cabins. Provided in the present invention is a celestial observation system (1) for a vehicle (100), including: a camera (2), movably mounted on the vehicle (100) and configured to capture a sky image outside the vehicle (100); a vehicle positioning module (12), configured to acquire positional information of the vehicle (100); a control module (4), configured to determine, according to the positional information of the vehicle (100), a positional relationship of at least one celestial object (300) to be observed relative to the vehicle (100), and adjust an orientation of the camera (2) relative to the vehicle (100) according to the positional relationship, so that the at least one celestial object (300) is within a viewing angle range (210) of the camera (2); and an output module (30), configured to output a celestial observation view in the vehicle at least on the basis of the sky image captured via the camera (2). The present invention also relates to a celestial observation method for a vehicle and a machine readable storage medium. In the case that the vehicle position is considered, an orientation of a camera is adjusted to perform dynamic adaptation on a celestial observation angle, thereby improving celestial observation experience of a vehicle user (51).
Description
CELESTIAL OBSERVATION SYSTEM FOR VEHICLE AND CELESTIAL OBSERVATION METHOD FOR VEHICLE
TECHNICAL FIELD
The present invention relates to a celestial observation system for a vehicle, and the present invention also relates to a celestial observation method for a vehicle and a machine readable storage medium.
BACKGROUND
With the progression of science and technology in modern society and the advancement of vehicle intelligence, vehicles are no longer simple means of transportation, and people also have higher requirements for vehicle interaction functionality in addition to safety and comfort being ensured. The successful return of the Shenzhou-13 manned spaceship has initiated a new round of popular interest in aerospace knowledge. Nowadays, more and more vehicle users expect to enjoy, within a limited vehicle space, an immersive observation experience of the stars in the universe.
Currently, starry sky ceilings are usually used to simulate the effect of stars at night. However, such simulated starlight lacks realism and degrades the user experience. In addition, conventional in-vehicle projection solutions are relatively simple: typically, footage from a camera outside the vehicle is directly relayed to the inside of the vehicle. The projected content is greatly limited by the location and orientation of the vehicle, and targeted screening cannot be performed on the celestial objects to be observed, which limits the enjoyment and popular science experience of users.
In this context, it is desirable to provide a celestial observation solution for a vehicle, so as to track and capture a target celestial object via dynamic adaptation performed on an input source, thereby allowing vehicle users to have a more realistic celestial observation experience.
SUMMARY
An objective of the present invention is to provide a celestial observation system for a vehicle, comprising: a camera, movably mounted on the vehicle and configured to capture a sky image outside the vehicle; a vehicle positioning module, configured to acquire positional information of the vehicle;
a control module, configured to determine, according to the positional information of the vehicle, a positional relationship of at least one celestial object to be observed relative to the vehicle, and adjust an orientation of the camera relative to the vehicle according to the positional relationship, so that the at least one celestial object is within a viewing angle range of the camera; and an output module, configured to output a celestial observation view in the vehicle at least on the basis of the sky image captured via the camera.
The present invention particularly comprises the following technical idea: the relative orientation of the camera is adjusted to perform dynamic adaptation on the celestial observation angle, thereby capturing a target celestial object precisely and reliably regardless of the vehicle position. In this way, a condition of real celestial observation is created for a passenger in the vehicle, as if the passenger were out in nature, thereby allowing a user to observe the current shape and changes of the celestial object in real time, expanding the user's knowledge of space, and enriching the user's sensory experience.
Optionally, the celestial observation system further comprises a vehicle motion determination module configured to acquire a travel direction and/or a travel speed of the vehicle; wherein the control module is further configured to determine a dynamic change in the positional relationship of the at least one celestial object relative to the vehicle according to the travel direction and/or the travel speed of the vehicle, and additionally adjust the orientation of the camera relative to the vehicle according to the dynamic change in the positional relationship.
Optionally, the celestial observation system further comprises an image recognition module configured to identify the at least one celestial object to be observed in the sky image captured via the camera; wherein the control module is further configured to adjust the orientation of the camera relative to the vehicle according to an identification result of the image recognition module so as to cause the camera to track and capture the identified at least one celestial object.
In this way, dynamic tracking of the target celestial object is achieved by using an image recognition technique, thereby improving observation stability in a traveling state.
Optionally, the celestial observation system further comprises an image recognition module configured to identify the at least one celestial object to be observed in the sky image captured via the camera;
wherein the control module is further configured to:
- re-adjust the orientation of the camera relative to the vehicle in the case that the at least one celestial object to be observed is not identified, and/or
- output a celestial observation view in the vehicle on the basis of another source image in the case that the at least one celestial object to be observed is not identified within a predetermined number of times, the other source image being different from the sky image captured via the camera in real time.
Optionally, the control module is further configured to acquire a local segment from the sky image captured via the camera, so that the local segment comprises only the at least one celestial object to be observed, and/or an image ratio of the at least one celestial object to be observed in the local segment exceeds a preset value; and/or the output module is further configured to output a celestial observation view on the basis of a local segment acquired from the sky image.
Optionally, the control module is further configured to control the output module to output the celestial observation view according to a determined output effect, wherein the outputting the celestial observation view according to the determined output effect comprises:
- directly outputting, as the celestial observation view, the sky image captured via the camera;
- annotating, with respect to the shape and/or the category, the at least one celestial object in the sky image captured via the camera, and outputting the annotated sky image as the celestial observation view; and/or
- outputting introductory information in the form of a text and/or in the form of a speech about the at least one celestial object to be observed in synchronization with output of the sky image.
Optionally, the control module is further configured to control, at least in a live mode and a non-live mode, the output of the celestial observation view in the vehicle, wherein in the live mode, the output module is caused to output the celestial observation view in the vehicle on the basis of the sky image captured via the camera in real time, and in the non-live mode, the output module is caused to output the celestial observation view in the vehicle on the basis of another source image, the other source image being different from the sky image captured via the camera in real time.
The output of the celestial observation view is controlled in the live mode and the non-live mode, so that the celestial observation source can be switched freely for different traveling statuses, user requirements, and weather conditions. In one aspect, the passenger can be provided with a real-time immersive observation experience. In another aspect, an atmosphere of being under a sea of stars can be created in the vehicle cabin, and the passenger is given the opportunity to learn popular aerospace science knowledge.
Optionally, in the non-live mode, the other source image comprises: a sky image captured or recorded via the camera in advance; and/or a sky image, a starry sky atmosphere image, and/or a celestial popular science image stored locally on the vehicle and/or received from the outside of the vehicle. This adds entertainment and educational functions to the vehicle, enriches the user experience in the vehicle cabin, and enhances the vehicle's sense of technology.
Optionally, the control module is further configured to control, according to a motion status of the vehicle, the output of the celestial observation view by the output module, wherein the control module is configured to prohibit the celestial observation view from being output in the vehicle or allow the celestial observation view to be output only in the non-live mode when the travel speed of the vehicle is greater than a threshold. Flexible switching between celestial observation sources is achieved, and traveling safety can be fully ensured.
Optionally, the control module is further configured to control, according to a weather condition, the output of the celestial observation view by the output module, wherein the control module is configured to allow the celestial observation view to be output only in the non-live mode when the weather condition does not satisfy a preset requirement.
Optionally, the control module is further configured to recommend an output mode of the celestial observation view, and an output position and/or an output effect of the celestial observation view in the vehicle, in a personalized manner according to identity information of a vehicle user.
Optionally, the control module is further configured to control the output module and at least one vehicle cabin component of the vehicle in a coordinated manner, so that a status change of the at least one vehicle cabin component is triggered in temporal association with the output of the celestial observation view, the at least one vehicle cabin component comprising a seat, ambient lighting, an audio system, and/or an air conditioner of the vehicle. Such coordinated control allows
the user to observe the celestial object in the most comfortable posture and angle, and multi-modal interaction enriches the sensory experience of the user.
Optionally, the celestial observation system further comprises a user input module configured to receive:
- a first specifying input of the vehicle user for the output mode of the celestial observation view,
- a second specifying input of the vehicle user for the output position of the celestial observation view in the vehicle, and/or
- a third specifying input of the vehicle user for a celestial object category to be observed; wherein the control module is further configured to:
- select, according to the first specifying input of the vehicle user, the live mode or the non-live mode to output the celestial observation view,
- control the output module according to the second specifying input of the vehicle user, so as to output the celestial observation view in the output position specified by the vehicle user, and/or
- control the orientation of the camera relative to the vehicle and/or output of the output module according to the third specifying input of the vehicle user, so that the celestial observation view output in the vehicle comprises the celestial object category specified by the vehicle user.
Optionally, the celestial observation system further comprises a user input module configured to receive a gesture input of the vehicle user for the celestial observation view already output in the vehicle, wherein the control module is further configured to:
- identify, according to the gesture input of the vehicle user, a selected region of the vehicle user in the celestial observation view, and control the output module to output introductory information of a celestial object corresponding to the selected region; and/or
- identify an image change intent according to the gesture input of the vehicle user, and control motion of the camera and/or output of the output module, so that a rotation, translation,
and/or zooming operation corresponding to the image change intent is performed on the output celestial observation view.
Optionally, the celestial observation system further comprises a communication interface configured to upload the sky image captured via the camera to the cloud, transmit the same to a mobile terminal device of the vehicle user, and/or share the same with another vehicle.
According to a second aspect of the present invention, provided is a celestial observation method for a vehicle, comprising the following steps: acquiring positional information of the vehicle; determining, according to the positional information of the vehicle, a positional relationship of at least one celestial object to be observed relative to the vehicle, and adjusting an orientation of a camera relative to the vehicle according to the positional relationship, so that the at least one celestial object is within a viewing angle range of the camera; and capturing a sky image outside the vehicle via the camera; and outputting a celestial observation view in the vehicle at least on the basis of the sky image captured via the camera.
Optionally, the celestial observation method further comprises the following steps: controlling, at least in a live mode and a non-live mode, the output of the celestial observation view in the vehicle, wherein in the live mode, the output module is caused to output the celestial observation view in the vehicle on the basis of the sky image captured via the camera in real time, and in the non-live mode, the output module is caused to output the celestial observation view in the vehicle on the basis of another source image, the other source image being different from the sky image captured via the camera in real time.
According to a third aspect of the present invention, provided is a machine readable storage medium, storing a computer program used to, when run on a computer, perform the celestial observation method according to the second aspect of the present invention.
BRIEF DESCRIPTION OF THE DRAWINGS
The principles, features, and advantages of the present invention can be better understood via further detailed description of the present invention provided with reference to the accompanying drawings. The accompanying drawings include:
FIG. 1 shows a block diagram of a celestial observation system for a vehicle according to an exemplary embodiment of the present invention;
FIGs. 2a-2c show schematic diagrams of outputting a celestial observation view in a vehicle in exemplary scenarios;
FIGs. 3a-3f show schematic diagrams of an interface of a celestial observation view output in a vehicle in a live mode and a non-live mode;
FIGs. 4a-4d show schematic diagrams showing that a user changes an output effect of a celestial observation view via gesture interaction;
FIG. 5 shows a schematic diagram of an interface of a user input module according to an exemplary embodiment of the present invention; and
FIG. 6 shows a flowchart of a celestial observation method for a vehicle according to an exemplary embodiment of the present invention.
DETAILED DESCRIPTION
To make the technical problems to be solved by the present invention, the technical solutions, and the beneficial technical effects clearer, the present invention will be described below in further detail with reference to the accompanying drawings and a plurality of exemplary embodiments. It should be understood that the specific embodiments described herein are merely intended to explain the present invention and are not intended to limit its scope of protection.
FIG. 1 shows a block diagram of a celestial observation system for a vehicle according to an exemplary embodiment of the present invention.
As shown in FIG. 1, the celestial observation system 1 includes a camera 2, an output module 3, a vehicle positioning module 12, and a control module 4, and these modules are connected to each other via a communication technique.
The camera 2 is movably mounted on the vehicle and configured to capture a sky image outside the vehicle. In an example, the camera 2 is a wide-angle camera arranged on the vehicle roof so that its initial viewing angle points to a portion of the sky above the vehicle. In another example, the camera 2 may also be an environment perception camera already provided on the vehicle to support a driving assistance function or an autonomous driving function, in which case its initial viewing angle points, for example, in the vehicle travel direction. In addition, a plurality of cameras may be arranged at different positions on the vehicle body, so that the images they capture individually can be synthesized into a celestial observation view.
The output module 3 is configured to output the celestial observation view in the vehicle. In an example, the output module 3 is configured as a projection apparatus, and can project the content to be projected onto a specified projection region in the vehicle cabin, or display it in the vehicle by means of a holographic projection technique. Such a projection region may be, for example, the vehicle ceiling, an inner wall of the vehicle cabin, a skylight, or a vehicle window. In another example, the output module 3 may also be configured as a display of the vehicle, for example a head unit screen, an intelligent instrument screen, or a front/rear-row multifunctional smart tablet. Here, the celestial observation view may in particular take the form of an image or a graphic, for example a photograph or a celestial pictogram. The celestial observation view may also include a text or graphical annotation or a spoken commentary. The celestial observation view is therefore not necessarily static, but may vary dynamically over time; in this way, it may also comprise a video sequence consisting of a plurality of single-frame images.
A vehicle user may be, for example, the driver, the front passenger, or another passenger of the vehicle. In some cases, however, a designated observer may also be located outside the vehicle and observe, from outside, the celestial observation view projected onto a vehicle window.
The vehicle positioning module 12 includes, for example, an on-board GPS sensor, an inertial navigation apparatus, etc., and is configured to acquire positional information of the vehicle. Such positional information includes, for example, the geographical coordinates (latitude and longitude) of the vehicle's location. Additionally, the positional information of the vehicle may further include, for example, attitude information (pitch, yaw, etc.) of the vehicle. In some cases, the positional information may also carry a timestamp.
The control module 4 is configured to determine, according to the positional information of the vehicle, a positional relationship of at least one celestial object to be observed relative to the vehicle, and adjust an orientation of the camera relative to the vehicle according to the positional relationship, so that the at least one celestial object is within a viewing angle range of the camera. To that end, the control module 4 can, for example, access a database storing celestial position data of all celestial objects, and determine, with reference to the vehicle position and the system's internal time, all candidate celestial objects that can be observed at the current location of the vehicle. In an example, the control module may directly treat all candidate celestial objects as celestial objects to be observed. In another example, the control module may determine, according to a default configuration of the system, that a specific celestial object of a preset category (e.g., the moon) is the celestial object to be observed. In a further example, a celestial object of interest may also be selected, on the basis of a user input, from all candidate celestial objects that may be observed, and this celestial object of interest is used as the celestial object to be observed.
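By way of illustration only (not part of the disclosed embodiments), the positional relationship described here can be expressed as an altitude/azimuth direction computed from a celestial object's catalogued equatorial coordinates, the vehicle's latitude and longitude, and the system time. The following minimal Python sketch shows one such computation; the function name, the simplified sidereal-time formula, and the example coordinates are assumptions of the sketch.

```python
# Minimal sketch: altitude/azimuth of a celestial object from the vehicle's
# position and the system time. RA/Dec values would come from the celestial
# position database mentioned above; names and example values are illustrative.
import math
import time

def altaz_from_radec(ra_deg, dec_deg, lat_deg, lon_deg, unix_time=None):
    """Return (altitude, azimuth) in degrees as seen from lat/lon at unix_time."""
    if unix_time is None:
        unix_time = time.time()
    jd = unix_time / 86400.0 + 2440587.5                       # Julian date
    gmst = (280.46061837 + 360.98564736629 * (jd - 2451545.0)) % 360.0
    lst = (gmst + lon_deg) % 360.0                             # local sidereal time (east-positive)
    ha = math.radians((lst - ra_deg) % 360.0)                  # hour angle
    lat, dec = math.radians(lat_deg), math.radians(dec_deg)
    sin_alt = math.sin(dec) * math.sin(lat) + math.cos(dec) * math.cos(lat) * math.cos(ha)
    alt = math.asin(sin_alt)
    cos_az = (math.sin(dec) - sin_alt * math.sin(lat)) / (math.cos(alt) * math.cos(lat))
    az = math.degrees(math.acos(max(-1.0, min(1.0, cos_az))))
    if math.sin(ha) > 0:                                       # object west of the meridian
        az = 360.0 - az
    return math.degrees(alt), az                               # azimuth clockwise from north

# A positive altitude means the object is a candidate for observation here.
alt, az = altaz_from_radec(ra_deg=88.8, dec_deg=7.4, lat_deg=48.1, lon_deg=11.6)
```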
In an example, the control module 4 may be further divided into a computing unit and a motion execution unit. The computing unit calculates, according to the positional information of the vehicle, a positional relationship of the celestial object to be observed relative to the vehicle (e.g., a certain celestial object is located on the left, right, front, or rear side of the vehicle), thereby generating a control signal for controlling motion of the camera 2. Such a control signal is provided to the motion execution unit coupled to the camera 2. The motion execution unit is configured to be, for example, an electrically driven rotary joint, and can, for example, drive the camera 2 to rotate horizontally by 360 degrees and rotate vertically by 180 degrees. The motion execution unit is controlled to adjust the location and/or angle of the camera 2 relative to the vehicle body, so as to cause the viewing angle range thereof to be aimed at the celestial object to be observed.
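Under the assumption that the control signal takes the form of a pan/tilt pair relative to the vehicle body, the following sketch illustrates how the computing unit might derive it from the object's sky direction and the vehicle heading; the names, axis conventions, and example values are illustrative only.

```python
# Illustrative form of the control signal: a pan/tilt pair relative to the
# vehicle body; the axis conventions and example values are assumptions.
def camera_orientation(target_az_deg, target_alt_deg, vehicle_heading_deg):
    """Pan: clockwise from the vehicle's forward direction, 0-360 degrees.
    Tilt: elevation above the roof plane, clamped to 0-90 degrees."""
    pan = (target_az_deg - vehicle_heading_deg) % 360.0
    tilt = max(0.0, min(90.0, target_alt_deg))
    return pan, tilt

# When the vehicle turns (cf. FIG. 2c), only the heading changes, so the same
# target keeps being tracked simply by re-evaluating pan with the new heading.
print(camera_orientation(target_az_deg=210.0, target_alt_deg=35.0, vehicle_heading_deg=135.0))
```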
In order to more stably track the celestial object to be observed, a motion status of the vehicle may further be taken into account during control of a change in the viewing angle of the camera. To that end, the celestial observation system 1 further optionally includes a vehicle motion determination module 13. The vehicle motion determination module 13 includes, for example, a
wheel speed sensor of the vehicle and a gyroscope, so that a travel speed and a travel direction of the vehicle can be determined. With this information, the control module 4 may determine a change in the positional relationship of the celestial object to be observed relative to the vehicle over time, and additionally adjust the orientation of the camera relative to the vehicle according to such a dynamic change. For example, the location of the vehicle on the earth is relatively fixed, so that the position of the celestial object to be observed relative to the geographic region where the vehicle is located does not change suddenly. However, after the vehicle changes its travel direction, the celestial object to be observed may shift from one side of the vehicle to the other, and if the shooting angle of the camera is not adjusted in a timely manner, the target celestial object may disappear from the image projected in the vehicle. The camera therefore needs to be adjusted dynamically according to the motion status of the vehicle, so that its observation angle adapts to the motion of the vehicle.
In order to more precisely track the celestial object to be observed, a pre-planned travel route of the vehicle may be taken into account during the control of the change in the viewing angle of the camera. To that end, the celestial observation system 1 further optionally includes a navigation module 14. The control module 4 may, for example, read the pre-planned travel route of the vehicle from the navigation module 14, and estimate a change trend of the positional relationship of at least one celestial object relative to the vehicle in a determined time period. Then, the control module 4 may generate an orientation adjustment scheme of the camera 2 according to the change trend, and then fine-tune the angle and location of the camera according to the positional relationship while following the orientation adjustment scheme. The orientation adjustment scheme may be, for example, an adjustment step sequence and an adjustment parameter sequence (e.g., rotating leftwards by 15° - rotating rightwards by 20° - translating to the left side of the vehicle by 3 cm) of the camera predicted for a determined road segment or time period. If the vehicle travels according to the pre-planned travel route, adjustment to the camera 2 does not substantially deviate from such a preliminary scheme. On that basis, the preliminary scheme only needs to be fine-tuned with reference to the specific position and the motion status of the vehicle, so that the celestial object to be observed can be tracked more accurately, and before a significant change in the direction or speed of the vehicle occurs, the camera is readied for such an upcoming sudden angle change.
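As an illustration of how such an orientation adjustment scheme could be pre-computed, the sketch below walks the waypoints of a planned route, predicts the vehicle heading on each segment, and emits the corresponding pan set-point; the waypoint format, function names, and coordinates are assumptions of the sketch.

```python
# Hedged sketch: pre-computing a pan adjustment sequence along a planned route.
# Waypoints are assumed to be (lat, lon) pairs; names are illustrative only.
import math

def bearing_deg(p1, p2):
    """Approximate initial bearing, in degrees from north, from waypoint p1 to p2."""
    lat1, lon1, lat2, lon2 = map(math.radians, (*p1, *p2))
    y = math.sin(lon2 - lon1) * math.cos(lat2)
    x = math.cos(lat1) * math.sin(lat2) - math.sin(lat1) * math.cos(lat2) * math.cos(lon2 - lon1)
    return math.degrees(math.atan2(y, x)) % 360.0

def adjustment_scheme(route, target_az_deg):
    """Return one pan set-point (degrees clockwise from vehicle forward) per route segment."""
    scheme = []
    for p1, p2 in zip(route, route[1:]):
        heading = bearing_deg(p1, p2)                      # predicted heading on this segment
        scheme.append(round((target_az_deg - heading) % 360.0, 1))
    return scheme

route = [(48.137, 11.575), (48.140, 11.580), (48.145, 11.579)]
print(adjustment_scheme(route, target_az_deg=210.0))       # pre-planned pan sequence
```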
In addition, the celestial observation system 1 further optionally includes an image recognition module 21. The image recognition module 21 may, for example, be integrated in the camera 2 or the control module 4. The image recognition module 21 is configured to identify at least one celestial object to be observed in the sky image captured via the camera 2. In this case, the control module 4 is, for example, further configured to control the motion of the camera 2 so as to cause the same to track and capture the identified at least one celestial object. The control module 4 may be further configured to re-adjust the orientation of the camera 2 relative to the vehicle in the case that the at least one celestial object to be observed is not identified, or output a celestial observation view in the vehicle on the basis of another source image in the case that the at least one celestial object to be observed is not identified within a predetermined number of times.
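The retry-then-fallback behaviour described above can be summarised as follows; the camera, recognizer, control, and output objects are placeholders standing in for the modules of FIG. 1, and the attempt limit is an assumed value.

```python
# Sketch of the retry-then-fallback behaviour; camera, recognizer, control and
# output are placeholders for the modules of FIG. 1, MAX_ATTEMPTS is assumed.
MAX_ATTEMPTS = 3

def acquire_view(camera, recognizer, control, output, target):
    for _ in range(MAX_ATTEMPTS):
        frame = camera.capture()
        if recognizer.contains(frame, target):
            output.show_live(frame)              # live mode: real-time sky image
            return True
        control.re_aim(camera, target)           # re-adjust the orientation and retry
    output.show_alternative_source(target)       # fall back to another source image
    return False
```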
In an example, the control module 4 controls, at least in a live mode and a non-live mode, the output of the celestial observation view in the vehicle. In the live mode, the control module 4 causes the output module 3 to output a celestial observation view in the vehicle at least on the basis of the sky image captured via the camera 2 in real time. In the non-live mode, the control module 4 causes the output module 3 to output a celestial observation view in the vehicle on the basis of another source image, the other source image being different from the sky image captured via the camera in real time. Specifically, the other source image may be, for example, a sky image captured or recorded via the camera 2 in advance (e.g., a few hours or a few days ago), and in this case, the other source image is not real-time. In another example, the other source image may also be image or video data received from the outside of the vehicle via a communication interface 5, and may include, for example:
- a sky image captured in real time by another camera within a predetermined range around the vehicle;
- a sky image shared by a social media friend of the vehicle user;
- a celestial image or a sky image that is acquired by accessing an astronomical website and can be observed in the current location of the vehicle;
- a starry sky atmosphere image; and/or
- a celestial popular science image.
In an example, the control module 4 may control, taking a variety of factors into account, the enabling and disabling of a particular output mode and the switching between different output modes. To that end, the celestial observation system 1 may further include, for example, a user input module 11, a weather acquisition module 15, a person monitoring module 16, and a vehicle function linkage module 17. The control module 4 is connected to each of these modules, so as to receive conditions that may affect the output of the celestial observation view.
The user input module 11 is configured, for example, as a standalone interactive touch interface, or may be integrated into an interactive interface originally provided in the vehicle. The user input module 11 is configured to receive: a first specifying input of the user for the output mode of the celestial observation view, a second specifying input of the user for the output position of the celestial observation view in the vehicle, and a third specifying input of the user for a celestial object category to be observed. The control module 4 then controls the output of the celestial observation view on the basis of a user input detected by the user input module 11. Specifically, the control module 4 is configured to, for example, select, according to the first specifying input of the user, the live mode or the non-live mode to output the celestial observation view, and control the output module 3 according to the second specifying input of the user, so as to output the celestial observation view in the output position specified by the user. In addition, the control module 4 may further control predetermined motion of the camera 2 and/or output of the output module 3 according to the third specifying input of the user, so that the celestial observation view output in the vehicle includes the celestial object category specified by the user. As an example, the celestial object category to be observed includes the moon, the starry sky, constellations, stars (satellites, comets, fixed stars, or planets), aurora, or a combination thereof.
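The following sketch illustrates, under assumed names only, how the three specifying inputs might be mapped onto control actions for the control module, the output module, and the camera.

```python
# Illustrative mapping of the three specifying inputs to control actions.
# The field values and handler names are assumptions for this sketch only.
from dataclasses import dataclass

@dataclass
class UserSelection:
    output_mode: str       # "live" or "non-live"            (first specifying input)
    output_position: str   # e.g. "ceiling", "rear_tablet"   (second specifying input)
    target_category: str   # e.g. "moon", "constellation"    (third specifying input)

def apply_selection(sel, control, output, camera):
    control.set_mode(sel.output_mode)                          # first input: live / non-live
    output.set_region(sel.output_position)                     # second input: where to project
    if sel.output_mode == "live":
        control.point_camera_at_category(camera, sel.target_category)  # third input: aim camera
    else:
        output.select_source_for_category(sel.target_category)         # third input: pick source
```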
The control module 4 may also acquire the motion status of the vehicle from the vehicle motion determination module 13, and thereby control enabling permissions of different output modes or recommend a suitable output mode. For example, the control module 4 is configured to prohibit the celestial observation view from being output in the vehicle or allow the celestial observation view to be output only in the non-live mode when the travel speed of the vehicle is greater than 30 km/h.
The weather acquisition module 15 is configured to acquire weather information for the location where the vehicle is located; the weather information includes not only weather conditions (e.g., clear, rainy, cloudy, etc.) but also visibility information (e.g., a pollution level). In this way, having learned which candidate celestial objects can theoretically be observed at the current location of the vehicle, the control module 4 may determine, in combination with the weather information, whether these celestial objects can actually be observed from the current location of the vehicle. For example, although some celestial objects are theoretically visible to an observer in the region where the vehicle is located, they cannot be identified in the sky image captured in real time if the weather conditions do not satisfy a preset requirement. In that case, for example, the celestial observation view is allowed to be output only in the non-live mode.
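A possible gating rule combining the speed-based check described above with this weather check is sketched below; the 30 km/h threshold comes from the description, while the weather categories and the configuration flag are simplifying assumptions.

```python
# Sketch of the speed- and weather-based gating described above; the 30 km/h
# threshold comes from the description, the weather categories are assumptions.
SPEED_THRESHOLD_KMH = 30.0
CLEAR_WEATHER = {"clear"}

def allowed_output_modes(speed_kmh, weather, visibility_ok, prohibit_when_moving=False):
    """Return the output modes that may currently be offered to the user."""
    if speed_kmh > SPEED_THRESHOLD_KMH:
        return [] if prohibit_when_moving else ["non-live"]
    if weather not in CLEAR_WEATHER or not visibility_ok:
        return ["non-live"]          # target not reliably identifiable in a live image
    return ["live", "non-live"]

print(allowed_output_modes(speed_kmh=45.0, weather="clear", visibility_ok=True))  # -> ['non-live']
```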
The person monitoring module 16 includes, for example, one or more in-vehicle cameras arranged in the vehicle cabin, and is configured to monitor physical information of the passengers, including, for example, age, gender, and mood. Moreover, the person monitoring module 16 may further include a seat occupancy sensor to acquire the distribution of passengers in the vehicle. With this information, the control module 4 can recommend an output mode of the celestial observation view, and an output position and/or an output effect of the celestial observation view in the vehicle, in a personalized manner. For example, if the vehicle passengers include a child, a spoken commentary of the celestial observation view is enabled automatically, and rendering processing is performed on the real sky image to be output so as to enrich the animation effect. If the vehicle is traveling and, apart from the driver, passengers are present only in the rear row, the celestial observation view may be projected only in the rear row of the vehicle.
The vehicle function linkage module 17 is configured to acquire the activated state of at least one predefined function of the vehicle, and this activated state is also provided to the control module 4, so that the control module 4 can control the output of the celestial observation view in association with the predefined function. Specifically, the control module 4 may automatically trigger the output of the celestial observation view in response to activation of a predefined function of the vehicle. For example, a plurality of atmosphere functions (e.g., a proposal function, an anniversary reminder function, and a festival atmosphere function) are preset in the vehicle. When one of these preset functions is activated by the user, the control module 4 may automatically project the celestial observation view in the vehicle cabin in order to further create a romantic atmosphere, immersing the passenger in a sea of stars or in moonlight.
In addition, the control module 4 may also use the communication interface 5 to upload the sky image captured via the camera, together with additional information such as geographical coordinates, time, date, and weather, transmit the same to a mobile terminal device of the vehicle user, and/or share the same with another vehicle. In addition, the control module 4 may also receive a celestial observation sharing request from outside the vehicle via the communication interface 5, and generate, in response to reception of such a sharing request, a prompt for enabling the non-live mode or a prompt for switching from the live mode to the non-live mode. For example, a friend of the vehicle user initiates real-time sharing via social media, so as to share a starry sky image captured during a trip abroad. The vehicle user, however, is viewing the local night sky in the live mode, so the system pushes a prompt to the user: "XX has requested to view the moon in real time. Switch to the non-live mode?". If a positive response is received from the vehicle user, the moon image shared by the friend in real time may be output in the vehicle.
Referring to FIG. 1, the control module 4 may further control the output module 3 and at least one vehicle cabin component 31, 32, 33 of the vehicle in a coordinated manner, so that a status change of the at least one vehicle cabin component is triggered in temporal association with the output of the celestial observation view. The at least one vehicle cabin component includes a seat 31, an audio system 32, ambient lighting 33, and/or an air conditioner of the vehicle. For example, in synchronization with or in temporal succession with the output of the celestial observation view, the control module 4 may control the vehicle seat to move backwards/forwards as a whole, and cause the backrest of the vehicle seat to pivot backwards to a preset position, so that the passenger enjoys celestial observation in a more comfortable state. As another example, during the output of the celestial observation view, the immersive feel in celestial observation may also be improved by decreasing brightness of the ambient lighting in the vehicle cabin or changing the hue thereof, and by changing the direction and air volume of output air of the air conditioner, and controlling the audio system to play soothing music.
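One way such coordinated control could be orchestrated is sketched below; the component interfaces and parameter values are illustrative placeholders, and the seat recline is skipped while the vehicle is moving, consistent with the scenario of FIG. 2c.

```python
# Illustrative orchestration of the coordinated cabin control; the component
# objects and parameter values are placeholders, not a disclosed interface.
def start_observation_scene(output, seat, lighting, audio, hvac, view, vehicle_moving):
    output.show(view)                                 # output the celestial observation view
    if not vehicle_moving:
        seat.recline(angle_deg=30)                    # recline only when parked (cf. FIG. 2c)
    lighting.set(brightness=0.2, hue="deep_blue")     # dim, cool ambient lighting
    hvac.set(airflow="gentle", direction="diffuse")   # softer air-conditioner output
    audio.play("soothing_playlist")                   # soothing background music
```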
FIGs. 2a-2c show schematic diagrams of outputting a celestial observation view in a vehicle in exemplary scenarios.
Referring to FIGs. 2a-2c, the camera 2 is movably mounted on the roof of the vehicle 100. Driven by the motion execution unit, the camera 2 can not only move vertically along the height direction of the vehicle and horizontally, but can also, for example, rotate, within a preset angle, about each of the x, y, and z axes shown by way of example in FIG. 2a.
In the scenario shown in FIG. 2a, a celestial observation function has not been triggered in the vehicle 100. In this case, the camera 2 is, for example, held in an initial orientation relative to the vehicle 100, and correspondingly, the direction of a viewing angle range 210 of the camera 2 in the initial orientation is also shown. In this case, the camera 2 is, for example, in a standby or dormant state, and the position and the backrest of the seat 31 of the vehicle user 51 are both in a state originally set by the user.
In the scenario shown in FIG. 2b, the celestial observation function is triggered when the vehicle 100 is in a stopped state. It should be noted that the celestial observation function may be triggered by the vehicle user 51, by the system (e.g., in a linked manner on the basis of the activated state of another predefined function of the vehicle), or automatically when a vehicle configuration satisfies a preset requirement. In this case, for example, it is learned that the vehicle user 51 expects to view, in the vehicle cabin, the shape of the moon that night, so the control module (not shown for simplicity) calculates, with reference to the positional information of the vehicle, the positional relationship of the moon 300 relative to the vehicle, and thereby controls the camera 2 to adjust from the initial orientation shown in FIG. 2a to the target orientation shown in FIG. 2b. When the camera 2 is in the initial orientation, its viewing angle range 210 remains substantially parallel to the vehicle roof, and the moon 300 to be observed does not fall into this viewing angle range 210. When the camera 2 is in the target orientation relative to the vehicle 100, its viewing angle range 210 is aimed at the region where the moon 300 is located, so that the moon 300 can be captured. In an example, the control module is configured not to turn on the camera 2 before it has reached the target orientation, and to turn on the camera 2 and control it to capture only after it reaches the target orientation. After a sky image is captured via the camera 2, the sky image may be projected onto a specific region 301 of the vehicle ceiling via the output module 3. In addition, after the celestial observation function is triggered, the seat 31 of the vehicle 100 may also be controlled in a coordinated manner, causing the backrest of the seat 31 to pivot backwards by a preset angle, so that the vehicle user 51 can be held in a comfortable attitude to view the shape of the moon.
In the scenario shown in FIG. 2c, the celestial observation function is triggered when the vehicle 100 is in a moving state. The travel direction changes constantly while the vehicle is traveling, so the positional relationship of the moon 300 relative to the vehicle 100 changes accordingly. Such a change in the relative positional relationship is shown in FIG. 2c via, for example, a dotted line 300'. In this case, the control module additionally adjusts the orientation of the camera 2 according to the travel speed, the travel direction, and the pre-planned travel route of the vehicle, so as to dynamically adapt its viewing angle range 210 to the change in the positional relationship of the celestial object 300 to be observed relative to the vehicle 100. It can also be seen from FIG. 2c that, because the vehicle 100 is in motion, for the sake of safety the backrest of the seat 31 of the vehicle user 51 (e.g., the driver) in the front row is not lowered; instead, only the angle of inclination of the backrest of the seat 31' of the vehicle user 52 in the rear row is adjusted. In addition, the projection region 302 of the celestial observation view in the vehicle 100 is adjusted via the output module 3, so that the projection position is as close as possible to the rear portion of the vehicle ceiling and safe driving by the vehicle driver 51 is not affected.
FIGs. 3a-3f show schematic diagrams of an interface of a celestial observation view output in a vehicle in the live mode and the non-live mode.
In FIG. 3a and FIG. 3b, a sky image captured by the camera in real time is directly output as the celestial observation view. In FIG. 3a, the vehicle user specifies that the shape of the moon 301 is to be observed in the vehicle, so the control module controls the camera to capture the moon 301 in the night sky, and a sky image including the moon 301 is displayed in an interface 41. In FIG. 3b, the vehicle user specifies that the stars 302 constituting the Cancer constellation are to be observed in the vehicle, so a real-time sky image including the stars 302 of the Cancer constellation is shown in the interface 41. To highlight the celestial objects 301, 302 that the user expects to observe, the control module, for example, acquires a local segment from the sky image captured by the camera, and only this local segment is displayed on the interface 41. In the local segment, the celestial objects 301, 302 to be observed are, for example, located in the center of the image, and their image ratio exceeds a preset value.
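A minimal sketch of extracting such a local segment, so that the detected object is centred and its image ratio stays at or above a preset value, might look as follows; the bounding-box format and the square-crop simplification are assumptions of the sketch.

```python
# Minimal sketch of extracting a local segment so the detected object is
# centred and occupies at least min_ratio of the crop; box format is assumed.
def crop_around(img_w, img_h, box, min_ratio=0.2):
    """box = (x, y, w, h) of the detected celestial object in pixels.
    Returns (cx0, cy0, cw, ch) of a square crop centred on the object."""
    x, y, w, h = box
    side = int(((w * h) / min_ratio) ** 0.5)      # crop area = object area / min_ratio
    cx, cy = x + w // 2, y + h // 2               # object centre
    cx0 = max(0, min(img_w - side, cx - side // 2))
    cy0 = max(0, min(img_h - side, cy - side // 2))
    return cx0, cy0, min(side, img_w), min(side, img_h)

print(crop_around(1920, 1080, box=(900, 400, 120, 120), min_ratio=0.25))  # -> (840, 340, 240, 240)
```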
In FIG. 3c and FIG. 3d, the celestial observation view is also output in the live mode. However, in this case, the originally captured sky image is not output directly; instead, the sky image is preprocessed and then output. As shown in FIG. 3c, a geometric figure is superimposed on the moon 301 in the original image, so that the shape of the moon can be identified more clearly. In FIG. 3d, in addition to the original sky image including the stars 302 of the Cancer constellation being output on the interface 41, the individual stars are connected one by one by connecting lines 303 so as to draw the outline of the constellation, and the corresponding constellation name is marked nearby via text 304.
In general, the control module may annotate the celestial object in the celestial observation view in the following aspects:
- a celestial object category, e.g., the moon, the starry sky, constellations, stars (satellites, comets, fixed stars, or planets), and aurora;
- a celestial object name (e.g., the moon, Polaris, etc.);
- a celestial object outline or a connecting line;
- a moon phase shape (new moon, first quarter, full moon, etc.);
- a constellation name; and
- an introduction to a constellation.
In FIG. 3e and FIG. 3f, a celestial observation view of the vehicle is shown in the non-live mode. In the scenario shown in FIG. 3e, the target celestial object can no longer be observed in the live mode due to occlusion by cloud and fog, so switching to the non-live mode is performed. In turn, a moon phase diagram 305 derived from the lunar calendar is shown in the interface 41, accompanied by an audio commentary. In FIG. 3f, the shapes and popular science introductions of a plurality of constellations 306 are shown in the interface 41.
FIGs. 4a-4d show schematic diagrams showing that a user changes an output effect of a celestial observation view via gesture interaction.
Referring to FIG. 4a, the user performs a gesture 61 on the celestial observation view shown in the interface 41, so as to select a celestial region that the user would like to learn more about. In this case, the user performs the gesture 61 by, for example, pointing the index finger at a specific region. The user input module detects the gesture 61 of the user, and the control module then interprets the corresponding intent. In response to identifying the user's "select" intent, the control module changes the output effect of the celestial observation view, so that a celestial object
corresponding to the selected region is highlighted and annotated with introductory information. This is correspondingly shown in FIG. 4b.
Referring to FIG. 4c, the user performs a gesture 62 on the celestial observation view shown in the interface 41. In this case, the user performs the gesture 62 by pinching two fingers together. Likewise, the control module interprets the user's "zoom out" intent, and therefore controls the motion or focal length adjustment of the camera so as to perform a zooming operation on the image captured in real time. This is correspondingly shown in FIG. 4d. In addition, a rotation, translation, or zooming operation corresponding to an image change intent of the user may also be performed on the celestial observation view to be output, by appropriately controlling the output unit.
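The gesture handling described with reference to FIGs. 4a-4d could, for example, be organised as a lookup table from recognised gesture types to image-change operations, as in the sketch below; the gesture labels and controller method names are assumptions of the sketch.

```python
# Sketch of a lookup table from recognised gestures to image-change operations;
# gesture labels and controller method names are assumptions of the sketch.
GESTURE_ACTIONS = {
    "point":     lambda ctl, g: ctl.show_info(region=g["region"]),   # FIG. 4a/4b: select and annotate
    "pinch_in":  lambda ctl, g: ctl.zoom(factor=0.8),                # FIG. 4c/4d: zoom out
    "pinch_out": lambda ctl, g: ctl.zoom(factor=1.25),               # zoom in
    "swipe":     lambda ctl, g: ctl.pan(dx=g["dx"], dy=g["dy"]),     # translate the view
    "rotate":    lambda ctl, g: ctl.rotate(angle_deg=g["angle"]),    # rotate the view
}

def handle_gesture(controller, gesture):
    """gesture is a dict such as {"type": "pinch_in"} produced by the user input module."""
    action = GESTURE_ACTIONS.get(gesture["type"])
    if action is not None:
        action(controller, gesture)
```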
FIG. 5 shows a schematic diagram of an interface of the user input module according to an exemplary embodiment of the present invention.
A plurality of input options in the form of virtual keys 111, 112, 113, 114, and 115 are shown in an interface 110 of the user input module 11. By operating these virtual keys, the vehicle user can activate the celestial observation function in the vehicle, and customize an output effect liked thereby.
Operating the key 111 can perform selection between the “non-live mode” and the “live mode”, thereby determining an image source of the celestial observation view output in the vehicle.
Operating the key 112 can select a celestial object category to be observed from the options "observe the moon", "observe the starry sky", "observe a constellation", and "observe aurora". In the case of the "observe the starry sky" option, for example, all stars that can be observed at the current location of the vehicle are selected as celestial objects to be observed. In the case of the "observe a constellation" option, the plurality of star clusters constituting a specific complete constellation are selected as celestial objects to be observed.
Operating the key 113 can select an expected output effect from the "scene viewing", "atmosphere", and "popular science" options. For example, if the user selects the "scene viewing" option, only the originally captured sky image may be output. If the user selects the "atmosphere" option, rendering or virtualization processing may be performed on the originally captured sky image; alternatively, if the user has previously selected the "non-live mode", a starry atmosphere image may be downloaded via the communication interface. If the user selects the "popular science" option, then regardless of the output mode, annotation processing can be performed on several celestial objects in the celestial observation view, and an audio or text introduction to the celestial object to be observed can be provided.
Operating the key 114 can select, from “vehicle ceiling”, “HUD”, and “left vehicle glass” options, an expected projection position of the celestial observation view output in the vehicle.
Operating the key 115 can control enabling, suspension, and exiting of the celestial observation function in the vehicle. If the user does not select the key 111, 112, 113, or 114 to perform personalization, but directly presses an “enable” option in the key 115 instead, then the output mode, the celestial object to be observed, the output effect, and the output position may be selected according to a preset configuration of the system. In addition, as described in detail with reference to FIG. 1, the output mode, the output effect, and the output position may also be intelligently recommended to the user according to a variety of factors such as a weather condition, a vehicle motion status, an image recognition result, a person monitoring result, etc.
FIG. 6 shows a flowchart of a celestial observation method for a vehicle according to an exemplary embodiment of the present invention. In the embodiment shown in FIG. 6, the method includes, for example, steps S01-S60, and may be implemented, for example, in the case that the celestial observation system 1 shown in FIG. 1 is used.
In optional step S01, acquiring an output mode for outputting a celestial observation view in a vehicle. Optionally, in this step, an output effect and an output position of the celestial observation view, and a category of a celestial object to be observed, may also be acquired according to a user input.
In the following step, controlling, at least in a live mode and a non-live mode, output of the celestial observation view in the vehicle. In the live mode, an output module is caused to output the celestial observation view in the vehicle on the basis of a sky image captured via a camera in real time, and in the non-live mode, the output module is caused to output the celestial observation view in the vehicle on the basis of another source image, the other source image being different from the sky image captured via the camera in real time.
In steps S10 to S60, the live mode is used as an example to introduce a process of outputting the celestial observation view in the vehicle.
In step S10, acquiring positional information of the vehicle. In an example, a travel speed, a travel direction, and a pre-planned travel route of the vehicle may also be acquired in this step.
In step S20, determining, according to the positional information of the vehicle, a positional relationship of at least one celestial object to be observed relative to the vehicle.
In step S30, adjusting an orientation of a camera relative to the vehicle according to the positional relationship, so that the at least one celestial object is within a viewing angle range of the camera. For example, a rotation angle of the camera may be calculated, and the camera is controlled to rotate by the angle to cause the viewing angle range to be aimed at a celestial object to be observed.
In step S40, capturing a sky image outside the vehicle via the camera. For example, only after the camera is set in a target orientation relative to the vehicle, the camera is turned on, and the sky image is captured. It is also possible that the camera is always in an activated state, and captures sky images continuously.
In optional step S50, checking whether the at least one celestial object to be observed can be identified in the sky image captured via the camera. In the case that the at least one celestial object to be observed is not identified, steps S20 to S40 may be repeated so as to re-adjust the orientation of the camera relative to the vehicle according to the positional information of the vehicle.
In the case that the at least one celestial object to be observed is identified, in step S60, outputting a celestial observation view in the vehicle on the basis of the sky image captured via the camera. For example, the celestial observation view may be output to a region specified by a user.
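Taken together, steps S10 to S60 can be summarised as a single control loop, sketched below with placeholder module objects standing in for the components of FIG. 1; the names and the attempt limit are assumptions of the sketch.

```python
# Hedged end-to-end sketch of steps S10-S60; the positioning, control, camera,
# recognizer and output objects are placeholders for the modules of FIG. 1.
def observation_loop(positioning, control, camera, recognizer, output, target,
                     max_attempts=3):
    for _ in range(max_attempts):
        pose = positioning.read()                        # S10: vehicle position (and motion)
        direction = control.locate(target, pose)         # S20: object relative to the vehicle
        control.orient_camera(camera, direction, pose)   # S30: aim the viewing angle range
        frame = camera.capture()                         # S40: capture the sky image
        if recognizer.contains(frame, target):           # S50: optional identification check
            output.show(frame)                           # S60: output the observation view
            return True
    return False                                         # caller may fall back to the non-live mode
```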
Although particular embodiments of the present invention are described in detail herein, the particular embodiments are merely for the purpose of explanation, and should not be considered to limit the scope of the present invention. Various substitutions, alterations, and modifications may be conceived without departing from the spirit and scope of the present invention.
Claims
1. A celestial observation system (1) for a vehicle, comprising: a camera (2), movably mounted on the vehicle (100) and configured to capture a sky image outside the vehicle (100); a vehicle positioning module (12), configured to acquire positional information of the vehicle (100); a control module (4), configured to determine, according to the positional information of the vehicle (100), a positional relationship of at least one celestial object to be observed relative to the vehicle (100), and adjust an orientation of the camera (2) relative to the vehicle (100) according to the positional relationship, so that the at least one celestial object (300) is within a viewing angle range (210) of the camera (2); and an output module (3), configured to output a celestial observation view in the vehicle (100) at least on the basis of the sky image captured via the camera (2).
2. The celestial observation system (1) according to claim 1, wherein the celestial observation system (1) further comprises a vehicle motion determination module (13) configured to acquire a travel direction and/or a travel speed of the vehicle (100); wherein the control module (4) is further configured to determine a dynamic change in the positional relationship of the at least one celestial object (300) relative to the vehicle (100) according to the travel direction and/or the travel speed of the vehicle (100), and additionally adjust the orientation of the camera (2) relative to the vehicle (100) according to the dynamic change in the positional relationship.
3. The celestial observation system (1) according to claim 1 or 2, wherein the celestial observation system (1) further comprises an image recognition module (21) configured to identify the at least one celestial object (300) to be observed in the sky image captured via the camera (2);
wherein the control module (4) is further configured to adjust the orientation of the camera (2) relative to the vehicle (100) according to an identification result of the image recognition module (21) so as to cause the camera (2) to track and capture the identified at least one celestial object (300).
4. The celestial observation system (1) according to any one of claims 1 to 3, wherein the celestial observation system (1) further comprises an image recognition module (21) configured to identify the at least one celestial object (300) to be observed in the sky image captured via the camera (2); wherein the control module (4) is further configured to:
- re-adjust the orientation of the camera (2) relative to the vehicle (100) in the case that the at least one celestial object (300) to be observed is not identified, and/or
- output a celestial observation view in the vehicle (100) on the basis of another source image in the case that the at least one celestial object (300) to be observed is not identified within a predetermined number of times, the other source image being different from the sky image captured via the camera (2) in real time.
5. The celestial observation system (1) according to any one of claims 1 to 4, wherein the control module (4) is further configured to acquire a local segment from the sky image captured via the camera (2), so that the local segment comprises only the at least one celestial object (300) to be observed, and/or an image ratio of the at least one celestial object (300) to be observed in the local segment exceeds a preset value; and/or the output module (3) is further configured to output a celestial observation view on the basis of a local segment acquired from the sky image.
6. The celestial observation system (1) according to any one of claims 1 to 5, wherein
the control module (4) is further configured to control the output module (3) to output the celestial observation view according to a determined output effect, wherein the outputting the celestial observation view according to the determined output effect comprises:
- directly outputting, as the celestial observation view, the sky image captured via the camera (2);
- annotating, with respect to the shape and/or the category, the at least one celestial object (300) in the sky image captured via the camera (2), and outputting the annotated sky image as the celestial observation view; and/or
- outputting introductory information in the form of a text and/or in the form of a speech about the at least one celestial object (300) to be observed in synchronization with output of the sky image.
7. The celestial observation system (1) according to any one of claims 1 to 6, wherein the control module (4) is further configured to control, at least in a live mode and a non-live mode, the output of the celestial observation view in the vehicle (100), wherein in the live mode, the output module (3) is caused to output the celestial observation view in the vehicle (100) on the basis of the sky image captured via the camera (2) in real time, and in the non-live mode, the output module (3) is caused to output the celestial observation view in the vehicle (100) on the basis of another source image, the other source image being different from the sky image captured via the camera (2) in real time.
8. The celestial observation system (1) according to any one of claims 1 to 7, wherein in the non-live mode, the other source image comprises: a sky image captured or recorded via the camera (2) in advance; and/or a sky image, a starry sky atmosphere image, and/or a celestial popular science image stored locally on the vehicle (100) and/or received from the outside of the vehicle (100).
9. The celestial observation system (1) according to claim 7 or 8, wherein
the control module (4) is further configured to control, according to a motion status of the vehicle (100), the output of the celestial observation view by the output module (3), wherein the control module (4) is configured to prohibit the celestial observation view from being output in the vehicle (100) or allow the celestial observation view to be output only in the non-live mode when the travel speed of the vehicle (100) is greater than a threshold.
10. The celestial observation system (1) according to any one of claims 7 to 9, wherein the control module (4) is further configured to control, according to a weather condition, the output of the celestial observation view by the output module (3), wherein the control module (4) is configured to allow the celestial observation view to be output only in the non-live mode when the weather condition does not satisfy a preset requirement.
11. The celestial observation system (1) according to any one of claims 7 to 10, wherein the control module (4) is further configured to recommend an output mode of the celestial observation view, and an output position and/or an output effect of the celestial observation view in the vehicle (100), in a personalized manner according to identity information of a vehicle user (51).
12. The celestial observation system (1) according to any one of claims 1 to 11, wherein the control module (4) is further configured to control the output module (3) and at least one vehicle cabin component (31) of the vehicle (100) in a coordinated manner, so that a status change of the at least one vehicle cabin component (31) is triggered in temporal association with the output of the celestial observation view, the at least one vehicle cabin component (31) comprising a seat, ambient lighting, an audio system, and/or an air conditioner of the vehicle (100).
13. The celestial observation system (1) according to any one of claims 1 to 12, wherein the celestial observation system (1) further comprises a user input module (11) configured to receive:
- a first specifying input of the vehicle user (51) for the output mode of the celestial observation view,
- a second specifying input of the vehicle user (51) for the output position of the celestial observation view in the vehicle (100), and/or
- a third specifying input of the vehicle user (51) for a celestial object category to be observed; wherein the control module (4) is further configured to:
- select, according to the first specifying input of the vehicle user (51), the live mode or the non-live mode to output the celestial observation view,
- control the output module (3) according to the second specifying input of the vehicle user (51), so as to output the celestial observation view in the output position specified by the vehicle user (51), and/or
- control the orientation of the camera (2) relative to the vehicle (100) and/or output of the output module (3) according to the third specifying input of the vehicle user (51), so that the celestial observation view output in the vehicle (100) comprises the celestial object category specified by the vehicle user (51).
14. The celestial observation system (1) according to any one of claims 1 to 13, wherein the celestial observation system (1) further comprises a user input module (11) configured to receive a gesture input (61) of the vehicle user (51) for the celestial observation view already output in the vehicle (100), wherein the control module (4) is further configured to:
- identify, according to the gesture input (61) of the vehicle user (51), a selected region of the vehicle user (51) in the celestial observation view, and control the output module (3) to output introductory information of a celestial object corresponding to the selected region; and/or
- identify an image change intent according to the gesture input (61) of the vehicle user (51), and control motion of the camera (2) and/or output of the output module (3), so
that a rotation, translation, and/or zooming operation corresponding to the image change intent is performed on the output celestial observation view.
15. The celestial observation system (1) according to any one of claims 1 to 14, further comprising a communication interface (5) configured to upload the sky image captured via the camera (2) to the cloud, transmit the same to a mobile terminal device of the vehicle user (51), and/or share the same with another vehicle.
16. A celestial observation method for a vehicle (100), comprising the following steps:
- acquiring positional information of the vehicle (100);
- determining, according to the positional information of the vehicle (100), a positional relationship of at least one celestial object (300) to be observed relative to the vehicle (100), and adjusting an orientation of a camera (2) relative to the vehicle (100) according to the positional relationship, so that the at least one celestial object (300) is within a viewing angle range (210) of the camera (2);
- capturing a sky image outside the vehicle (100) via the camera (2); and
- outputting a celestial observation view in the vehicle (100) at least on the basis of the sky image captured via the camera (2).
17. The celestial observation method according to claim 16, further comprising the following step: controlling, at least in a live mode and a non-live mode, the output of the celestial observation view in the vehicle (100), wherein, in the live mode, an output module (3) is caused to output the celestial observation view in the vehicle (100) on the basis of the sky image captured via the camera (2) in real time, and, in the non-live mode, the output module (3) is caused to output the celestial observation view in the vehicle (100) on the basis of another source image, the other source image being different from the sky image captured via the camera (2) in real time.
18. A machine-readable storage medium storing a computer program which, when run on a computer, performs the celestial observation method according to claim 16 or 17.
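Claim 10 gates the live output on a weather condition. Below is a minimal Python sketch of such gating, assuming an illustrative WeatherReport structure and a cloud-cover threshold as the "preset requirement" (neither is fixed by the claims):

```python
from dataclasses import dataclass
from enum import Enum, auto

class OutputMode(Enum):
    LIVE = auto()       # view rendered from the real-time camera image
    NON_LIVE = auto()   # view rendered from another source image

@dataclass
class WeatherReport:
    cloud_cover: float      # 0.0 (clear sky) .. 1.0 (overcast)
    precipitation: bool

def select_output_mode(requested: OutputMode, weather: WeatherReport,
                       max_cloud_cover: float = 0.6) -> OutputMode:
    """Allow the live celestial view only when the sky is actually observable."""
    observable = (not weather.precipitation) and weather.cloud_cover <= max_cloud_cover
    if requested is OutputMode.LIVE and not observable:
        return OutputMode.NON_LIVE   # fall back to stored or simulated imagery
    return requested

# Example: a rainy night forces the non-live mode even if live output was requested.
print(select_output_mode(OutputMode.LIVE, WeatherReport(cloud_cover=0.9, precipitation=True)))
```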
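Claim 12 ties cabin status changes to the start of the celestial view. The sketch below schedules seat, lighting, audio and climate actions relative to the moment the view is output; the callback-based interface is purely illustrative and stands in for the vehicle's real cabin APIs:

```python
import time
from typing import Callable

CabinAction = Callable[[], None]

def start_celestial_scene(show_view: CabinAction,
                          cabin_actions: list[tuple[float, CabinAction]]) -> None:
    """Trigger cabin status changes in temporal association with the view output."""
    show_view()
    t0 = time.monotonic()
    for delay_s, action in sorted(cabin_actions, key=lambda item: item[0]):
        time.sleep(max(0.0, delay_s - (time.monotonic() - t0)))
        action()

# Hypothetical scene: dim the lights, recline the seat, then start ambient audio.
start_celestial_scene(
    show_view=lambda: print("celestial observation view shown on roof display"),
    cabin_actions=[
        (0.0, lambda: print("dim ambient lighting")),
        (1.0, lambda: print("recline observation seat")),
        (2.0, lambda: print("start ambient audio, lower air-conditioner fan")),
    ],
)
```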
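Claim 13 routes three kinds of specifying inputs (output mode, output position, celestial object category) to the control module. A compact sketch of that routing follows; the field values and the controller's print statements are assumptions standing in for the claimed mode switching, display routing and camera re-orientation:

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class SpecifyingInput:
    mode: Optional[str] = None        # "live" or "non_live"
    position: Optional[str] = None    # e.g. "roof_display" or "centre_screen"
    category: Optional[str] = None    # e.g. "moon", "planet", "constellation"

class ObservationController:
    def apply(self, spec: SpecifyingInput) -> None:
        """Apply whichever specifying inputs the vehicle user has provided."""
        if spec.mode is not None:
            print(f"switching output mode to: {spec.mode}")
        if spec.position is not None:
            print(f"routing the celestial view to: {spec.position}")
        if spec.category is not None:
            print(f"re-orienting camera/view to show: {spec.category}")

ObservationController().apply(SpecifyingInput(mode="live", category="moon"))
```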
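Claim 14 maps gestures either to a region lookup with introductory information or to rotation/translation/zoom of the view. The sketch below uses an assumed gesture model and a two-entry object catalogue; a real system would resolve regions against the current sky solution rather than screen quadrants:

```python
from dataclasses import dataclass

@dataclass
class Gesture:
    kind: str               # "tap", "pinch" or "drag"
    x: float = 0.0          # normalised view coordinates, 0..1
    y: float = 0.0
    scale: float = 1.0      # pinch zoom factor
    dx: float = 0.0         # drag delta, in view widths
    dy: float = 0.0         # drag delta, in view heights

CATALOGUE = {               # region -> introductory information (illustrative only)
    "upper_left": "Ursa Major: the Big Dipper asterism points towards Polaris.",
    "upper_right": "Cassiopeia: a W-shaped circumpolar constellation.",
}

def handle_gesture(g: Gesture) -> dict:
    """Turn a user gesture into a view command or an information request."""
    if g.kind == "tap":
        region = ("upper_" if g.y < 0.5 else "lower_") + ("left" if g.x < 0.5 else "right")
        return {"action": "show_info",
                "text": CATALOGUE.get(region, "No information for this region.")}
    if g.kind == "pinch":
        return {"action": "zoom", "factor": g.scale}
    if g.kind == "drag":    # pan the camera/view proportionally to the drag
        return {"action": "pan", "pan_deg": 60.0 * g.dx, "tilt_deg": 40.0 * g.dy}
    return {"action": "ignore"}

print(handle_gesture(Gesture(kind="tap", x=0.2, y=0.3)))
```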
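The core of the method of claim 16 is turning the vehicle position into a camera orientation that keeps the target within the viewing angle range. The self-contained sketch below illustrates that step using the standard equatorial-to-horizontal conversion and an approximate sidereal time; the CelestialTarget type, the pan/tilt convention and the Polaris example are assumptions made for illustration only:

```python
import math
from dataclasses import dataclass
from datetime import datetime, timezone

@dataclass
class CelestialTarget:
    name: str
    ra_deg: float    # right ascension (J2000), degrees
    dec_deg: float   # declination (J2000), degrees

def _local_sidereal_time_deg(t_utc: datetime, lon_deg: float) -> float:
    """Approximate local sidereal time in degrees (sufficient for camera pointing)."""
    j2000 = datetime(2000, 1, 1, 12, tzinfo=timezone.utc)
    days = (t_utc - j2000).total_seconds() / 86400.0
    gmst = 280.46061837 + 360.98564736629 * days
    return (gmst + lon_deg) % 360.0

def target_alt_az(target: CelestialTarget, lat_deg: float, lon_deg: float,
                  t_utc: datetime) -> tuple[float, float]:
    """Return (altitude, azimuth) of the target in degrees for the vehicle position."""
    lst = _local_sidereal_time_deg(t_utc, lon_deg)
    ha = math.radians((lst - target.ra_deg) % 360.0)   # hour angle
    lat = math.radians(lat_deg)
    dec = math.radians(target.dec_deg)
    alt = math.asin(math.sin(lat) * math.sin(dec) +
                    math.cos(lat) * math.cos(dec) * math.cos(ha))
    az = math.atan2(-math.cos(dec) * math.sin(ha),
                    math.sin(dec) * math.cos(lat) -
                    math.cos(dec) * math.sin(lat) * math.cos(ha))
    return math.degrees(alt), math.degrees(az) % 360.0

def camera_pan_tilt(target: CelestialTarget, lat_deg: float, lon_deg: float,
                    heading_deg: float, t_utc: datetime) -> tuple[float, float]:
    """Convert the sky position into a pan/tilt command relative to the vehicle axis."""
    alt, az = target_alt_az(target, lat_deg, lon_deg, t_utc)
    pan = (az - heading_deg) % 360.0   # camera yaw relative to the vehicle heading
    tilt = alt                         # camera elevation above the horizon
    return pan, tilt

# Example: point the roof camera at Polaris from a northbound vehicle.
polaris = CelestialTarget("Polaris", ra_deg=37.95, dec_deg=89.26)
pan, tilt = camera_pan_tilt(polaris, lat_deg=48.1, lon_deg=11.6,
                            heading_deg=0.0, t_utc=datetime.now(timezone.utc))
print(f"pan={pan:.1f} deg, tilt={tilt:.1f} deg")
```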
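Claim 17 distinguishes a live mode (view built from the frame just captured by the camera) from a non-live mode (view built from another source image). A minimal sketch of that switch; FrameSource and the two provider classes are illustrative placeholders rather than parts of the claimed system:

```python
from typing import Protocol

class FrameSource(Protocol):
    def get_frame(self) -> bytes: ...

class RoofCamera:
    def get_frame(self) -> bytes:
        return b"<current sky image>"              # placeholder for a real capture call

class SkyArchive:
    def get_frame(self) -> bytes:
        return b"<stored or simulated sky image>"  # another source image

def render_view(live: bool, camera: FrameSource, other_source: FrameSource) -> bytes:
    """Pick the image source for the celestial observation view."""
    return camera.get_frame() if live else other_source.get_frame()

print(render_view(live=False, camera=RoofCamera(), other_source=SkyArchive()))
```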
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN202211259014.6 | 2022-10-14 | | |
CN202211259014.6A CN115520101A (en) | 2022-10-14 | 2022-10-14 | Celestial body observation system for vehicle and celestial body observation method for vehicle |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2024078740A1 (en) | 2024-04-18 |
Family
ID=84702351
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/EP2023/025427 WO2024078740A1 (en) | 2022-10-14 | 2023-10-10 | Celestial observation system for vehicle and celestial observation method for vehicle |
Country Status (2)
Country | Link |
---|---|
CN (1) | CN115520101A (en) |
WO (1) | WO2024078740A1 (en) |
2022
- 2022-10-14 CN CN202211259014.6A patent/CN115520101A/en active Pending
2023
- 2023-10-10 WO PCT/EP2023/025427 patent/WO2024078740A1/en unknown
Patent Citations (3)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2008064889A (en) * | 2006-09-05 | 2008-03-21 | Toyota Motor Corp | Astronomical observation device |
US20190369241A1 (en) * | 2018-06-05 | 2019-12-05 | Pony.ai, Inc. | Systems and methods for implementing a tracking camera system onboard an autonomous vehicle |
CN114290998A (en) * | 2021-12-30 | 2022-04-08 | 上海洛轲智能科技有限公司 | Skylight display control device, method and equipment |
Also Published As
Publication number | Publication date |
---|---|
CN115520101A (en) | 2022-12-27 |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
| 121 | Ep: the epo has been informed by wipo that ep was designated in this application | Ref document number: 23793235; Country of ref document: EP; Kind code of ref document: A1 |