WO2023197913A1 - Image processing method and related device - Google Patents
- Publication number
- WO2023197913A1 (PCT/CN2023/086213)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- electronic device
- route
- crosswalk
- user
- angle
- Prior art date
Classifications
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T7/00—Image analysis
- G06T7/70—Determining position or orientation of objects or cameras
-
- G—PHYSICS
- G08—SIGNALLING
- G08G—TRAFFIC CONTROL SYSTEMS
- G08G1/00—Traffic control systems for road vehicles
- G08G1/005—Traffic control systems for road vehicles including pedestrian guidance indicator
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04M—TELEPHONIC COMMUNICATION
- H04M1/00—Substation equipment, e.g. for use by subscribers
- H04M1/72—Mobile telephones; Cordless telephones, i.e. devices for establishing wireless links to base stations without route selection
- H04M1/724—User interfaces specially adapted for cordless or mobile telephones
- H04M1/72448—User interfaces specially adapted for cordless or mobile telephones with means for adapting the functionality of the device according to specific conditions
- H04M1/72454—User interfaces specially adapted for cordless or mobile telephones with means for adapting the functionality of the device according to specific conditions according to context-related or environment-related conditions
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04M—TELEPHONIC COMMUNICATION
- H04M2250/00—Details of telephonic subscriber devices
- H04M2250/52—Details of telephonic subscriber devices including functional features of a camera
Definitions
- the embodiments of the present application relate to the field of terminal technology, and in particular, to an image processing method and related equipment.
- visually impaired people mainly rely on white canes when traveling, which help them detect obstacles on the road. Because the cane mainly detects obstacles directly ahead, in scenarios with no or few obstacles, such as crosswalks, visually impaired people usually move based on their travel experience.
- visually impaired people generally have difficulty walking in a straight line because of a poor sense of direction. In scenarios such as crosswalks in particular, road conditions are complex, with motor vehicles and non-motor vehicles moving nearby, so relying on travel experience alone exposes visually impaired people to considerable safety risks when crossing.
- Embodiments of the present application provide an image processing method and related equipment.
- outputting the first indication information improves the user's safety when crossing the crosswalk.
- the first aspect of the embodiment of the present application provides an image processing method.
- the method is applied to an electronic device.
- the electronic device is used to obtain multiple frames of images from the user's front perspective.
- the method includes: if a crosswalk is detected in the multiple frames of images, determining a first route, which is a safe traveling route on the crosswalk; obtaining, based on the multiple frames of images, a second route that the user actually walks; and, if the angle between the first route and the second route in the same reference system is greater than or equal to a preset angle, outputting first indication information to the user, where the first indication information indicates that the second route is abnormal.
- the condition for outputting the first indication information may also be that the angle between the first route and the second route in the same reference system remains greater than or equal to the preset angle for a period of time. This prevents inaccurate judgments caused by measurement error or shaking in an individual frame and avoids outputting the first indication information too frequently.
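The deviation check described above can be sketched as follows. This is a minimal illustration, not the patent's actual implementation; the 15-degree preset angle and 10-frame window are assumed values chosen for the example.

```python
import math

def angle_between(route_a, route_b):
    # Angle in degrees between two 2-D direction vectors (routes projected
    # onto the ground plane, per the shared reference system above).
    ax, ay = route_a
    bx, by = route_b
    cos_theta = (ax * bx + ay * by) / (math.hypot(ax, ay) * math.hypot(bx, by))
    # Clamp to guard against floating-point drift outside [-1, 1].
    return math.degrees(math.acos(max(-1.0, min(1.0, cos_theta))))

def route_abnormal(angles, preset_angle=15.0, window=10):
    # Fire only if the deviation persists for `window` consecutive frames,
    # filtering out one-off measurement error or hand shake in a single frame.
    recent = angles[-window:]
    return len(recent) >= window and all(a >= preset_angle for a in recent)
```

A single noisy frame (one small angle inside the window) keeps `route_abnormal` False, which matches the "for a period of time" condition in the text.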
- the above method can be applied to scenarios where users pass through crosswalks, such as blind people crossing the street, deaf people crossing the street, etc.
- the reference system mentioned above may refer to a virtual reference system, a plane (such as the ground) reference system, a camera reference system of an electronic device, etc., and is not specifically limited here.
- the angle between the first route and the second route in the same reference system can be understood as the angle between the projection of the first route onto the ground and the second route.
- the first route can be understood as the safe movement route of the user/electronic device on the ground
- the second route can be understood as the actual movement route of the user/electronic device on the ground.
- the first indication information is output to the user to indicate that the user's actual traveling direction is abnormal and that the user is deviating from the crosswalk while moving; the output first indication information thus helps the user cross the crosswalk safely.
- the first posture in which the electronic device acquires the multiple frames of images includes a first pitch angle and a first tilt angle; the first pitch angle ranges from 60 to 100 degrees, and the first tilt angle ranges from 60 to 120 degrees.
- when the pose of the electronic device meets these conditions, the acquired multiple frames of images are reasonable, making the judgment between the first route and the second route more accurate.
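A posture gate along these lines might look like the sketch below; the function name and parameterization are illustrative, while the numeric intervals come from the text above.

```python
def pose_ok(pitch_deg, tilt_deg,
            pitch_range=(60.0, 100.0), tilt_range=(60.0, 120.0)):
    # True when the device posture lies inside the capture-friendly intervals
    # described above (first pitch angle 60-100°, first tilt angle 60-120°).
    return (pitch_range[0] <= pitch_deg <= pitch_range[1]
            and tilt_range[0] <= tilt_deg <= tilt_range[1])
```

When `pose_ok` is False, the device is in a "second posture" and the second indication information described next would prompt the user to re-aim it.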
- the method further includes: acquiring a second posture of the electronic device, where the second pitch angle in the second posture differs from the first pitch angle, or the second tilt angle in the second posture differs from the first tilt angle; and outputting second indication information, which prompts the user to adjust the second posture to the first posture.
- guided by the second indication information, the user adjusts the posture of the electronic device so that it meets the shooting conditions (the first posture or a preset posture interval), which makes the subsequently acquired images more reasonable and the judgment of the first route and the second route more accurate.
- the first route is the bisector of the angle formed by the extension directions of the two edges of the crosswalk, or the first route is parallel to an edge of the crosswalk. Further, the first route may be parallel to the short-side edge of the crosswalk (or zebra crossing) center line; or the extensions of the two short-side edges of the zebra crossing intersect to form an angle, and the direction of the angle bisector pointing toward the intersection is the first route; or the first route is perpendicular to the direction of the zebra-crossing stripes. This can be set according to actual needs and is not specifically limited here.
- this possible implementation provides multiple ways in which the first route can be determined.
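The angle-bisector variant above can be sketched in a few lines. This assumes the two edge extension directions have already been recovered as 2-D ground-plane vectors; the helper names are illustrative.

```python
import math

def _unit(v):
    # Normalize a 2-D vector to unit length.
    n = math.hypot(v[0], v[1])
    return (v[0] / n, v[1] / n)

def bisector_route(edge_dir_a, edge_dir_b):
    # First route as the bisector of the angle formed by the extension
    # directions of the two crosswalk edges: the normalized sum of the two
    # unit direction vectors points along the bisector.
    ua, ub = _unit(edge_dir_a), _unit(edge_dir_b)
    return _unit((ua[0] + ub[0], ua[1] + ub[1]))
```

For perpendicular edge directions (1, 0) and (0, 1), the bisector comes out as the 45-degree diagonal, as expected.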
- the electronic device determines that the area of the crosswalk in the multi-frame images gradually decreases, which ensures that, when the user is standing at the crosswalk, the device can determine whether the user actually intends to cross it, i.e., to walk on the zebra crossing.
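One plausible reading of this "gradually decreasing area" test is a monotonicity check over the per-frame crosswalk pixel areas; the sketch below is an assumption about how such a check could be realized, with a hypothetical five-frame minimum.

```python
def crosswalk_area_decreasing(areas, min_frames=5):
    # True if the crosswalk's detected pixel area shrinks strictly
    # monotonically over the most recent frames — a cue that the user is
    # actually moving across the crosswalk rather than standing beside it.
    recent = areas[-min_frames:]
    return (len(recent) >= min_frames
            and all(a > b for a, b in zip(recent, recent[1:])))
```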
- the moving route of the electronic device when collecting multiple frames of images is the second route, and the moving route is parallel to the ground where the crosswalk is located.
- the route parallel to the ground along which the electronic device moves while collecting the multiple frames of images is the second route; equivalently, the electronic device's movement in the dimension parallel to the ground constitutes the second route. This provides a way of determining the second route.
- the method further includes: if a new image acquired by the electronic device does not include a crosswalk, outputting third indication information to the user, where the third indication information indicates that the user has passed the crosswalk.
- issuing the third indication information reminds the user that the crosswalk has been crossed, improving the user experience.
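A debounced version of this "crosswalk no longer visible" trigger might look as follows; requiring several consecutive crosswalk-free frames (the count here is an assumed value) avoids firing on a single missed detection mid-crossing.

```python
def passed_crosswalk(detected_flags, absent_frames=5):
    # Fire the third indication only after the detector has reported no
    # crosswalk for several consecutive new frames, so one dropped detection
    # does not trigger a false "you have crossed" message.
    recent = detected_flags[-absent_frames:]
    return len(recent) >= absent_frames and not any(recent)
```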
- the above-mentioned multi-frame images also include traffic light information, the length of the crosswalk, and motor vehicle information related to the crosswalk;
- the traffic light information includes the color and duration of the traffic light;
- the motor vehicle information includes the driving direction and distance of the motor vehicle relative to the crosswalk;
- the first indication information is also used to indicate the user's movement path on the crosswalk.
- the multi-frame images may also include traffic light information, the length of the crosswalk, and motor vehicle information related to the crosswalk, so outputting the first indication information further ensures the user's safety.
- before determining the first route, the method further includes: starting a first application based on the user's operation, where the first application is used by the electronic device to obtain the multiple frames of images from the user's front perspective.
- the acquisition of multiple frames of images from the user's front perspective can be started through user operation, thereby reducing energy consumption in scenarios where no crosswalk is being crossed.
- the expression form of the first indication information includes voice and/or marking information.
- the first indication information may be voice; that is, a blind user can safely cross the crosswalk guided by the voice of the first indication information.
- the first indication information may be marking information; that is, a deaf user, or a user looking down at the electronic device, can safely cross the crosswalk guided by the marking information.
- a second aspect of the embodiments of the present application provides an electronic device, which is used to acquire multiple frames of images from the user's front perspective.
- the electronic device includes: a determining unit configured to determine, if a crosswalk is detected in the multiple frames of images, a first route, where the first route is a safe traveling route on the crosswalk;
- the acquisition unit is used to acquire the second route actually walked by the user based on multi-frame images;
- the output unit is configured to output first indication information to the user if the angle between the first route and the second route in the same reference system is greater than or equal to the preset angle, where the first indication information indicates that the second route is abnormal.
- the first posture in which the electronic device acquires the multiple frames of images includes a first pitch angle and a first tilt angle; the first pitch angle ranges from 60 to 100 degrees, and the first tilt angle ranges from 60 to 120 degrees.
- the acquisition unit is also used to acquire a second posture of the electronic device, where the second pitch angle in the second posture differs from the first pitch angle, or the second tilt angle in the second posture differs from the first tilt angle; the output unit is also used to output second indication information, which prompts the user to adjust the second posture to the first posture.
- the above-mentioned first route is the angular bisector of the angle formed by the extension directions of both edges of the crosswalk, or the first route is parallel to the edge of the crosswalk.
- the above-mentioned determining unit is also used to determine that the area of the crosswalk in the multi-frame image gradually decreases.
- the moving route of the above-mentioned electronic device when collecting multiple frames of images is the second route, and the moving route is parallel to the ground where the crosswalk is located.
- the output unit is also configured to output third indication information to the user if a new image acquired by the electronic device does not include a crosswalk, where the third indication information indicates that the user has passed the crosswalk.
- the above-mentioned multi-frame images also include traffic light information, the length of the crosswalk, and motor vehicle information related to the crosswalk;
- the traffic light information includes the color and duration of the traffic light;
- the motor vehicle information includes the driving direction and distance of the motor vehicle relative to the crosswalk;
- the first indication information is also used to indicate the user's movement path on the crosswalk.
- the electronic device further includes: an opening unit configured to open a first application based on the user's operation, where the first application is used by the electronic device to obtain the multiple frames of images from the user's front perspective.
- the expression form of the above-mentioned first indication information includes voice and/or mark information.
- a third aspect of the embodiments of the present application provides an electronic device, including a processor coupled to a memory, where the memory is used to store programs or instructions that, when executed by the processor, cause the electronic device to implement the method in the first aspect or any possible implementation of the first aspect.
- a fourth aspect of the embodiments of the present application provides a computer-readable medium on which a computer program or instructions are stored; when the computer program or instructions are run on a computer, the computer executes the method in the first aspect or any possible implementation of the first aspect.
- a fifth aspect of the embodiments of the present application provides a computer program product; when the computer program product is executed on a computer, it causes the computer to execute the method in the first aspect or any possible implementation of the first aspect.
- the embodiments of the present application have the following advantages: by determining the angle between the safe traveling route and the user's actual traveling route, and outputting the first indication information to the user when that angle is greater than or equal to the preset angle to indicate that the actual traveling direction is abnormal, a user who deviates from the crosswalk while moving is alerted, improving the user's safety when crossing the crosswalk.
- Figure 1 is a schematic diagram of an application scenario provided by an embodiment of the present application.
- Figure 2 is a schematic structural diagram of an electronic device provided by an embodiment of the present application.
- Figure 3 is a schematic flow chart of the image processing method provided by the embodiment of the present application.
- Figure 4 is an example of an image provided by an embodiment of the present application.
- Figure 5 is an example diagram of another image provided by the embodiment of the present application.
- Figure 6 is an example diagram of a pedestrian crossing provided by an embodiment of the present application.
- Figure 7 is an example diagram of another crosswalk provided by the embodiment of the present application.
- Figure 8 is an example diagram of another crosswalk provided by the embodiment of the present application.
- Figure 9 is an example diagram of the attitude angle of the electronic device provided by the embodiment of the present application.
- Figure 10 is an example diagram of a first route and a second route provided by an embodiment of the present application.
- Figure 11 is an example diagram of another first route and a second route provided by the embodiment of the present application.
- Figure 12 is an example diagram of first indication information provided by an embodiment of the present application.
- Figure 13 is an example diagram of a scene in which an electronic device is in the second posture according to an embodiment of the present application.
- Figure 14 is an example diagram of a scene in which an electronic device is in the first posture according to an embodiment of the present application.
- Figure 15 is an example diagram of a smart recognition function in an electronic device provided by an embodiment of the present application.
- Figure 16 is another structural schematic diagram of an electronic device provided by an embodiment of the present application.
- Figure 17 is another schematic structural diagram of an electronic device provided by an embodiment of the present application.
- Embodiments of the present application provide an image processing method and related equipment.
- outputting the first indication information improves the user's safety when crossing the crosswalk.
- visually impaired people mainly rely on white canes when traveling, which help them detect obstacles on the road. Because the cane mainly detects obstacles directly ahead, in scenarios with no or few obstacles, such as crosswalks, visually impaired people usually move based on their travel experience. However, visually impaired people generally have difficulty walking in a straight line because of a poor sense of direction. In scenarios such as crosswalks in particular, road conditions are complex, with motor vehicles and non-motor vehicles moving nearby, so relying on travel experience alone exposes visually impaired people to considerable safety risks when crossing.
- embodiments of the present application provide an image processing method and an electronic device.
- the first indication information is output to the user to indicate that the user's actual traveling direction is abnormal; when the user deviates from the crosswalk while moving, the output first indication information improves the user's safety in crossing the crosswalk.
- the application scenario of the method provided by the embodiment of this application can be shown in Figure 1.
- the application scenario includes: user 001 and electronic device 002.
- user 001 uses electronic device 002 to assist in crossing a crosswalk (also called a pedestrian crossing, zebra crossing, etc.).
- user 001 may be a visually impaired user or a sighted user; likewise, the user may be a hearing-impaired user or a hearing user.
- electronic device 002 is equipped with an operating system and loaded with applications (for example, intelligent recognition). Operated by user 001, it can collect images, identify crosswalks (for example, zebra crossings) in the images, issue indication information (for example, warning information, correction information, etc.), and so on.
- the above instruction information may be audio information or visual images, etc., and is not specifically limited here.
- the electronic device in the embodiments of the present application may be a mobile phone, a smart watch, smart glasses, a smart bracelet, a tablet computer (pad), a portable game console, a personal digital assistant (PDA), a laptop computer, an ultra-mobile personal computer (UMPC), a handheld computer, a netbook, a car media playback device, a wearable electronic device (such as a watch, bracelet, or glasses), or a digital display product such as a virtual reality (VR) terminal device or an augmented reality (AR) terminal device.
- FIG. 2 is a schematic structural diagram of an electronic device provided by an embodiment of the present application.
- the electronic device may include a processor 210, an external memory interface 220, an internal memory 221, a universal serial bus (USB) interface 230, a charging management module 240, a power management module 241, a battery 242, antenna 1, antenna 2, a mobile communication module 250, a wireless communication module 260, an audio module 270, a speaker 270A, a receiver 270B, a microphone 270C, a headphone interface 270D, a sensor module 280, a button 290, a motor 291, an indicator 292, a camera 293, a display screen 294, a subscriber identification module (SIM) card interface 295, etc.
- the sensor module 280 may include a pressure sensor 280A, a gyro sensor 280B, an air pressure sensor 280C, a magnetic sensor 280D, an acceleration sensor 280E, a distance sensor 280F, a proximity light sensor 280G, a fingerprint sensor 280H, a temperature sensor 280J, a touch sensor 280K, an ambient light sensor, etc.
- the structure illustrated in this embodiment does not constitute a specific limitation on the electronic device.
- the electronic device may include more or fewer components than illustrated, some components may be combined, some components may be separated, or components may be arranged differently.
- the components illustrated may be implemented in hardware, software, or a combination of software and hardware.
- the processor 210 may include one or more processing units.
- the processor 210 may include an application processor (AP), a modem processor, a graphics processing unit (GPU), an image signal processor (ISP), a controller, a memory, a video codec, a digital signal processor (DSP), a baseband processor, and/or a neural-network processing unit (NPU), etc.
- different processing units can be independent devices or integrated in one or more processors.
- a controller can be the nerve center and command center of an electronic device.
- the controller can generate operation control signals based on the instruction operation code and timing signals to complete the control of fetching and executing instructions.
- the processor 210 may also be provided with a memory for storing instructions and data.
- the memory in processor 210 is a cache. This memory may hold instructions or data that processor 210 has recently used or will reuse. If processor 210 needs the instructions or data again, it can fetch them directly from this memory, avoiding repeated accesses and reducing the processor's waiting time, thus improving system efficiency.
- processor 210 may include one or more interfaces. The interfaces may include an inter-integrated circuit (I2C) interface, an inter-integrated circuit sound (I2S) interface, a pulse code modulation (PCM) interface, a universal asynchronous receiver/transmitter (UART) interface, a mobile industry processor interface (MIPI), a general-purpose input/output (GPIO) interface, a subscriber identity module (SIM) interface, and/or a universal serial bus (USB) interface, etc.
- the interface connection relationships between the modules illustrated in this embodiment are only schematic illustrations and do not constitute structural limitations on the electronic equipment.
- the electronic device may also adopt different interface connection methods in the above embodiments, or a combination of multiple interface connection methods.
- the charge management module 240 is used to receive charging input from the charger.
- the charger can be a wireless charger or a wired charger.
- the charging management module 240 may receive charging input from the wired charger through the USB interface 230 .
- the charging management module 240 may receive wireless charging input through a wireless charging coil of the electronic device. While charging the battery 242, the charging management module 240 can also provide power to the electronic device through the power management module 241.
- the power management module 241 is used to connect the battery 242, the charging management module 240 and the processor 210. Power management module 241 receives input from the battery 242 and/or the charging management module 240, and supplies power to the processor 210, internal memory 221, external memory, display screen 294, camera 293, wireless communication module 260, etc. The power management module 241 can also be used to monitor battery capacity, battery cycle times, battery health status (leakage, impedance) and other parameters. In some other embodiments, the power management module 241 may also be provided in the processor 210 . In other embodiments, the power management module 241 and the charging management module 240 may also be provided in the same device.
- the wireless communication function of the electronic device can be realized through the antenna 1, the antenna 2, the mobile communication module 250, the wireless communication module 260, the modem processor and the baseband processor.
- Antenna 1 and Antenna 2 are used to transmit and receive electromagnetic wave signals.
- Each antenna in an electronic device can be used to cover a single or multiple communication frequency bands. Different antennas can also be reused to improve antenna utilization.
- Antenna 1 can be reused as a diversity antenna for a wireless LAN.
- antennas may be used in conjunction with tuning switches.
- the mobile communication module 250 can provide wireless communication solutions applied to the electronic device, including 2G/3G/4G/5G.
- the mobile communication module 250 may include at least one filter, switch, power amplifier, low noise amplifier (LNA), etc.
- the mobile communication module 250 can receive electromagnetic waves from the antenna 1, perform filtering, amplification and other processing on the received electromagnetic waves, and transmit them to the modem processor for demodulation.
- the mobile communication module 250 can also amplify the signal modulated by the modem processor and convert it into electromagnetic waves through the antenna 1 for radiation.
- at least part of the functional modules of the mobile communication module 250 may be disposed in the processor 210 .
- at least part of the functional modules of the mobile communication module 250 and at least part of the modules of the processor 210 may be provided in the same device.
- a modem processor may include a modulator and a demodulator.
- the modulator is used to modulate the low-frequency baseband signal to be sent into a medium-high frequency signal.
- the demodulator is used to demodulate the received electromagnetic wave signal into a low-frequency baseband signal.
- the demodulator then transmits the demodulated low-frequency baseband signal to the baseband processor for processing.
- the application processor outputs sound signals through audio devices (not limited to speaker 270A, receiver 270B, etc.), or displays images or videos through display screen 294.
- the modem processor may be a stand-alone device.
- the modem processor may be independent of the processor 210 and may be provided in the same device as the mobile communication module 250 or other functional modules.
- the wireless communication module 260 can provide wireless communication solutions applied to electronic devices, including wireless local area network (WLAN) (such as wireless fidelity (Wi-Fi) network), Bluetooth (BT), global navigation satellite system (GNSS), frequency modulation (FM), near field communication (NFC), infrared (IR) technology, etc.
- the wireless communication module 260 may be one or more devices integrating at least one communication processing module.
- the wireless communication module 260 receives electromagnetic waves via the antenna 2 , frequency modulates and filters the electromagnetic wave signals, and sends the processed signals to the processor 210 .
- the wireless communication module 260 can also receive the signal to be sent from the processor 210, frequency modulate it, amplify it, and convert it into electromagnetic waves through the antenna 2 for radiation.
- the antenna 1 of the electronic device is coupled to the mobile communication module 250, and the antenna 2 is coupled to the wireless communication module 260, so that the electronic device can communicate with the network and other devices through wireless communication technology.
- the wireless communication technology may include global system for mobile communications (GSM), general packet radio service (GPRS), code division multiple access (CDMA), wideband code division multiple access (WCDMA), time-division code division multiple access (TD-SCDMA), long term evolution (LTE), BT, GNSS, WLAN, NFC, FM, and/or IR technology, etc.
- the GNSS may include global positioning system (GPS), global navigation satellite system (GLONASS), Beidou navigation satellite system (BDS), quasi-zenith satellite system (QZSS) and/or satellite based augmentation systems (SBAS).
- the electronic device implements display functions through the GPU, display screen 294, and application processor.
- the GPU is an image processing microprocessor and is connected to the display screen 294 and the application processor. GPUs are used to perform mathematical and geometric calculations for graphics rendering.
- Processor 210 may include one or more GPUs that execute program instructions to generate or alter display information.
- the display screen 294 is used to display images, videos, etc.
- the display screen 294 includes a display panel.
- the display panel can use a liquid crystal display (LCD), an organic light-emitting diode (OLED), an active-matrix organic light-emitting diode (AMOLED), a flexible light-emitting diode (FLED), a Mini-LED, a Micro-LED, a Micro-OLED, a quantum dot light-emitting diode (QLED), etc.
- the electronic device can realize the shooting function through ISP, camera 293, video codec, GPU, display screen 294 and application processor.
- the ISP is used to process the data fed back by the camera 293. For example, when taking a photo, the shutter is opened, the light is transmitted to the camera sensor through the lens, the optical signal is converted into an electrical signal, and the camera sensor passes the electrical signal to the ISP for processing, and converts it into an image visible to the naked eye. ISP can also perform algorithm optimization on image noise, brightness, and skin color. ISP can also optimize the exposure, color temperature and other parameters of the shooting scene. In some embodiments, the ISP may be provided in the camera 293.
- Camera 293 is used to capture still images or video.
- light reflected by the object passes through the lens to produce an optical image that is projected onto the photosensitive element.
- the photosensitive element can be a charge coupled device (CCD) or a complementary metal-oxide-semiconductor (CMOS) phototransistor.
- the photosensitive element converts the optical signal into an electrical signal, and then passes the electrical signal to the ISP to convert it into a digital image signal.
- ISP outputs digital image signals to DSP for processing.
- DSP converts digital image signals into standard RGB, YUV and other format image signals.
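As a concrete illustration of the DSP's format conversion, the following sketch applies the full-range BT.601 YUV-to-RGB formulas to a single pixel. The function name and the full-range BT.601 assumption are ours, not from the source; a real DSP pipeline may use different coefficients.

```python
def yuv_to_rgb(y, u, v):
    """Convert one full-range BT.601 YUV pixel to RGB.

    Chroma (u, v) is centered at 128; results are clamped to [0, 255].
    """
    r = y + 1.402 * (v - 128)
    g = y - 0.344136 * (u - 128) - 0.714136 * (v - 128)
    b = y + 1.772 * (u - 128)

    def clamp(c):
        return max(0, min(255, round(c)))

    return clamp(r), clamp(g), clamp(b)

# A neutral pixel (chroma at the 128 midpoint) stays gray.
print(yuv_to_rgb(128, 128, 128))  # (128, 128, 128)
```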
- the electronic device may include 1 or N cameras 293, where N is a positive integer greater than 1.
- the camera 293 can also be used for the electronic device to provide users with personalized and situational business experiences based on the perceived external environment and user actions. Among them, the camera 293 can obtain rich and accurate information so that the electronic device can perceive the external environment and user's actions. Specifically, in this embodiment of the present application, the camera 293 can be used to identify whether the user of the electronic device is the first user or the second user.
- Digital signal processors are used to process digital signals. In addition to digital image signals, they can also process other digital signals. For example, when the electronic device selects a frequency point, the digital signal processor is used to perform Fourier transform on the frequency point energy.
- Video codecs are used to compress or decompress digital video.
- Electronic devices may support one or more video codecs. In this way, electronic devices can play or record videos in multiple encoding formats, such as: Moving Picture Experts Group (MPEG)1, MPEG2, MPEG3, MPEG4, etc.
- NPU is a neural network (NN) computing processor.
- Intelligent cognitive applications of electronic devices can be realized through NPU, such as image recognition, face recognition, speech recognition, text understanding, etc.
- the external memory interface 220 can be used to connect an external memory card, such as a Micro SD card, to expand the storage capacity of the electronic device.
- the external memory card communicates with the processor 210 through the external memory interface 220 to implement the data storage function. Such as saving music, videos, etc. files in external memory card.
- Internal memory 221 may be used to store computer executable program code, which includes instructions.
- the processor 210 executes instructions stored in the internal memory 221 to execute various functional applications and data processing of the electronic device. For example, in this embodiment of the present application, the processor 210 can respond to the user's operation on the display screen 294 by executing instructions stored in the internal memory 221 to display corresponding display content on the display screen.
- the internal memory 221 may include a program storage area and a data storage area. Among them, the stored program area can store an operating system, at least one application program required for a function (such as a sound playback function, an image playback function, etc.). The storage data area can store data created during the use of electronic equipment (such as audio data, phone books, etc.).
- the internal memory 221 may include high-speed random access memory, and may also include non-volatile memory, such as at least one disk storage device, flash memory device, universal flash storage (UFS), etc.
- the electronic device can implement audio functions through the audio module 270, the speaker 270A, the receiver 270B, the microphone 270C, the headphone interface 270D, and the application processor. Such as music playback, recording, etc.
- the audio module 270 is used to convert digital audio information into analog audio signal output, and is also used to convert analog audio input into digital audio signals. Audio module 270 may also be used to encode and decode audio signals. In some embodiments, the audio module 270 may be provided in the processor 210 , or some functional modules of the audio module 270 may be provided in the processor 210 . Speaker 270A, also called “speaker”, is used to convert audio electrical signals into sound signals. The electronic device can listen to music through speaker 270A, or listen to hands-free calls. Receiver 270B, also called “earpiece”, is used to convert audio electrical signals into sound signals. When the electronic device answers a call or a voice message, the voice can be heard by bringing the receiver 270B close to the human ear.
- Microphone 270C, also called a "mic" or "mouthpiece", is used to convert sound signals into electrical signals.
- the user can input a sound signal into the microphone 270C by speaking close to it.
- the electronic device may be provided with at least one microphone 270C.
- the electronic device may be provided with two microphones 270C, which in addition to collecting sound signals, may also implement a noise reduction function.
- the electronic device can also be equipped with three, four or more microphones 270C to collect sound signals, reduce noise, identify sound sources, and implement directional recording functions, etc.
- the headphone interface 270D is used to connect wired headphones.
- the headphone interface 270D may be a USB interface 230, or may be a 3.5mm open mobile terminal platform (OMTP) standard interface, or a Cellular Telecommunications Industry Association of the USA (CTIA) standard interface.
- the pressure sensor 280A is used to sense pressure signals and can convert the pressure signals into electrical signals.
- pressure sensor 280A may be disposed on display screen 294.
- pressure sensors 280A such as resistive pressure sensors, inductive pressure sensors, capacitive pressure sensors, etc.
- a capacitive pressure sensor may include at least two parallel plates of conductive material.
- the electronic device detects the strength of the touch operation according to the pressure sensor 280A.
- the electronic device may also calculate the touched position based on the detection signal of the pressure sensor 280A.
- touch operations acting on the same touch location but with different touch operation intensities may correspond to different operation instructions. For example: when a touch operation whose intensity is less than the pressure threshold is applied to the short message application icon, an instruction to view short messages is executed. When a touch operation whose intensity is greater than or equal to the pressure threshold is applied to the short message application icon, an instruction to create a new short message is executed.
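The intensity-to-instruction mapping in the short-message example can be sketched as follows; the threshold value and instruction names are hypothetical placeholders, not values from the source.

```python
PRESSURE_THRESHOLD = 0.5  # hypothetical normalized force threshold


def short_message_action(touch_force: float) -> str:
    """Map the force of a touch on the SMS icon to an operation instruction."""
    if touch_force < PRESSURE_THRESHOLD:
        return "view_messages"       # light press: open the message list
    return "compose_new_message"     # firm press: create a new short message


print(short_message_action(0.2))  # view_messages
print(short_message_action(0.8))  # compose_new_message
```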
- the gyro sensor 280B can be used to determine the motion posture of the electronic device.
- the angular velocity of the electronic device about three axes may be determined by gyro sensor 280B.
- the gyro sensor 280B can be used for image stabilization. For example, when the shutter is pressed, the gyro sensor 280B detects the angle at which the electronic device shakes, and calculates the distance that the lens module needs to compensate based on the angle, so that the lens can offset the shake of the electronic device through reverse movement to achieve anti-shake.
- the gyro sensor 280B can also be used for navigation and somatosensory gaming scenarios.
- the gyro sensor 280B can also be used to measure the rotation amplitude or movement distance of the electronic device.
- Air pressure sensor 280C is used to measure air pressure. In some embodiments, the electronic device calculates the altitude through the air pressure value measured by the air pressure sensor 280C to assist positioning and navigation.
- Magnetic sensor 280D includes a Hall sensor.
- the electronic device can use the magnetic sensor 280D to detect the opening and closing of the flip holster.
- the electronic device may detect the opening and closing of the flip cover according to the magnetic sensor 280D, and then set features such as automatic unlocking of the flip cover based on the detected opening and closing state of the holster or flip cover.
- the acceleration sensor 280E can detect the acceleration of the electronic device in various directions (generally three axes). When the electronic device is stationary, the magnitude and direction of gravity can be detected. It can also be used to identify the posture of the electronic device, and be used in applications such as horizontal/vertical screen switching and pedometers. In addition, the acceleration sensor 280E can also be used to measure the orientation of the electronic device (i.e., the direction vector of the orientation).
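As a rough sketch of how a static accelerometer reading can yield a posture estimate, the following assumes gravity is the only acceleration acting on the device and a right-handed x/y/z device frame; both assumptions are ours, not from the source.

```python
import math


def posture_from_accel(ax: float, ay: float, az: float):
    """Estimate pitch and roll (degrees) from a static 3-axis
    accelerometer reading, assuming only gravity is measured."""
    pitch = math.degrees(math.atan2(-ax, math.hypot(ay, az)))
    roll = math.degrees(math.atan2(ay, az))
    return pitch, roll


# Device lying flat, screen up: gravity falls entirely on +z.
print(posture_from_accel(0.0, 0.0, 9.81))  # (0.0, 0.0)
```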
- Distance sensor 280F is used to measure distance.
- Electronic devices can measure distance via infrared or laser. In some embodiments, when shooting a scene, the electronic device can utilize the distance sensor 280F to measure distance to achieve fast focusing.
- Proximity light sensor 280G may include, for example, a light emitting diode (LED) and a light detector, such as a photodiode.
- the light emitting diode may be an infrared light emitting diode.
- Electronic devices emit infrared light through light-emitting diodes.
- Electronic devices use photodiodes to detect infrared reflected light from nearby objects. When sufficient reflected light is detected, it can be determined that there is an object near the electronic device. When insufficient reflected light is detected, the electronic device can determine that there is no object near the electronic device.
- Electronic devices can use the proximity light sensor 280G to detect when the user is holding the electronic device close to the ear to talk, so that the screen can be automatically turned off to save power.
- the proximity light sensor 280G can also be used in holster mode and pocket mode to automatically unlock and lock the screen.
- the ambient light sensor 280L is used to sense ambient light brightness.
- the electronic device can adaptively adjust the brightness of the display screen 294 based on perceived ambient light brightness.
- the ambient light sensor 280L can also be used to automatically adjust the white balance when taking pictures.
- the ambient light sensor 280L can also cooperate with the proximity light sensor 280G to detect whether the electronic device is in the pocket to prevent accidental touching.
- Fingerprint sensor 280H is used to collect fingerprints. Electronic devices can use the collected fingerprint characteristics to unlock fingerprints, access application locks, take photos with fingerprints, answer incoming calls with fingerprints, etc.
- Temperature sensor 280J is used to detect temperature.
- the electronic device uses the temperature detected by the temperature sensor 280J to execute the temperature processing strategy. For example, when the temperature reported by the temperature sensor 280J exceeds a threshold, the electronic device reduces the performance of a processor located near the temperature sensor 280J in order to reduce power consumption and implement thermal protection. In other embodiments, when the temperature is lower than another threshold, the electronic device heats the battery 242 to prevent the low temperature from causing the electronic device to shut down abnormally. In some other embodiments, when the temperature is lower than another threshold, the electronic device performs boosting on the output voltage of the battery 242 to avoid abnormal shutdown caused by low temperature.
- Touch sensor 280K also called “touch panel”.
- the touch sensor 280K can be disposed on the display screen 294.
- the touch sensor 280K and the display screen 294 form a touch screen, which is also called a "touch screen”.
- Touch sensor 280K is used to detect touch operations acting on or near it.
- the touch sensor can pass the detected touch operation to the application processor to determine the touch event type.
- Visual output related to the touch operation may be provided through display screen 294.
- the touch sensor 280K may also be disposed on the surface of the electronic device at a different location from the display screen 294 .
- Bone conduction sensor 280M can acquire vibration signals.
- the bone conduction sensor 280M can acquire the vibration signal of the vibrating bone mass of the human body's vocal part.
- the bone conduction sensor 280M can also contact the human body's pulse and receive blood pressure beating signals.
- the bone conduction sensor 280M can also be provided in the earphone and combined into a bone conduction earphone.
- the audio module 270 can analyze the voice signal based on the vibration signal of the vocal vibrating bone obtained by the bone conduction sensor 280M to implement the voice function.
- the application processor can analyze the heart rate information based on the blood pressure beat signal obtained by the bone conduction sensor 280M to implement the heart rate detection function.
- the buttons 290 include a power button, a volume button, etc.
- Key 290 may be a mechanical key. It can also be a touch button.
- the electronic device can receive key input and generate key signal input related to user settings and function control of the electronic device.
- the electronic device can receive input through the various sensors in the sensor module 280, the buttons 290, and/or the camera 293.
- the motor 291 can generate vibration prompts.
- the motor 291 can be used for vibration prompts for incoming calls and can also be used for touch vibration feedback.
- touch operations for different applications can correspond to different vibration feedback effects.
- touch operations acting on different areas of the display screen 294 may also correspond to different vibration feedback effects of the motor 291.
- Different application scenarios (such as time reminders, receiving information, alarm clocks, games, etc.) can also correspond to different vibration feedback effects.
- the touch vibration feedback effect can also be customized.
- the indicator 292 may be an indicator light, which may be used to indicate charging status, power changes, or may be used to indicate messages, missed calls, notifications, etc.
- the SIM card interface 295 is used to connect a SIM card.
- the SIM card can be inserted into the SIM card interface 295 or pulled out from the SIM card interface 295 to realize contact and separation from the electronic device.
- the electronic device can support 1 or N SIM card interfaces, where N is a positive integer greater than 1.
- SIM card interface 295 can support Nano SIM card, Micro SIM card, SIM card, etc. Multiple cards can be inserted into the same SIM card interface 295 at the same time. The types of the plurality of cards may be the same or different.
- the SIM card interface 295 is also compatible with different types of SIM cards.
- the SIM card interface 295 is also compatible with external memory cards.
- Electronic devices interact with the network through SIM cards to implement functions such as calls and data communications.
- the electronic device uses an eSIM, that is, an embedded SIM card.
- the eSIM card can be embedded in the electronic device and cannot be separated from the electronic device.
- the following describes the image processing method provided by the embodiments of the present application, taking a mobile phone as an example of the electronic device.
- the method may be executed by an electronic device, or may be executed by a component of the electronic device (such as a processor, a chip, or a chip system, etc.).
- the electronic device is used to acquire multiple frames of images from the user's front perspective.
- the image processing method may include steps 301 to 303, which are described respectively below.
- Step 301 If a crosswalk is detected from multiple frames of images, determine the first route.
- the first route is a safe route on the crosswalk.
- there are many ways for the electronic device to obtain the multi-frame images: they may be collected/photographed by the electronic device, received from other devices (devices connected to the electronic device), selected from a database, etc. There is no specific limit here.
- this application only takes the case where the electronic device obtains multiple frames of images from the user's front perspective in real time as an example for schematic explanation. In practical applications, the electronic device may also periodically obtain multiple frames of images from the user's front perspective, etc., and the details are not limited here.
- the electronic device may be a portable electronic device such as a mobile phone, a smart watch, or smart glasses carried by the pedestrian.
- a first route is determined, and the first route is a safe route on the crosswalk.
- the first route can be the extension direction of the crosswalk, or it can also be understood as the direction perpendicular to the horizontal line of the crosswalk, etc. The details are not limited here. It should be noted that the first route may not be just one route, but may be multiple routes within a certain range, for example.
- the first route is a number of routes leading outward from the intersection within the "angle" formed by the horizontal lines on both sides of the crosswalk.
- the determination of the first route is related to the completeness of the crosswalk in the image.
- if the image includes a complete crosswalk, the edge extension lines on both sides of the crosswalk intersect to form an angle, and the direction of the angle bisector pointing toward the intersection is the first route (that is, the first route is the angle bisector of the angle formed by the extension directions of the edges on both sides of the crosswalk).
- if the image includes an incomplete crosswalk, the first route is parallel to the edge of the crosswalk, or the first route is perpendicular to the lines of the crosswalk and points away from the location of the electronic device, etc. The details are not limited here.
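The complete-crosswalk case (first route as the bisector of the angle formed by the two edge extension directions) can be sketched as follows. The edge headings are assumed inputs from a separate crosswalk detector; the function name is ours.

```python
import math


def first_route_direction(left_edge_angle: float, right_edge_angle: float) -> float:
    """Return the first-route heading (radians) as the bisector of the
    angle formed by the extension directions of the crosswalk's two
    edge lines, each given as an image-plane heading in radians."""
    # Sum the two unit direction vectors and take the heading of the sum:
    # this yields the angle bisector and avoids wrap-around issues near ±pi.
    x = math.cos(left_edge_angle) + math.cos(right_edge_angle)
    y = math.sin(left_edge_angle) + math.sin(right_edge_angle)
    return math.atan2(y, x)


# Edges converging symmetrically toward the vanishing point at 60° and 120°
# bisect to a straight-ahead 90° route.
print(round(math.degrees(first_route_direction(math.radians(60),
                                               math.radians(120)))))  # 90
```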
- when the image includes a complete crosswalk, the image may be as shown in Figure 4; when the image includes part of a crosswalk, the image may be as shown in Figure 5. It can be understood that the multi-frame images acquired by the electronic device may include the images in both Figure 4 and Figure 5, or may include only the image in Figure 4 or Figure 5. The details are not limited here.
- the crosswalk in the embodiment of the present application can be located in a variety of situations: at an intersection without traffic lights (for example, as shown in Figure 4), at an intersection with only one traffic light (for example, as shown in Figure 6), at an intersection with only one traffic light and a safety island (for example, as shown in Figure 7), or at an intersection with two traffic lights and a safety island in the middle (for example, as shown in Figure 8), etc.
- the specific conditions of the crosswalk can be set according to actual needs, and are not limited here.
- the length, width, line spacing, etc. of the crosswalks in the embodiment of this application can refer to the regulations of the Urban Road Traffic Signs and Markings Setting Specifications (GB 51038-2015).
- the crosswalk lines should use a set of white parallel thick solid lines, the line width should be 40cm or 45cm, the line interval should be 60cm, and the maximum should not exceed 80cm.
- the width of pedestrian crossings should be greater than or equal to 3m, and should be widened in steps of 1m.
- the width, form, and location of the crosswalk should comply with the following regulations: when the length of the crosswalk is greater than 16m, a safety island should be set up at the separation zone or the dividing line of the opposite lanes; the length of the safety island should not be less than the width of the crosswalk, and the width of the safety island should not be less than 2m (and not less than 1.5m under difficult circumstances); the safety island should be equipped with elastic traffic columns and safety protection facilities.
- the multi-frame images in the embodiment of the present application include objects, and the objects include crosswalks, which are used for pedestrians to cross the roadway. It can be understood that the images in the multi-frame images may include a complete crosswalk or a partial crosswalk, one image in the multi-frame images may include a complete crosswalk, and another image may include a partial crosswalk, etc., and the details are not limited here.
- the objects in the above-mentioned multi-frame images may also include traffic light information, the length of the crosswalk, motor vehicle information related to the crosswalk, etc.
- the traffic light information may include the color and duration of the traffic light, etc.
- the motor vehicle information may include the driving direction and distance of the motor vehicle relative to the roadway, etc.
- the area of the crosswalk in the multi-frame images can be gradually reduced. In this case, it can be understood that the user is deviating from the crosswalk.
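One possible sketch of this area-based deviation check, assuming per-frame crosswalk pixel areas are available from a segmentation step (the function and parameter names are ours):

```python
def is_deviating(crosswalk_areas, min_frames=3):
    """Flag deviation when the crosswalk's pixel area shrinks strictly
    monotonically over the last `min_frames` frames."""
    if len(crosswalk_areas) < min_frames:
        return False
    recent = crosswalk_areas[-min_frames:]
    # Every consecutive pair must decrease for the trend to count.
    return all(a > b for a, b in zip(recent, recent[1:]))


print(is_deviating([9000, 8500, 7800, 7200]))  # True
print(is_deviating([9000, 8800, 9000, 9100]))  # False
```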
- Step 302 Obtain the second route actually walked by the user based on the multi-frame images.
- the second route may be the moving route of the electronic device parallel to the ground where the crosswalk is located, or the moving route of the user holding the electronic device, or the route toward which the electronic device is oriented, or a route along the center line of the image leading away from the location of the electronic device, etc. There is no specific limit here.
- the second route may be determined based on the posture of the electronic device when collecting images.
- the electronic device can detect the posture of the electronic device through a gyroscope and/or an acceleration sensor.
- the posture/orientation of the electronic device may refer to the posture angle of the electronic device.
- the posture angle may include an azimuth angle and a tilt angle, or the posture angle may include an azimuth angle, a tilt angle, and a pitch angle.
- the azimuth angle represents the angle of rotation around the z-axis, the tilt angle represents the angle around the y-axis, and the pitch angle represents the angle around the x-axis.
- the first route and the second route can be as shown in Figure 10.
- the second route is the route toward which the electronic device is oriented, and the first route is perpendicular to the lines of the crosswalk and points away from the location of the electronic device.
- the angle between the first route and the second route in the same reference system is 0 degrees.
- the first route and the second route can be as shown in Figure 11.
- the second route is the route toward which the electronic device is oriented, and the first route is perpendicular to the lines of the crosswalk and points away from the location of the electronic device.
- the angle between the first route and the second route can be determined to be θ.
- the first route and the second route in the embodiment of the present application can be presented on multiple frames of images, specifically, they can be carried on each image of the multiple frames of images.
- the first route and the second route can be presented to the user through dynamic/static images or voice, and the details are not limited here.
- Step 303 If the angle between the first route and the second route in the same reference system is greater than or equal to the preset angle, output first indication information to the user.
- the first indication information is used to indicate that the second route is abnormal.
- after determining the angle between the first route and the second route in the same reference system, the electronic device outputs the first indication information to the user when the angle is greater than or equal to the preset angle.
- the first indication information may be used to indicate an abnormality in the movement direction, and/or to correct the movement direction (or be understood to provide the user with the correct movement direction, such as prompting the user to move to the left or right, etc.).
- the reference system mentioned above may refer to a virtual reference system, a plane (such as the ground) reference system, a camera reference system of an electronic device, etc., and is not specifically limited here.
- the plane is the ground where the crosswalk is located
- the angle between the first route and the second route in the same reference system can be understood as the angle between the projection of the first route onto the ground and the second route.
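The projection-and-angle computation can be sketched as follows, assuming both routes are given as 3-D direction vectors with z pointing up (an assumed convention, not stated in the source):

```python
import math


def ground_angle(route_a, route_b):
    """Angle (degrees) between two 3-D direction vectors after
    projecting both onto the ground plane (z = up)."""
    ax, ay = route_a[0], route_a[1]   # drop the z component: ground projection
    bx, by = route_b[0], route_b[1]
    dot = ax * bx + ay * by
    na = math.hypot(ax, ay)
    nb = math.hypot(bx, by)
    cos_t = max(-1.0, min(1.0, dot / (na * nb)))  # clamp for float safety
    return math.degrees(math.acos(cos_t))


# A safe route straight ahead vs. an actual heading drifting 30° to the side.
a = (0.0, 1.0, 0.1)                                            # slight tilt, ignored
b = (math.sin(math.radians(30)), math.cos(math.radians(30)), 0.0)
print(round(ground_angle(a, b)))  # 30
```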
- the first route can be understood as the safe movement route of the user/electronic device on the ground
- the second route can be understood as the actual movement route of the user/electronic device on the ground.
- the electronic device collects multiple frames of images from the perspective of the user directly in front of the user.
- in some embodiments, the first indication information is output to the user only if the angle between the first route and the second route in the same reference system remains greater than or equal to the preset angle throughout a preset time period.
- multiple frames of images are used to determine whether the user's movement direction deviates from the crosswalk to prevent misjudgment caused by excessive deviation in a certain frame.
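A minimal sketch of this persistence check over a preset period, counted here in frames rather than seconds (the class and parameter names are assumptions):

```python
from collections import deque


class DeviationMonitor:
    """Emit the first indication only when the route angle stays at or
    above `preset_angle` for `window` consecutive frames, so a single
    noisy frame cannot trigger a false alert."""

    def __init__(self, preset_angle=30.0, window=5):
        self.preset_angle = preset_angle
        self.recent = deque(maxlen=window)

    def update(self, angle: float) -> bool:
        self.recent.append(angle)
        return (len(self.recent) == self.recent.maxlen
                and all(a >= self.preset_angle for a in self.recent))


m = DeviationMonitor(preset_angle=30, window=3)
print([m.update(a) for a in [35, 10, 40, 45, 50]])  # [False, False, False, False, True]
```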
- in other embodiments, the first indication information is output to the user after the electronic device determines that the area of the crosswalk in the multi-frame images gradually decreases (indicating that the user is deviating from the crosswalk); for a user standing at the entrance of the crosswalk, this helps determine whether the user actually wants to use the crosswalk.
- the preset angle in the embodiment of the present application can be set according to actual needs, for example: 45 degrees, 30 degrees, etc., and there is no specific limit here.
- the first indication information is output to the user.
- the first indication information can also be used to indicate the user's movement path on the crosswalk, traffic light information, moving obstacles that affect the user's movement on the crosswalk, and so on.
- the user can determine the traffic light information, the movement path on the crosswalk, and/or the obstacles on the crosswalk, etc., based on the first indication information.
- the expression form of the first indication information in the embodiment of the present application includes at least one of the following, which are described below:
- the first indication information is voice.
- the user can determine whether there is an abnormality, and correct the movement direction, by listening to the first indication information.
- when used to indicate an abnormal movement direction, the first indication information may be the voice message "Please note, your movement direction is abnormal" or "Please note, you have deviated from the crosswalk", etc.
- when used to correct the movement direction, the first indication information may be the voice message "You have deviated from the crosswalk, please move to the right" or "You have deviated from the crosswalk, please move to the left", and so on.
- the first indication information can also be broadcast to the user continuously until the angle between the user's actual movement route and the safe traveling route is less than a certain angle.
- the objects in the image include traffic light information, crosswalk length, and motor vehicle information related to the crosswalk.
- the first indication information may be: "The light is currently red and will turn green in 3 seconds", or "You are currently on the crosswalk, with 2 meters left to cross", or "There is a pedestrian ahead on your right, please take care", or "A motor vehicle is approaching on your right, please take care", and so on.
- the blind person can judge from the voice whether his or her movement direction is abnormal and/or how to correct it; in other words, blind users can rely on the voice prompts to ensure their safety at crosswalks.
- the first indication information is mark information presented on the image.
- the user can determine whether there is an abnormality, and correct the movement direction, by looking down at the first indication information.
- the first indication information may be marking information presented on the image.
- the user determines the deviation of the movement direction by viewing the mark information, and can then correct the actual movement direction based on the deviation between the mark information and the actual movement direction, helping users cross crosswalks safely while looking down at their electronic devices.
- displaying the mark information on the image ensures the safety of head-down users or deaf users when crossing the crosswalk.
- when the first indication information is mark information, it may be as shown in FIG. 12.
- the first indication information can include both voice and mark information; that is, the mark information is displayed to the user while the voice is broadcast.
- the above situations are just examples.
- the first indication information may also have other expression forms, which are not limited here.
- the first indication information is output to the user to indicate that the user's actual traveling direction is abnormal; when the user deviates from the crosswalk during movement, the output of the first indication information improves the user's safety in crossing the crosswalk.
- the pose of the electronic device may also be judged and/or adjusted.
- the first pose in which the electronic device acquires the multiple frames of images includes a first pitch angle and a first tilt angle.
- the first pitch angle ranges from 60 to 100 degrees, and the first tilt angle ranges from 60 to 120 degrees.
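A small sketch of the pose gate these ranges imply, assuming pitch and tilt arrive in degrees from the device's motion sensors; the hint strings and function name are illustrative, not from the patent:

```python
def pose_ok(pitch_deg, tilt_deg,
            pitch_range=(60.0, 100.0), tilt_range=(60.0, 120.0)):
    """Check whether the device pose falls within the first-pose ranges
    the text gives (pitch 60-100 degrees, tilt 60-120 degrees).
    Returns (ok, hint); the hint would drive the second indication
    information prompting the user to adjust the pose."""
    if not (pitch_range[0] <= pitch_deg <= pitch_range[1]):
        return False, "adjust pitch"
    if not (tilt_range[0] <= tilt_deg <= tilt_range[1]):
        return False, "adjust tilt"
    return True, ""
```

Image acquisition would proceed only when `pose_ok` returns True, matching the idea that frames captured outside these ranges have no reference value for route determination.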
- before acquiring multiple frames of images, the electronic device first acquires the second pose of the electronic device carried by the user and determines that the second pitch angle in the second pose differs from the first pitch angle, or that the second tilt angle in the second pose differs from the first tilt angle; in other words, the second pose is not suitable for the electronic device to collect images.
- the second instruction information may be output.
- the second pose lying outside the preset pose range can be understood as a pose that does not meet the photography conditions.
- the above-mentioned second indication information is used by the user to adjust the second pose of the electronic device to the first pose.
- with the second pose of the electronic device as shown in Figure 13, after the electronic device outputs the second indication information to the user, the second pose is adjusted to the first pose, as shown in Figure 14.
- after step 303, if a new image acquired by the electronic device does not include the crosswalk, third indication information is output to the user, and the third indication information is used to indicate that the user has passed the crosswalk.
- the expression forms of the first indication information, the second indication information, and the third indication information in the embodiments of the present application may be voice, mark information, etc., and are not specifically limited here.
- the electronic device can start a first application based on the user's operation.
- the first application is used by the electronic device to obtain multiple frames of images from the user's forward-facing perspective.
- This function can also be called smart recognition.
- Smart recognition can be a separate application or a new function that is part of an existing application. For example, as a new working mode within the camera application.
- smart recognition can be started in a variety of ways, for example: opening the application and selecting the smart recognition function; a shortcut on the secondary page of the home screen; the voice assistant; or a physical button shortcut. For the convenience of visually impaired users, voice assistant startup or a physical button shortcut may be considered. After startup, the smart recognition page is entered.
- Figure 15 shows an example of the smart recognition page.
- An embodiment of the electronic device in the embodiment of the present application includes:
- the determination unit 1601 is used to determine a first route if a crosswalk is detected from multiple frames of images, and the first route is a safe traveling route on the crosswalk;
- the obtaining unit 1602 is used to obtain, based on the multi-frame images, the second route actually walked by the user;
- the output unit 1603 is configured to output first indication information to the user if the angle between the first route and the second route in the same reference system is greater than or equal to a preset angle, and the first indication information is used to indicate that the second route is abnormal.
- the electronic device may further include: an opening unit 1604, configured to open the first application based on the user's operation.
- the first application is used by the electronic device to obtain multiple frames of images from the user's forward-facing perspective.
- the operations performed by each unit in the electronic device are similar to those described in the aforementioned embodiments shown in FIGS. 1 to 14, and will not be described again here.
- by determining the angle between the safe traveling route and the user's actual traveling route, when the angle is greater than or equal to the preset angle, the output unit 1603 outputs first indication information to the user to indicate that the user's actual traveling direction is abnormal; when the user deviates from the crosswalk during movement, the output of the first indication information improves the user's safety in crossing the crosswalk.
- the electronic device may include a processor 1701, a memory 1702, and a communication port 1703.
- the processor 1701, memory 1702 and communication port 1703 are interconnected through lines.
- the memory 1702 stores program instructions and data.
- the memory 1702 stores program instructions and data corresponding to the steps executed by the electronic device in the corresponding embodiments shown in FIGS. 1 to 15 .
- the processor 1701 is configured to perform the steps performed by the electronic device shown in any of the embodiments shown in FIGS. 1 to 15 .
- the communication port 1703 can be used to receive and send data, and to perform steps related to obtaining, sending, and receiving in any of the embodiments shown in FIG. 1 to FIG. 15 .
- the electronic device may include more or fewer components than in FIG. 17 , which is merely an illustrative description in this application and is not limiting.
- the disclosed systems, devices and methods can be implemented in other ways.
- the device embodiments described above are only illustrative.
- the division of units is only a logical function division. In actual implementation, there may be other division methods.
- multiple units or components may be combined or integrated into another system, or some features can be ignored or not implemented.
- the coupling or direct coupling or communication connection between each other shown or discussed may be through some interfaces, and the indirect coupling or communication connection of the devices or units may be in electrical, mechanical or other forms.
- the units described as separate components may or may not be physically separated, and the components shown as units may or may not be physical units, that is, they may be located in one place, or they may be distributed to multiple network units. Some or all of the units can be selected according to actual needs to achieve the purpose of the solution of this embodiment.
- each functional unit in each embodiment of the present application can be integrated into one processing unit, each unit can exist physically alone, or two or more units can be integrated into one unit.
- the above-mentioned integrated units can be implemented in whole or in part through software, hardware, firmware, or any combination thereof.
- when the integrated unit is implemented using software, it may be implemented in whole or in part in the form of a computer program product.
- the computer program product includes one or more computer instructions.
- when the computer program instructions are loaded and executed on a computer, the processes or functions described in accordance with the embodiments of the present invention are generated in whole or in part.
- the computer may be a general-purpose computer, a special-purpose computer, a computer network, or other programmable device.
- the computer instructions may be stored in a computer-readable storage medium or transmitted from one computer-readable storage medium to another; for example, the computer instructions may be transmitted from a website, computer, server, or data center to another website, computer, server, or data center over a wired connection (such as coaxial cable, optical fiber, or digital subscriber line (DSL)) or wirelessly (such as by infrared, radio, or microwave).
- the computer-readable storage medium may be any available medium that a computer can access, or a data storage device such as a server or data center that integrates one or more available media.
- the available media may be magnetic media (e.g., floppy disk, hard disk, magnetic tape), optical media (e.g., DVD), or semiconductor media (e.g., solid state disk (SSD)), etc.
- "At least one" refers to one or more, and "plurality" refers to two or more.
- "And/or" describes an association between associated objects and indicates that three relationships can exist; for example, "A and/or B" can mean: A alone, both A and B, or B alone, where A and B can be singular or plural.
- the character "/" generally indicates that the associated objects are in an "or" relationship.
- "At least one of the following" or similar expressions refers to any combination of these items, including any combination of a single item or plural items.
- for example, "at least one of a, b, or c" can mean: a, b, c, a and b, a and c, b and c, or a, b, and c, where a, b, and c can each be single or multiple.
Abstract
Embodiments of the present application provide an image processing method applied to an electronic device that acquires multiple frames of images from the user's forward-facing perspective. The method includes: if a crosswalk is detected in the multi-frame images, determining a first route, the first route being a safe traveling route on the crosswalk; acquiring, based on the multi-frame images, a second route actually walked by the user; and if the angle between the first route and the second route in the same reference system is greater than or equal to a preset angle, outputting first indication information to the user, the first indication information indicating that the second route is abnormal. Thus, when the user deviates from the crosswalk while moving, the output first indication information improves the user's safety in crossing the crosswalk.
Description
This application claims priority to the Chinese patent application No. 202210384608.3, entitled "Image processing method and related device", filed with the Chinese Patent Office on April 13, 2022, the entire contents of which are incorporated herein by reference.
The embodiments of the present application relate to the field of terminal technologies, and in particular to an image processing method and a related device.
What visually impaired people desire most is an independent, normal daily life like everyone else, and travel is an important part of that life.
At present, visually impaired people rely mainly on a white cane when traveling, using it to detect obstacles on the road. Because the cane mainly probes obstacles ahead, in scenarios such as crosswalks with few or no obstacles, visually impaired people usually move according to travel experience.
However, visually impaired people commonly have difficulty walking in a straight line because of a poor sense of direction. Especially at crosswalks, road conditions are complex, with motor vehicles and non-motor vehicles weaving through, so crossing such scenarios based on travel experience alone carries considerable safety risk.
Summary
The embodiments of the present application provide an image processing method and a related device that, when a user deviates from a crosswalk while moving, improve the user's safety in crossing the crosswalk through output first indication information.
A first aspect of the embodiments of the present application provides an image processing method applied to an electronic device that acquires multiple frames of images from the user's forward-facing perspective. The method includes: if a crosswalk is detected in the multi-frame images, determining a first route, the first route being a safe traveling route on the crosswalk; acquiring, based on the multi-frame images, a second route actually walked by the user; and if the angle between the first route and the second route in the same reference system is greater than or equal to a preset angle, outputting first indication information to the user, the first indication information indicating that the second route is abnormal. Optionally, the condition for outputting the first indication information may instead be that the angle between the first route and the second route in the same reference system is greater than or equal to the preset angle for a period of time, which prevents inaccurate judgments caused by a measurement error or shake in a single frame and avoids outputting the first indication information too frequently. The method is applicable to scenarios in which a user crosses a crosswalk, for example a blind person or a deaf person crossing a street. The reference system mentioned above may be a virtual reference system, a plane (for example, ground) reference system, the camera reference system of the electronic device, and so on, which is not limited here. For example, if the plane is the ground where the crosswalk is located, the angle between the first route and the second route in the same reference system may be understood as the angle between the projection of the first route onto the ground and the second route. In addition, the first route may be understood as the safe movement route of the user/electronic device on the ground, and the second route as the actual movement route of the user/electronic device on the ground.
In the embodiments of the present application, the angle between the safe traveling route and the user's actual traveling route is determined, and when the angle is greater than or equal to the preset angle, first indication information is output to the user to indicate that the user's actual traveling direction is abnormal. Thus, when the user deviates from the crosswalk while moving, the output first indication information improves the user's safety in crossing the crosswalk.
Optionally, in a possible implementation of the first aspect, the first pose in which the electronic device acquires the multi-frame images includes a first pitch angle and a first tilt angle; the first pitch angle ranges from 60 to 100 degrees, and the first tilt angle ranges from 60 to 120 degrees.
In this implementation, the pitch and tilt ranges ensure that the pose in which the electronic device captures images satisfies the conditions; in other words, only images captured under these conditions are of reference value for determining the first route and/or the second route. Put differently, before acquiring the multi-frame images, the electronic device may first check whether its pose satisfies the conditions and acquire the images only when it does, ensuring that the acquired images are reasonable and the determination of the first and second routes is more accurate.
Optionally, in a possible implementation of the first aspect, the method further includes: acquiring a second pose of the electronic device, where a second pitch angle in the second pose differs from the first pitch angle, or a second tilt angle in the second pose differs from the first tilt angle; and outputting second indication information, which is used by the user to adjust the second pose to the first pose.
In this implementation, when the pose of the electronic device does not satisfy the photographing conditions, the second indication information lets the user adjust the pose of the electronic device to the first pose or a preset pose range that satisfies those conditions, so that subsequently acquired images are more reasonable and the determination of the first and second routes is more accurate.
Optionally, in a possible implementation of the first aspect, the first route is the bisector of the angle formed by the extension directions of the two side edges of the crosswalk, or the first route is parallel to an edge of the crosswalk. Further, the first route may be parallel to the short edges of the stripes (zebra lines) of the crosswalk; or the extension lines of the short edges of the two outermost stripes intersect to form an angle, and the direction of the angle bisector toward the intersection point is the first route; or the first route may be the direction perpendicular to the stripes, and so on, which can be set according to actual needs and is not limited here.
This implementation provides multiple ways of determining the first route, so that the first route can be determined whether the image contains all or only part of the crosswalk.
Optionally, in a possible implementation of the first aspect, before outputting the first indication information to the user, the method further includes: determining that the area of the crosswalk in the multi-frame images gradually decreases.
In this implementation, before issuing the first indication information, the electronic device determines that the crosswalk area in the multi-frame images is gradually decreasing, which, with the user standing at the crosswalk entrance, establishes whether the user really needs to cross the crosswalk or step onto the zebra crossing.
Optionally, in a possible implementation of the first aspect, the movement route of the electronic device while capturing the multi-frame images is the second route, and the movement route is parallel to the ground where the crosswalk is located.
In this implementation, the route of the electronic device parallel to the ground while capturing the multi-frame images is the second route; that is, the movement of the electronic device in the dimension parallel to the ground constitutes the second route, providing a way of determining the second route.
Optionally, in a possible implementation of the first aspect, after outputting the first indication information to the user, the method further includes: if a new image acquired by the electronic device does not include the crosswalk, outputting third indication information to the user, the third indication information indicating that the user has passed the crosswalk.
In this implementation, while the user moves normally, the new image is used to judge whether the user has passed the crosswalk; when the new image no longer includes the crosswalk, the third indication information is issued to remind the user that the crossing is complete, improving the user experience.
Optionally, in a possible implementation of the first aspect, the multi-frame images further include traffic light information, the length of the crosswalk, and motor vehicle information related to the crosswalk; the traffic light information includes the color and duration of the lights, and the motor vehicle information includes the traveling direction and distance of motor vehicles relative to the crosswalk; the first indication information is further used to indicate the user's movement path on the crosswalk.
In this implementation, the multi-frame images may also include traffic light information, the crosswalk length, motor vehicle information related to the crosswalk, and so on, so that the user's safety is ensured through the output of the first indication information.
Optionally, in a possible implementation of the first aspect, before determining the first route, the method further includes: starting a first application based on a user operation, the first application being used by the electronic device to acquire the multi-frame images from the user's forward-facing perspective.
In this implementation, capturing the multi-frame images of the view ahead of the user is started by a user operation, reducing the energy consumed in scenarios where no crosswalk is being crossed.
Optionally, in a possible implementation of the first aspect, the expression forms of the first indication information include voice and/or mark information.
In this implementation, when the user is blind, the first indication information may be voice, so that a blind user can cross the crosswalk safely by listening to it. When the user is deaf or is looking down at the electronic device, the first indication information may be mark information, so that such a user can cross the crosswalk safely by viewing it.
A second aspect of the embodiments of the present application provides an electronic device for acquiring multiple frames of images from the user's forward-facing perspective. The electronic device includes: a determination unit, configured to determine a first route if a crosswalk is detected in the multi-frame images, the first route being a safe traveling route on the crosswalk; an acquisition unit, configured to acquire, based on the multi-frame images, a second route actually walked by the user; and an output unit, configured to output first indication information to the user if the angle between the first route and the second route in the same reference system is greater than or equal to a preset angle, the first indication information indicating that the second route is abnormal.
Optionally, in a possible implementation of the second aspect, the first pose in which the electronic device acquires the multi-frame images includes a first pitch angle ranging from 60 to 100 degrees and a first tilt angle ranging from 60 to 120 degrees.
Optionally, in a possible implementation of the second aspect, the acquisition unit is further configured to acquire a second pose of the electronic device, where a second pitch angle in the second pose differs from the first pitch angle, or a second tilt angle in the second pose differs from the first tilt angle; and the output unit is further configured to output second indication information, which is used by the user to adjust the second pose to the first pose.
Optionally, in a possible implementation of the second aspect, the first route is the bisector of the angle formed by the extension directions of the two side edges of the crosswalk, or the first route is parallel to an edge of the crosswalk.
Optionally, in a possible implementation of the second aspect, the determination unit is further configured to determine that the area of the crosswalk in the multi-frame images gradually decreases.
Optionally, in a possible implementation of the second aspect, the movement route of the electronic device while capturing the multi-frame images is the second route, and the movement route is parallel to the ground where the crosswalk is located.
Optionally, in a possible implementation of the second aspect, the output unit is further configured to output third indication information to the user if a new image acquired by the electronic device does not include the crosswalk, the third indication information indicating that the user has passed the crosswalk.
Optionally, in a possible implementation of the second aspect, the multi-frame images further include traffic light information, the length of the crosswalk, and motor vehicle information related to the crosswalk; the traffic light information includes the color and duration of the lights, and the motor vehicle information includes the traveling direction and distance of motor vehicles relative to the crosswalk; the first indication information is further used to indicate the user's movement path on the crosswalk.
Optionally, in a possible implementation of the second aspect, the electronic device further includes: a starting unit, configured to start a first application based on a user operation, the first application being used by the electronic device to acquire the multi-frame images from the user's forward-facing perspective.
Optionally, in a possible implementation of the second aspect, the expression forms of the first indication information include voice and/or mark information.
A third aspect of the embodiments of the present application provides an electronic device, including a processor coupled to a memory. The memory stores a program or instructions that, when executed by the processor, cause the electronic device to implement the method in the first aspect or any possible implementation thereof.
A fourth aspect of the embodiments of the present application provides a computer-readable medium storing a computer program or instructions that, when run on a computer, cause the computer to perform the method in the first aspect or any possible implementation thereof.
A fifth aspect of the embodiments of the present application provides a computer program product that, when executed on a computer, causes the computer to perform the method in the first aspect or any possible implementation thereof.
For the technical effects of the second, third, fourth, and fifth aspects or any of their possible implementations, reference may be made to those of the first aspect and its different possible implementations, which are not repeated here.
As can be seen from the above technical solutions, the embodiments of the present application have the following advantage: by determining the angle between the safe traveling route and the user's actual traveling route, and outputting first indication information to the user when the angle is greater than or equal to the preset angle to indicate that the user's actual traveling direction is abnormal, the safety of a user who deviates from the crosswalk while moving is improved through the output first indication information.
Figure 1 is a schematic diagram of an application scenario provided by an embodiment of the present invention;
Figure 2 is a schematic structural diagram of an electronic device provided by an embodiment of the present application;
Figure 3 is a schematic flowchart of an image processing method provided by an embodiment of the present application;
Figure 4 is an example diagram of an image provided by an embodiment of the present application;
Figure 5 is an example diagram of another image provided by an embodiment of the present application;
Figure 6 is an example diagram of a crosswalk provided by an embodiment of the present application;
Figure 7 is an example diagram of another crosswalk provided by an embodiment of the present application;
Figure 8 is an example diagram of another crosswalk provided by an embodiment of the present application;
Figure 9 is an example diagram of the attitude angles of an electronic device provided by an embodiment of the present application;
Figure 10 is an example diagram of a first route and a second route provided by an embodiment of the present application;
Figure 11 is an example diagram of another first route and second route provided by an embodiment of the present application;
Figure 12 is an example diagram of first indication information provided by an embodiment of the present application;
Figure 13 is an example diagram of a scenario in which an electronic device is in a second pose, provided by an embodiment of the present application;
Figure 14 is an example diagram of a scenario in which an electronic device is in a first pose, provided by an embodiment of the present application;
Figure 15 is an example diagram of the smart recognition function in an electronic device provided by an embodiment of the present application;
Figure 16 is another schematic structural diagram of an electronic device provided by an embodiment of the present application;
Figure 17 is another schematic structural diagram of an electronic device provided by an embodiment of the present application.
The embodiments of the present application provide an image processing method and a related device. When a user deviates from a crosswalk while moving, the output first indication information improves the user's safety in crossing the crosswalk.
At present, visually impaired people rely mainly on a white cane when traveling, using it to detect obstacles on the road. Because the cane mainly probes obstacles ahead, in scenarios such as crosswalks with few or no obstacles, visually impaired people usually move according to travel experience. However, they commonly have difficulty walking in a straight line because of a poor sense of direction; especially at crosswalks, road conditions are complex, with motor vehicles and non-motor vehicles weaving through, so crossing based on travel experience alone carries considerable safety risk.
To solve the above technical problem, the embodiments of the present application provide an interface display method and an electronic device: the angle between the safe traveling route and the user's actual traveling route is determined, and when it is greater than or equal to a preset angle, first indication information is output to the user to indicate that the actual traveling direction is abnormal; thus, when the user deviates from the crosswalk while moving, the output first indication information improves the user's safety in crossing.
Before describing the method provided by the embodiments of the present application, its application scenario is described. As shown in Figure 1, the application scenario includes a user 001 and an electronic device 002.
User 001 uses the electronic device 002 to assist in crossing a crosswalk (also called a road or zebra crossing). User 001 may be a visually impaired user or an ordinary user; in other words, the user may be blind, deaf, or sighted.
Electronic device 002 runs an operating system and is loaded with applications (for example, smart recognition). Through user 001's operations, it can capture images, recognize the crosswalk (for example, zebra lines) in the images, and issue indication information (for example, warning information or correction information). The indication information may take the form of audio, visual images, and so on, which is not limited here.
The interface display method and electronic device provided by the embodiments of the present application are described in detail below with reference to the accompanying drawings.
First, the electronic device is described. The electronic device in the embodiments of the present application may be a mobile phone, smart watch, smart glasses, smart band, tablet (pad), portable game console, personal digital assistant (PDA), laptop, ultra mobile personal computer (UMPC), handheld computer, netbook, in-vehicle media playback device, wearable electronic device (for example, watch, band, glasses), virtual reality (VR) terminal device, augmented reality (AR) terminal device, or other digital product. The embodiments of the present application are described by taking a mobile phone as the electronic device only as an example.
Refer to Figure 2, a schematic structural diagram of an electronic device provided by an embodiment of the present application. As shown in Figure 2, the electronic device may include a processor 210, an external memory interface 220, an internal memory 221, a universal serial bus (USB) interface 230, a charging management module 240, a power management module 241, a battery 242, antenna 1, antenna 2, a mobile communication module 250, a wireless communication module 260, an audio module 270, a speaker 270A, a receiver 270B, a microphone 270C, a headset jack 270D, a sensor module 280, buttons 290, a motor 291, an indicator 292, a camera 293, a display 294, a subscriber identification module (SIM) card interface 295, and so on. The sensor module 280 may include a pressure sensor 280A, a gyroscope sensor 280B, a barometric pressure sensor 280C, a magnetic sensor 280D, an acceleration sensor 280E, a distance sensor 280F, a proximity light sensor 280G, a fingerprint sensor 280H, a temperature sensor 280J, a touch sensor 280K, an ambient light sensor 280L, a bone conduction sensor 280M, and so on.
It can be understood that the structure illustrated in this embodiment does not constitute a specific limitation on the electronic device. In other embodiments, the electronic device may include more or fewer components than shown, combine or split certain components, or arrange the components differently. The illustrated components may be implemented in hardware, software, or a combination of software and hardware.
The processor 210 may include one or more processing units; for example, the processor 210 may include an application processor (AP), a modem processor, a graphics processing unit (GPU), an image signal processor (ISP), a controller, a memory, a video codec, a digital signal processor (DSP), a baseband processor, and/or a neural-network processing unit (NPU). Different processing units may be separate devices or integrated into one or more processors.
The controller can be the nerve center and command center of the electronic device. It generates operation control signals according to instruction opcodes and timing signals, and controls instruction fetching and execution.
A memory may also be provided in the processor 210 for storing instructions and data. In some embodiments, this memory is a cache that holds instructions or data the processor 210 has just used or uses cyclically; if the processor 210 needs them again, it can call them directly from this memory, avoiding repeated accesses, reducing the processor 210's waiting time, and improving system efficiency.
In some embodiments, the processor 210 may include one or more interfaces, such as an inter-integrated circuit (I2C) interface, an inter-integrated circuit sound (I2S) interface, a pulse code modulation (PCM) interface, a universal asynchronous receiver/transmitter (UART) interface, a mobile industry processor interface (MIPI), a general-purpose input/output (GPIO) interface, a subscriber identity module (SIM) interface, and/or a universal serial bus (USB) interface.
It can be understood that the interface connections between the modules illustrated in this embodiment are only schematic and do not limit the structure of the electronic device; in other embodiments, different interface connections, or combinations of several connection modes, may be used.
The charging management module 240 receives charging input from a charger, which may be a wireless or a wired charger. In some wired charging embodiments, the charging management module 240 receives the charging input of a wired charger through the USB interface 230. In some wireless charging embodiments, it receives wireless charging input through the wireless charging coil of the electronic device. While charging the battery 242, the charging management module 240 can also power the electronic device through the power management module 241.
The power management module 241 connects the battery 242 and the charging management module 240 to the processor 210. It receives input from the battery 242 and/or the charging management module 240 and supplies power to the processor 210, internal memory 221, external memory, display 294, camera 293, wireless communication module 260, and so on. It can also monitor parameters such as battery capacity, battery cycle count, and battery health (leakage, impedance). In some other embodiments, the power management module 241 may be provided in the processor 210; in still others, the power management module 241 and the charging management module 240 may be provided in the same device.
The wireless communication functions of the electronic device are implemented through antenna 1, antenna 2, the mobile communication module 250, the wireless communication module 260, the modem processor, the baseband processor, and so on.
Antennas 1 and 2 transmit and receive electromagnetic wave signals. Each antenna in the electronic device can cover one or more communication bands, and different antennas can be multiplexed to improve utilization; for example, antenna 1 can be reused as the diversity antenna of a wireless local area network. In other embodiments, antennas may be used in combination with tuning switches.
The mobile communication module 250 can provide wireless communication solutions applied to the electronic device, including 2G/3G/4G/5G. The mobile communication module 250 may include at least one filter, switch, power amplifier, and low noise amplifier (LNA). It can receive electromagnetic waves via antenna 1, filter and amplify them, and pass them to the modem processor for demodulation; it can also amplify signals modulated by the modem processor and radiate them as electromagnetic waves via antenna 1. In some embodiments, at least some functional modules of the mobile communication module 250 may be provided in the processor 210, or in the same device as at least some modules of the processor 210.
The modem processor may include a modulator and a demodulator. The modulator modulates low-frequency baseband signals to be transmitted into medium/high-frequency signals; the demodulator demodulates received electromagnetic wave signals into low-frequency baseband signals and passes them to the baseband processor, after whose processing they are passed to the application processor. The application processor outputs sound signals through audio devices (not limited to the speaker 270A and receiver 270B) or displays images or video through the display 294. In some embodiments, the modem processor may be a separate device; in others, it may be independent of the processor 210 and provided in the same device as the mobile communication module 250 or other functional modules.
The wireless communication module 260 can provide wireless communication solutions applied to the electronic device, including wireless local area networks (WLAN) (such as wireless fidelity (Wi-Fi) networks), Bluetooth (BT), global navigation satellite systems (GNSS), frequency modulation (FM), near field communication (NFC), and infrared (IR). The wireless communication module 260 may be one or more devices integrating at least one communication processing module. It receives electromagnetic waves via antenna 2, frequency-modulates and filters the signals, and sends the processed signals to the processor 210; it can also receive signals to be transmitted from the processor 210, frequency-modulate and amplify them, and radiate them as electromagnetic waves via antenna 2.
In some embodiments, antenna 1 of the electronic device is coupled to the mobile communication module 250 and antenna 2 to the wireless communication module 260, so that the electronic device can communicate with networks and other devices through wireless communication technologies. These technologies may include global system for mobile communications (GSM), general packet radio service (GPRS), code division multiple access (CDMA), wideband code division multiple access (WCDMA), time-division code division multiple access (TD-SCDMA), long term evolution (LTE), BT, GNSS, WLAN, NFC, FM, and/or IR. The GNSS may include the global positioning system (GPS), the global navigation satellite system (GLONASS), the BeiDou navigation satellite system (BDS), the quasi-zenith satellite system (QZSS), and/or satellite based augmentation systems (SBAS).
The electronic device implements the display function through the GPU, the display 294, the application processor, and so on. The GPU is a microprocessor for image processing that connects the display 294 and the application processor; it performs mathematical and geometric calculations for graphics rendering. The processor 210 may include one or more GPUs that execute program instructions to generate or change display information.
The display 294 displays images, video, and the like. The display 294 includes a display panel, which may be a liquid crystal display (LCD), an organic light-emitting diode (OLED), an active-matrix organic light-emitting diode (AMOLED), a flexible light-emitting diode (FLED), a MiniLED, a MicroLED, a Micro-OLED, quantum dot light emitting diodes (QLED), and so on.
The electronic device can implement the photographing function through the ISP, camera 293, video codec, GPU, display 294, application processor, and so on.
The ISP processes data fed back by the camera 293. For example, when taking a photo, the shutter opens, light is passed through the lens to the camera's photosensitive element, the optical signal is converted into an electrical signal, and the photosensitive element passes the electrical signal to the ISP, which converts it into an image visible to the naked eye. The ISP can also algorithmically optimize the noise, brightness, and skin tone of the image, as well as the exposure and color temperature of the shooting scene. In some embodiments, the ISP may be provided in the camera 293.
The camera 293 captures still images or video. An optical image of an object is generated through the lens and projected onto the photosensitive element, which may be a charge coupled device (CCD) or complementary metal-oxide-semiconductor (CMOS) phototransistor. The photosensitive element converts the optical signal into an electrical signal and passes it to the ISP, which converts it into a digital image signal; the ISP outputs the digital image signal to the DSP for processing, and the DSP converts it into image signals in standard RGB, YUV, or other formats. In some embodiments, the electronic device may include 1 or N cameras 293, N being a positive integer greater than 1.
The camera 293 can also be used so that, according to the perceived external environment and user actions, the electronic device provides the user with a personalized, contextual service experience; the camera 293 can obtain rich, accurate information that lets the electronic device perceive the external environment and the user's actions. Specifically, in the embodiments of the present application, the camera 293 can be used to identify whether the user of the electronic device is a first user or a second user.
The digital signal processor processes digital signals; besides digital image signals, it can process other digital signals. For example, when the electronic device selects a frequency point, the DSP performs a Fourier transform on the frequency point energy.
Video codecs compress or decompress digital video. The electronic device may support one or more video codecs, so that it can play or record video in multiple encoding formats, for example moving picture experts group (MPEG) 1, MPEG2, MPEG3, and MPEG4.
The NPU is a neural-network (NN) computing processor that, by drawing on the structure of biological neural networks, for example the transfer pattern between human brain neurons, processes input information quickly and can continuously self-learn. Applications involving intelligent cognition of the electronic device, such as image recognition, face recognition, speech recognition, and text understanding, can be implemented through the NPU.
The external memory interface 220 can connect an external memory card, such as a Micro SD card, to expand the storage capacity of the electronic device. The external memory card communicates with the processor 210 through the external memory interface 220 to implement data storage, for example saving music and video files on the external memory card.
The internal memory 221 can store computer-executable program code, which includes instructions. The processor 210 runs the instructions stored in the internal memory 221 to execute the various functional applications and data processing of the electronic device. For example, in the embodiments of the present application, the processor 210 may, by executing instructions stored in the internal memory 221, respond to the user's operation on the display 294 and show the corresponding display content. The internal memory 221 may include a program storage area and a data storage area: the program storage area may store the operating system and the applications required for at least one function (such as sound playback and image playback), and the data storage area may store data created during use of the electronic device (such as audio data and the phone book). In addition, the internal memory 221 may include high-speed random access memory and may also include non-volatile memory, such as at least one magnetic disk storage device, flash memory device, or universal flash storage (UFS).
The electronic device can implement audio functions, such as music playback and recording, through the audio module 270, speaker 270A, receiver 270B, microphone 270C, headset jack 270D, application processor, and so on.
The audio module 270 converts digital audio information into analog audio signal output, and analog audio input into digital audio signals; it can also encode and decode audio signals. In some embodiments, the audio module 270, or some of its functional modules, may be provided in the processor 210. The speaker 270A, also called the "horn", converts audio electrical signals into sound signals; the electronic device can play music or hands-free calls through the speaker 270A. The receiver 270B, also called the "earpiece", converts audio electrical signals into sound signals; when the electronic device answers a call or a voice message, the user can listen by holding the receiver 270B close to the ear. The microphone 270C, also called the "mic" or "voice tube", converts sound signals into electrical signals; when making a call, sending a voice message, or triggering functions of the electronic device through the voice assistant, the user can speak with the mouth close to the microphone 270C to input the sound signal. The electronic device may be provided with at least one microphone 270C; in other embodiments, two microphones 270C can additionally implement noise reduction, and three, four, or more microphones 270C can additionally identify sound sources and implement directional recording.
The headset jack 270D connects wired headsets. The headset jack 270D may be the USB interface 230, a 3.5 mm open mobile terminal platform (OMTP) standard interface, or a cellular telecommunications industry association of the USA (CTIA) standard interface.
The pressure sensor 280A senses pressure signals and can convert them into electrical signals. In some embodiments, the pressure sensor 280A may be provided on the display 294. There are many types of pressure sensors, such as resistive, inductive, and capacitive; a capacitive pressure sensor may include at least two parallel plates of conductive material whose capacitance changes when a force acts on the sensor, and the electronic device determines the pressure intensity from the change in capacitance. When a touch operation acts on the display 294, the electronic device detects the intensity of the touch operation through the pressure sensor 280A and can also calculate the touch position from its detection signal. In some embodiments, touch operations at the same position but with different intensities can correspond to different instructions: for example, a touch on the SMS application icon below a pressure threshold executes the instruction to view messages, while one at or above the threshold executes the instruction to create a new message.
The gyroscope sensor 280B can determine the motion attitude of the electronic device. In some embodiments, the angular velocity of the electronic device around three axes (that is, the x, y, and z axes) can be determined through the gyroscope sensor 280B. The gyroscope sensor 280B can be used for image stabilization: when the shutter is pressed, the sensor detects the shake angle of the electronic device, calculates the distance the lens module needs to compensate, and lets the lens counteract the shake through reverse motion. The gyroscope sensor 280B can also be used for navigation and motion-sensing game scenarios, and for measuring the rotation amplitude or movement distance of the electronic device.
The barometric pressure sensor 280C measures air pressure. In some embodiments, the electronic device calculates altitude from the pressure value measured by the barometric pressure sensor 280C, assisting positioning and navigation.
The magnetic sensor 280D includes a Hall sensor. The electronic device can detect the opening and closing of a flip holster with the magnetic sensor 280D. In some embodiments, when the electronic device is a flip phone, it can detect the opening and closing of the flip cover according to the magnetic sensor 280D, and then set features such as automatic unlocking on flip-open according to the detected state of the holster or cover.
The acceleration sensor 280E can detect the magnitude of the electronic device's acceleration in all directions (generally three axes). When the electronic device is stationary, it can detect the magnitude and direction of gravity. It can also identify the device's posture for applications such as landscape/portrait switching and pedometers, and can be used to measure the orientation (that is, the direction vector of the orientation) of the electronic device.
The distance sensor 280F measures distance. The electronic device can measure distance by infrared or laser; in some shooting scenarios, it can use the distance sensor 280F to achieve fast focusing.
The proximity light sensor 280G may include, for example, a light-emitting diode (LED) and a light detector such as a photodiode. The LED may be an infrared LED. The electronic device emits infrared light outward through the LED and uses the photodiode to detect infrared light reflected from nearby objects: when sufficient reflected light is detected, it can determine that an object is nearby; when insufficient, that there is none. The electronic device can use the proximity light sensor 280G to detect that the user holds the device close to the ear during a call, so the screen is automatically turned off to save power. The proximity light sensor 280G can also be used in holster mode and pocket mode for automatic unlocking and locking.
The ambient light sensor 280L senses ambient brightness. The electronic device can adaptively adjust the brightness of the display 294 according to the perceived ambient brightness. The ambient light sensor 280L can also be used to automatically adjust the white balance when taking photos, and can cooperate with the proximity light sensor 280G to detect whether the device is in a pocket to prevent accidental touches.
The fingerprint sensor 280H collects fingerprints. The electronic device can use the collected fingerprint characteristics to implement fingerprint unlocking, accessing application locks, fingerprint photographing, fingerprint call answering, and so on.
The temperature sensor 280J detects temperature. In some embodiments, the electronic device executes temperature handling strategies based on the temperature detected by the sensor 280J: for example, when the reported temperature exceeds a threshold, the device reduces the performance of a processor near the sensor to lower power consumption and implement thermal protection; in other embodiments, when the temperature is below another threshold, the device heats the battery 242 to avoid abnormal shutdown caused by low temperature; in still others, when the temperature is below yet another threshold, the device boosts the output voltage of the battery 242 to avoid abnormal low-temperature shutdown.
The touch sensor 280K is also called the "touch panel". The touch sensor 280K may be provided on the display 294, and together they form the touchscreen. The touch sensor 280K detects touch operations acting on or near it and can pass detected touch operations to the application processor to determine the touch event type; visual output related to the touch operation can be provided through the display 294. In other embodiments, the touch sensor 280K may be provided on the surface of the electronic device, at a position different from that of the display 294.
The bone conduction sensor 280M can acquire vibration signals. In some embodiments, the bone conduction sensor 280M can acquire the vibration signal of the vibrating bone of the human vocal part, and can also contact the human pulse to receive the blood pressure beat signal. In some embodiments, the bone conduction sensor 280M may be provided in a headset, forming a bone conduction headset. The audio module 270 can parse a voice signal from the vibration signal acquired by the bone conduction sensor 280M to implement a voice function, and the application processor can parse heart rate information from the blood pressure beat signal acquired by the bone conduction sensor 280M to implement heart rate detection.
The buttons 290 include a power button, volume buttons, and so on. The buttons 290 may be mechanical or touch-type. The electronic device can receive button inputs and generate key signal inputs related to user settings and function control of the electronic device.
The electronic device receives inputs and perceives its environment through the various sensors in the sensor module 280, the buttons 290, and/or the camera 293, and so on.
The motor 291 can generate vibration prompts, used for incoming call vibration and touch vibration feedback. For example, touch operations on different applications (such as photographing and audio playback) can correspond to different vibration feedback effects, as can touch operations on different areas of the display 294, and different application scenarios (such as time reminders, receiving messages, alarms, and games). Touch vibration feedback effects can also be customized.
The indicator 292 may be an indicator light used to indicate the charging state and battery change, and also to indicate messages, missed calls, notifications, and so on.
The SIM card interface 295 connects SIM cards. A SIM card can be inserted into or pulled out of the SIM card interface 295 to contact or separate from the electronic device. The electronic device can support 1 or N SIM card interfaces, N being a positive integer greater than 1. The SIM card interface 295 can support Nano SIM, Micro SIM, and SIM cards; multiple cards can be inserted into the same SIM card interface 295 at the same time, of the same or different types; and the interface can also be compatible with different types of SIM cards and with external memory cards. The electronic device interacts with the network through the SIM card to implement calls, data communication, and other functions. In some embodiments, the electronic device uses an eSIM, that is, an embedded SIM card, which can be embedded in the electronic device and cannot be separated from it.
The image processing method provided by the embodiments of the present application is described below, taking a mobile phone as the electronic device. The method may be executed by the electronic device or by a component of the electronic device (such as a processor, chip, or chip system); the electronic device is used to acquire multiple frames of images from the user's forward-facing perspective. As shown in Figure 3, the image processing method may include steps 301 to 303, described below.
Step 301: if a crosswalk is detected in the multi-frame images, determine a first route, the first route being a safe traveling route on the crosswalk.
In the embodiments of the present application, the electronic device may obtain the multi-frame images in many ways: by capturing/shooting them, by receiving them from other devices connected to the electronic device, by selecting them from a database, and so on, which is not limited here. This document only takes as a schematic example the electronic device acquiring the multi-frame images of the user's forward view in real time; in practical applications, the device may acquire them periodically, etc., which is not limited here.
Optionally, when applied to a pedestrian crossing scenario, the electronic device may be a portable electronic device carried by the pedestrian, such as a mobile phone, smart watch, or smart glasses.
If a crosswalk is detected in the multi-frame images, the first route, a safe traveling route on the crosswalk, is determined.
The first route in the embodiments of the present application may take several forms: it may be the extension direction of the crosswalk, or be understood as the direction perpendicular to the crosswalk stripes, and so on, which is not limited here. It should be noted that the first route need not be a single route; it may be multiple routes within a certain range, for example the routes drawn outward from the intersection point within the "angle" formed by the two outermost stripes of the crosswalk.
When the first route is the extension direction of the crosswalk, its determination depends on how complete the crosswalk is in the image. In one possible implementation, the image contains the complete crosswalk; the extension lines of the two side edges intersect to form an angle, and the direction of the angle bisector toward the intersection point is the first route (that is, the first route is the bisector of the angle formed by the extension directions of the two side edges of the crosswalk). In another possible implementation, the image contains an incomplete crosswalk; the first route is parallel to an edge of the crosswalk, or the direction perpendicular to the crosswalk stripes and away from the position of the electronic device is the first route, and so on, which is not limited here.
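The bisector construction, intersecting the extension lines of the two side edges and following the bisector toward the intersection, can be sketched with elementary 2D geometry. The point-plus-direction line representation below is an assumption for illustration, not the patent's algorithm:

```python
import math

def bisector_route(p1, d1, p2, d2):
    """First-route sketch: intersect the extensions of the two
    crosswalk side edges, each given as a point p and a direction d
    in ground/image coordinates, and return (intersection, bisector
    direction) pointing from the viewer toward the vanishing point.
    Returns None when the edges are parallel (the route is then
    simply parallel to an edge)."""
    # Solve p1 + t*d1 = p2 + s*d2 for t via a 2x2 determinant.
    det = d1[0] * (-d2[1]) - d1[1] * (-d2[0])
    if abs(det) < 1e-12:
        return None
    rx, ry = p2[0] - p1[0], p2[1] - p1[1]
    t = (rx * (-d2[1]) - ry * (-d2[0])) / det
    ix, iy = p1[0] + t * d1[0], p1[1] + t * d1[1]
    # The bisector of two unit directions is their normalized sum.
    n1 = math.hypot(*d1)
    n2 = math.hypot(*d2)
    bx = d1[0] / n1 + d2[0] / n2
    by = d1[1] / n1 + d2[1] / n2
    return (ix, iy), (bx, by)
```

For two symmetric converging edges, the returned bisector points straight ahead, matching the intuition that walking along it keeps the user centered on the crosswalk.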
For example, when the image contains the complete crosswalk it may be as shown in Figure 4, and when it contains part of the crosswalk it may be as shown in Figure 5. It can be understood that the multi-frame images acquired by the electronic device may include the images of both Figures 4 and 5, or of either, which is not limited here.
The crosswalk in the embodiments of the present application may be located at an intersection without traffic lights (for example, Figure 4), at an intersection with one traffic light (for example, Figure 6), at an intersection with one traffic light and a refuge island in the middle (for example, Figure 7), or at an intersection with two traffic lights and a refuge island in the middle (for example, Figure 8), and so on; the specific configuration of the crosswalk can be set according to actual needs and is not limited here.
In addition, the length, width, line spacing, and so on of the crosswalk in the embodiments of the present application may follow the provisions of the specification for urban road traffic signs and markings (GB 51038-2015).
Optionally, crosswalk lines should be a group of white parallel thick solid lines, with a line width of preferably 40 cm or 45 cm and a line spacing of preferably 60 cm and at most 80 cm. The crosswalk width should be greater than or equal to 3 m, widened in 1 m increments.
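Detected stripe geometry could be sanity-checked against these marking parameters to reject false crosswalk detections; a hedged sketch, where the exact tolerance policy is an assumption and not part of the patent:

```python
def marking_spec_ok(line_width_cm, gap_cm, crosswalk_width_m):
    """Check detected crosswalk geometry against the marking
    parameters quoted in the text: line width 40 or 45 cm, spacing
    up to 80 cm, overall width >= 3 m in 1 m increments. A sketch
    for filtering detector false positives."""
    width_ok = line_width_cm in (40, 45)
    gap_ok = gap_cm <= 80
    overall_ok = (crosswalk_width_m >= 3
                  and float(crosswalk_width_m).is_integer())
    return width_ok and gap_ok and overall_ok
```

A real detector would of course allow measurement tolerance around these nominal values rather than exact matches.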
Optionally, the width, form, and position of crosswalk lines should meet the following requirements: when the crosswalk line length exceeds 16 m, a refuge island should be provided at the median or at the boundary of the opposing lanes; the island's length should not be less than the crosswalk width, and its width should not be less than 2 m, or 1.5 m in difficult cases; the island should preferably be fitted with flexible traffic posts, safety protection, and other facilities.
The multi-frame images in the embodiments of the present application include objects, and the objects include a crosswalk, which is used by pedestrians to cross the roadway. It can be understood that an image among the multi-frame images may contain the complete crosswalk or part of it; one image may contain the complete crosswalk while another contains part of it, and so on, which is not limited here.
Optionally, the objects in the multi-frame images may further include traffic light information, the length of the crosswalk, motor vehicle information related to the crosswalk, and so on. The traffic light information may include the color and duration of the lights, and the motor vehicle information includes the traveling direction and distance of motor vehicles relative to the roadway.
Optionally, the area of the crosswalk in the multi-frame images may gradually decrease, which can be understood as the user deviating from the crosswalk.
Step 302: acquire, based on the multi-frame images, the second route actually walked by the user.
The second route in the embodiments of the present application may be determined in several ways: it may be the movement route of the electronic device parallel to the ground where the crosswalk is located, the movement route of the user holding the device, the route along the orientation of the device, the route along the centerline of the image away from the device's position, and so on, which is not limited here.
Optionally, the second route may be determined from the pose of the electronic device when capturing the images; in this case, the device may detect its pose through a gyroscope and/or acceleration sensor, etc.
The pose/orientation of the electronic device in the embodiments of the present application may refer to its attitude angles, which may include an azimuth angle and a tilt angle, or an azimuth angle, a tilt angle, and a pitch angle. The azimuth represents the rotation about the z axis, the tilt the rotation about the y axis, and the pitch the rotation about the x axis; the relationship between the device's orientation and the x, y, and z axes may be as shown in Figure 9.
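As a rough illustration of how pitch and tilt can be estimated from a gravity-only accelerometer reading; the axis conventions used here are assumptions and may differ from the patent's Figure 9:

```python
import math

def pitch_tilt_from_accel(ax, ay, az):
    """Estimate pitch (rotation about x) and tilt (rotation about y)
    in degrees from an accelerometer reading (ax, ay, az) dominated
    by gravity. A common small-motion approximation, not the
    patent's pose pipeline."""
    pitch = math.degrees(math.atan2(ay, math.hypot(ax, az)))
    tilt = math.degrees(math.atan2(-ax, az))
    return pitch, tilt
```

A phone lying flat yields roughly zero pitch and tilt, while holding it upright in portrait drives the pitch toward 90 degrees; a gyroscope would typically be fused in for stability during walking.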
For example, continuing with the image of Figure 4, the first and second routes may be as shown in Figure 10: the second route is the route along the orientation of the electronic device, and the first route is the route perpendicular to the crosswalk stripes and away from the device's position; the angle between the first and second routes in the same reference system (for example, the plane of the crosswalk) can then be determined to be 0 degrees.
For example, continuing with the image of Figure 5, the first and second routes may be as shown in Figure 11: the second route is the route along the orientation of the device, and the first route is the route perpendicular to the crosswalk stripes and away from the device's position; the angle between the first and second routes can then be determined to be α.
The first and second routes in the embodiments of the present application may be presented on the multi-frame images, specifically carried on each of the images. In addition, the first and second routes may be presented to the user as dynamic/static images or by voice, which is not limited here.
Step 303: if the angle between the first route and the second route in the same reference system is greater than or equal to a preset angle, output first indication information to the user, the first indication information indicating that the second route is abnormal.
After determining the angle between the first and second routes in the same reference system, the electronic device outputs the first indication information to the user when the angle is greater than or equal to the preset angle. The first indication information may be used to indicate that the movement direction is abnormal and/or to correct the movement direction (that is, to provide the user with the correct direction, for example prompting the user how far to shift left or right).
The reference system mentioned above may be a virtual reference system, a plane (for example, ground) reference system, the camera reference system of the electronic device, and so on, which is not limited here. For example, if the plane is the ground where the crosswalk is located, the angle between the first and second routes in the same reference system may be understood as the angle between the projection of the first route onto the ground and the second route. In addition, the first route may be understood as the safe movement route of the user/electronic device on the ground, and the second route as the actual movement route of the user/electronic device on the ground.
Optionally, while the user moves, the electronic device captures multi-frame images of the user's forward view, and the angle between the first and second routes in the same reference system is greater than or equal to the preset angle throughout a preset time period; that is, only when the angle remains at or above the preset angle for the whole period is the first indication information output to the user. In this way, the multi-frame images are used to judge whether the user's movement direction deviates from the crosswalk, preventing misjudgment caused by an excessive deviation in a single frame.
Optionally, the first indication information may be output to the user when the crosswalk area in the multi-frame images gradually decreases and the angle is greater than or equal to the preset angle. In other words, before outputting the first indication information, the electronic device determines that the crosswalk area in the multi-frame images is gradually decreasing; this, with the user standing at the crosswalk entrance, establishes whether the user really intends to walk the crosswalk.
The preset angle in the embodiments of the present application can be set according to actual needs, for example 45 degrees or 30 degrees, which is not limited here.
For example, taking the image of Figure 11 above, if α is greater than or equal to the preset angle, the first indication information is output to the user.
Optionally, if the objects in the image include traffic light information, crosswalk length, motor vehicle information related to the crosswalk, and so on, the first indication information may further indicate the user's movement path on the crosswalk, the traffic light information, obstacles affecting the user's movement on the crosswalk, and so on; the user can then determine the traffic light information, the movement path on the crosswalk, and/or the obstacles on the crosswalk from the first indication information.
The expression forms of the first indication information in the embodiments of the present application include at least one of the following, described separately below:
1. The first indication information is voice.
When the first indication information is voice, the user can judge whether there is an abnormality, and correct the movement direction, by listening to it.
For example, when used to indicate an abnormal movement direction, the first indication information may be the voice message "Please note, your movement direction is abnormal" or "Please note, you have deviated from the crosswalk", etc. When used to correct the movement direction, it may be "You have deviated from the crosswalk, please move to the right" or "You have deviated from the crosswalk, please move to the left", and so on. The first indication information may also be broadcast to the user continuously until the angle between the user's actual movement route and the safe traveling route falls below a certain angle.
For example, if the objects in the image include traffic light information, crosswalk length, and motor vehicle information related to the crosswalk, the first indication information may be: "The light is currently red and will turn green in 3 seconds", or "You are currently on the crosswalk, with 2 meters left to cross", or "There is a pedestrian ahead on your right, please take care", or "A motor vehicle is approaching on your right, please take care", and so on.
In this case, in a scenario where the user is blind, the blind user can judge from the voice whether the movement direction is abnormal and/or how to correct it; in other words, blind users can rely on the voice prompts to cross the crosswalk safely.
2. The first indication information is marker information presented on the image.

When the first indication information is marker information, the user can judge whether the route is abnormal and correct the direction of movement by looking down at it.

For example, the first indication information may be marker information presented on the image; by viewing the markers the user can determine the deviation of the actual direction of movement and correct it accordingly, improving the safety with which a user who is looking down at the electronic device crosses the crosswalk.

This case suits scenarios in which a user crosses a crosswalk while looking down at a phone, or in which a deaf user crosses a crosswalk: viewing the marker information presented on the image helps ensure that the head-down or deaf user crosses safely.

As an example, when the first indication information is marker information, it may be as shown in Figure 12.

Note that the above forms may also be combined; for example, the first indication information may include both speech and marker information, i.e., the marker information is displayed to the user while the speech is broadcast. The above forms are merely examples; in practice the first indication information may take other forms, and no limitation is imposed here.
In embodiments of this application, the angle between the safe route and the user's actual route is determined, and when the angle is greater than or equal to the preset angle, the first indication information is output to indicate that the user's actual direction of travel is abnormal; thus, when the user strays off the crosswalk while moving, the output first indication information improves the safety with which the user crosses the crosswalk.
Optionally, to ensure that the images acquired by the electronic device are more suitable, the pose of the electronic device may also be checked and/or adjusted before step 301.

In one possible implementation, the first pose of the electronic device when acquiring the multiple frames includes a first pitch angle and a first tilt angle, the first pitch angle ranging from 60 to 100 degrees and the first tilt angle from 60 to 120 degrees.

In another possible implementation, before acquiring the multiple frames, the electronic device first acquires the second pose of the user-carried electronic device and determines that the second pitch angle of the second pose differs from the first pitch angle, or that the second tilt angle of the second pose differs from the first tilt angle; in other words, the second pose is not suitable for image capture. In that case, second indication information may be output. The second pose falling outside the preset pose interval can be understood as a pose that does not satisfy the capture condition. The second indication information is used for the user to adjust the electronic device from the second pose to the first pose.
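The pose check above can be sketched directly from the ranges given (pitch 60 to 100 degrees, tilt 60 to 120 degrees); the function boundary is an assumption for illustration:

```python
def pose_ok(pitch_deg, tilt_deg):
    """Check the capture pose against the first-pose ranges given
    above. A False result would trigger the second indication
    information prompting the user to adjust the pose."""
    return 60 <= pitch_deg <= 100 and 60 <= tilt_deg <= 120
```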
No restriction is placed on the azimuth angle of the electronic device when capturing images in embodiments of this application.

As an example, the second pose of the electronic device is as shown in Figure 13; after the electronic device outputs the second indication information to the user, the second pose is adjusted to the first pose, as shown in Figure 14.

Optionally, after step 303, if a new image acquired by the electronic device does not include the crosswalk, third indication information is output to the user, the third indication information indicating that the user has passed through the crosswalk.

The first, second, and third indication information in embodiments of this application may take the form of speech, marker information, and so on; no limitation is imposed here.

In addition, before step 301, the electronic device may launch a first application based on an operation by the user, the first application being used by the electronic device to acquire the multiple frames of images from the user's forward-facing viewpoint. This function may also be called smart recognition; smart recognition may be a stand-alone application or a new feature of an existing application, for example a new working mode within the camera application. Smart recognition can be started in several ways, for example by selecting the smart-recognition function after opening the application, via a shortcut on a secondary home-screen page, via the voice assistant, or via a physical-button shortcut. For the convenience of visually impaired users, launch via the voice assistant or a physical-button shortcut may be preferred. After launch, the smart-recognition page is entered; Figure 15 shows one example of this page.
Having described the image processing method of the embodiments of this application, the electronic device of the embodiments is described below. Referring to Figure 16, one embodiment of the electronic device includes:

a determining unit 1601, configured to determine a first route if a crosswalk is detected in the multiple frames of images, the first route being a safe route across the crosswalk;

an acquiring unit 1602, configured to acquire, based on the multiple frames of images, a second route actually walked by the user; and

an output unit 1603, configured to output first indication information to the user if the angle between the first route and the second route in the same reference frame is greater than or equal to a preset angle, the first indication information indicating that the second route is abnormal.

Optionally, the electronic device may further include: a launching unit 1604, configured to launch a first application based on an operation by the user, the first application being used by the electronic device to acquire the multiple frames of images from the user's forward-facing viewpoint.

In this embodiment, the operations performed by the units of the electronic device are similar to those described in the embodiments of Figures 1 to 14 and are not repeated here.

In this embodiment, the angle between the safe route and the user's actual route is determined, and when the angle is greater than or equal to the preset angle the output unit 1603 outputs the first indication information to indicate that the user's actual direction of travel is abnormal; thus, when the user strays off the crosswalk while moving, the output first indication information improves the safety with which the user crosses the crosswalk.
Referring to Figure 17, a schematic structural diagram of another electronic device provided by this application. The electronic device may include a processor 1701, a memory 1702, and a communication port 1703, interconnected by lines, where the memory 1702 stores program instructions and data.

The memory 1702 stores the program instructions and data corresponding to the steps performed by the electronic device in the implementations shown in Figures 1 to 15.

The processor 1701 is configured to perform the steps performed by the electronic device in any of the embodiments shown in Figures 1 to 15.

The communication port 1703 may be used to receive and send data, and to perform the steps related to acquiring, sending, and receiving in any of the embodiments shown in Figures 1 to 15.

In one implementation, the electronic device may include more or fewer components than in Figure 17; this application gives this only as an example and imposes no limitation.
In the several embodiments provided in this application, it should be understood that the disclosed systems, apparatuses, and methods may be implemented in other ways. For example, the apparatus embodiments described above are merely illustrative; the division into units is only a logical functional division, and other divisions are possible in actual implementation: multiple units or components may be combined or integrated into another system, or some features may be omitted or not performed. Furthermore, the mutual couplings, direct couplings, or communication connections shown or discussed may be indirect couplings or communication connections through interfaces, apparatuses, or units, and may be electrical, mechanical, or in other forms.

The units described as separate components may or may not be physically separate, and components shown as units may or may not be physical units; they may be located in one place or distributed across multiple network units. Some or all of the units may be selected as actually needed to achieve the purpose of the solution of this embodiment.

In addition, the functional units in the embodiments of this application may be integrated into one processing unit, may exist physically separately, or two or more units may be integrated into one unit. The integrated unit may be implemented wholly or partly in software, hardware, firmware, or any combination thereof.

When the integrated unit is implemented in software, it may be implemented wholly or partly in the form of a computer program product. The computer program product includes one or more computer instructions. When the computer program instructions are loaded and executed on a computer, the processes or functions according to the embodiments of the present invention are produced in whole or in part. The computer may be a general-purpose computer, a special-purpose computer, a computer network, or another programmable apparatus. The computer instructions may be stored in a computer-readable storage medium or transmitted from one computer-readable storage medium to another; for example, they may be transmitted from one website, computer, server, or data center to another by wire (for example coaxial cable, optical fiber, digital subscriber line (DSL)) or wirelessly (for example infrared, radio, microwave). The computer-readable storage medium may be any available medium accessible to a computer, or a data storage device such as a server or data center integrating one or more available media. The available medium may be a magnetic medium (for example a floppy disk, hard disk, or magnetic tape), an optical medium (for example a DVD), or a semiconductor medium (for example a solid state disk (SSD)), among others.

The terms "first", "second", and the like in the specification, claims, and drawings of this application are used to distinguish similar objects and are not necessarily used to describe a particular order or sequence. It should be understood that terms so used are interchangeable where appropriate; this is merely the way objects of the same attribute are distinguished in describing the embodiments of this application. Moreover, the terms "comprise" and "have" and any variants thereof are intended to cover non-exclusive inclusion, so that a process, method, system, product, or device comprising a series of units is not necessarily limited to those units but may include other units not expressly listed or inherent to the process, method, product, or device.

In the specification, claims, and drawings of this application, "at least one" means one or more, and "multiple" means two or more. "And/or" describes an association between associated objects and indicates that three relationships may exist; for example, "A and/or B" covers the cases of A alone, both A and B, and B alone, where A and B may be singular or plural. The character "/" generally indicates an "or" relationship between the associated objects before and after it. "At least one of the following" or similar expressions refers to any combination of the listed items, including any combination of single or plural items. For example, "at least one of a, b, or c" may denote a, b, c, a-b, a-c, b-c, or a-b-c, where a, b, and c may each be single or multiple.
Claims (23)
- An image processing method, characterized in that the method is applied to an electronic device configured to acquire multiple frames of images from a user's forward-facing viewpoint, the method comprising: if a crosswalk is detected in the multiple frames of images, determining a first route, the first route being a safe route across the crosswalk; acquiring, based on the multiple frames of images, a second route actually walked by the user; and if the angle between the first route and the second route in the same reference frame is greater than or equal to a preset angle, outputting first indication information to the user, the first indication information indicating that the second route is abnormal.
- The method according to claim 1, characterized in that a first pose of the electronic device when acquiring the multiple frames of images comprises a first pitch angle and a first tilt angle, the first pitch angle ranging from 60 to 100 degrees and the first tilt angle ranging from 60 to 120 degrees.
- The method according to claim 2, characterized in that the method further comprises: acquiring a second pose of the electronic device, wherein a second pitch angle of the second pose differs from the first pitch angle, or a second tilt angle of the second pose differs from the first tilt angle; and outputting second indication information, the second indication information being used for the user to adjust the second pose to the first pose.
- The method according to claim 2 or 3, characterized in that the first route is the bisector of the angle formed by the extension directions of the two side edges of the crosswalk, or the first route is parallel to an edge of the crosswalk.
- The method according to any one of claims 1 to 4, characterized in that, before outputting the first indication information to the user, the method further comprises: determining that the area of the crosswalk in the multiple frames of images decreases progressively.
- The method according to any one of claims 1 to 5, characterized in that the movement route of the electronic device when capturing the multiple frames of images is the second route, the movement route being parallel to the ground on which the crosswalk lies.
- The method according to any one of claims 1 to 6, characterized in that, after outputting the first indication information to the user, the method further comprises: if a new image acquired by the electronic device does not include the crosswalk, outputting third indication information to the user, the third indication information indicating that the user has passed through the crosswalk.
- The method according to any one of claims 1 to 7, characterized in that the multiple frames of images further include traffic-light information, the length of the crosswalk, and motor-vehicle information related to the crosswalk; the traffic-light information includes the color and duration of the traffic light, and the motor-vehicle information includes the travel direction and distance of a motor vehicle relative to the crosswalk; and the first indication information is further used to indicate the user's movement path on the crosswalk.
- The method according to any one of claims 1 to 8, characterized in that, before determining the first route, the method further comprises: launching a first application based on an operation by the user, the first application being used by the electronic device to acquire the multiple frames of images from the user's forward-facing viewpoint.
- The method according to any one of claims 1 to 9, characterized in that the first indication information takes the form of speech and/or marker information.
- An electronic device, characterized in that the electronic device is configured to acquire multiple frames of images from a user's forward-facing viewpoint, the electronic device comprising: a determining unit, configured to determine a first route if a crosswalk is detected in the multiple frames of images, the first route being a safe route across the crosswalk; an acquiring unit, configured to acquire, based on the multiple frames of images, a second route actually walked by the user; and an output unit, configured to output first indication information to the user if the angle between the first route and the second route in the same reference frame is greater than or equal to a preset angle, the first indication information indicating that the second route is abnormal.
- The electronic device according to claim 11, characterized in that a first pose of the electronic device when acquiring the multiple frames of images comprises a first pitch angle and a first tilt angle, the first pitch angle ranging from 60 to 100 degrees and the first tilt angle ranging from 60 to 120 degrees.
- The electronic device according to claim 12, characterized in that the acquiring unit is further configured to acquire a second pose of the electronic device, wherein a second pitch angle of the second pose differs from the first pitch angle, or a second tilt angle of the second pose differs from the first tilt angle; and the output unit is further configured to output second indication information, the second indication information being used for the user to adjust the second pose to the first pose.
- The electronic device according to claim 12 or 13, characterized in that the first route is the bisector of the angle formed by the extension directions of the two side edges of the crosswalk, or the first route is parallel to an edge of the crosswalk.
- The electronic device according to any one of claims 11 to 14, characterized in that the determining unit is further configured to determine that the area of the crosswalk in the multiple frames of images decreases progressively.
- The electronic device according to any one of claims 11 to 15, characterized in that the movement route of the electronic device when capturing the multiple frames of images is the second route, the movement route being parallel to the ground on which the crosswalk lies.
- The electronic device according to any one of claims 11 to 16, characterized in that the output unit is further configured to output third indication information to the user if a new image acquired by the electronic device does not include the crosswalk, the third indication information indicating that the user has passed through the crosswalk.
- The electronic device according to any one of claims 11 to 17, characterized in that the multiple frames of images further include traffic-light information, the length of the crosswalk, and motor-vehicle information related to the crosswalk; the traffic-light information includes the color and duration of the traffic light, and the motor-vehicle information includes the travel direction and distance of a motor vehicle relative to the crosswalk; and the first indication information is further used to indicate the user's movement path on the crosswalk.
- The electronic device according to any one of claims 11 to 18, characterized in that the electronic device further comprises: a launching unit, configured to launch a first application based on an operation by the user, the first application being used by the electronic device to acquire the multiple frames of images from the user's forward-facing viewpoint.
- The electronic device according to any one of claims 11 to 19, characterized in that the first indication information takes the form of speech and/or marker information.
- An electronic device, characterized by comprising: a processor coupled to a memory, the memory being configured to store a program or instructions which, when executed by the processor, cause the electronic device to perform the method according to any one of claims 1 to 10.
- A computer storage medium, characterized by comprising computer instructions which, when run on a terminal device, cause the terminal device to perform the method according to any one of claims 1 to 10.
- A computer program product, characterized in that, when the computer program product runs on a computer, it causes the computer to perform the method according to any one of claims 1 to 10.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
EP23787558.8A EP4435720A1 (en) | 2022-04-13 | 2023-04-04 | Image processing method and related device |
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN202210384608.3 | 2022-04-13 | ||
CN202210384608.3A CN116959228A (zh) | 2022-04-13 | 2022-04-13 | 一种图像处理方法及相关设备 |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2023197913A1 true WO2023197913A1 (zh) | 2023-10-19 |
Family
ID=88328923
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/CN2023/086213 WO2023197913A1 (zh) | 2022-04-13 | 2023-04-04 | 一种图像处理方法及相关设备 |
Country Status (3)
Country | Link |
---|---|
EP (1) | EP4435720A1 (zh) |
CN (1) | CN116959228A (zh) |
WO (1) | WO2023197913A1 (zh) |
Citations (8)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20050275718A1 (en) * | 2004-06-11 | 2005-12-15 | Oriental Institute Of Technology And Far Eastern Memorial Hospital | Apparatus and method for identifying surrounding environment by means of image processing and for outputting the results |
CN106652505A (zh) * | 2017-01-22 | 2017-05-10 | 吉林大学 | 一种基于智能眼镜的视觉障碍行人过街引导系统 |
CN106821694A (zh) * | 2017-01-18 | 2017-06-13 | 西南大学 | 一种基于智能手机的移动导盲系统 |
CN106901956A (zh) * | 2017-03-03 | 2017-06-30 | 西南大学 | 一种用于人行横道线处的盲人导航方法及装置 |
CN108305458A (zh) * | 2018-03-10 | 2018-07-20 | 张定宇 | 一种引导盲人路口通行的系统及其装置 |
CN211461094U (zh) * | 2019-11-19 | 2020-09-11 | 科大讯飞股份有限公司 | 导盲杖 |
CN112674998A (zh) * | 2020-12-23 | 2021-04-20 | 北京工业大学 | 基于快速深度神经网络和移动智能设备的盲人交通路口辅助方法 |
CN113101155A (zh) * | 2021-03-31 | 2021-07-13 | 电子科技大学成都学院 | 一种基于机器视觉的红绿灯路口导盲方法及导盲装置 |
Application events:
- 2022-04-13: CN application CN202210384608.3A filed (CN116959228A, status: pending)
- 2023-04-04: EP application EP23787558.8A filed (EP4435720A1, status: pending)
- 2023-04-04: PCT application PCT/CN2023/086213 filed (WO2023197913A1, application filing)
Also Published As
Publication number | Publication date |
---|---|
EP4435720A1 (en) | 2024-09-25 |
CN116959228A (zh) | 2023-10-27 |
Legal Events:
- 121: The EPO has been informed by WIPO that EP was designated in this application (ref document number: 23787558; country: EP; kind code: A1)
- WWE (WIPO information: entry into national phase): ref document number 2023787558; country: EP
- ENP (Entry into the national phase): ref document number 2023787558; country: EP; effective date: 2024-06-20
Ref document number: 2023787558 Country of ref document: EP Effective date: 20240620 |