US20210150233A1 - Information processing apparatus, image delivery system, information processing method, and computer-readable recording medium - Google Patents
Information processing apparatus, image delivery system, information processing method, and computer-readable recording medium Download PDFInfo
- Publication number
- US20210150233A1 (U.S. application Ser. No. 17/160,796)
- Authority
- US
- United States
- Prior art keywords
- image
- vehicle
- taken
- information
- information processing
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Abandoned
Classifications
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V20/00—Scenes; Scene-specific elements
- G06V20/50—Context or environment of the image
- G06V20/56—Context or environment of the image exterior to a vehicle by using sensors mounted on the vehicle
- G06V20/58—Recognition of moving objects or obstacles, e.g. vehicles or pedestrians; Recognition of traffic objects, e.g. traffic signs, traffic lights or roads
- G06V20/584—Recognition of moving objects or obstacles, e.g. vehicles or pedestrians; Recognition of traffic objects, e.g. traffic signs, traffic lights or roads of vehicle lights or traffic lights
- G06K9/00825
- B—PERFORMING OPERATIONS; TRANSPORTING
- B60—VEHICLES IN GENERAL
- B60R—VEHICLES, VEHICLE FITTINGS, OR VEHICLE PARTS, NOT OTHERWISE PROVIDED FOR
- B60R1/00—Optical viewing arrangements; Real-time viewing arrangements for drivers or passengers using optical image capturing systems, e.g. cameras or video systems specially adapted for use in or on vehicles
- B—PERFORMING OPERATIONS; TRANSPORTING
- B60—VEHICLES IN GENERAL
- B60R—VEHICLES, VEHICLE FITTINGS, OR VEHICLE PARTS, NOT OTHERWISE PROVIDED FOR
- B60R16/00—Electric or fluid circuits specially adapted for vehicles and not otherwise provided for; Arrangement of elements of electric or fluid circuits specially adapted for vehicles and not otherwise provided for
- B60R16/02—Electric or fluid circuits specially adapted for vehicles and not otherwise provided for; Arrangement of elements of electric or fluid circuits specially adapted for vehicles and not otherwise provided for electric constitutive elements
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06Q—INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES; SYSTEMS OR METHODS SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES, NOT OTHERWISE PROVIDED FOR
- G06Q30/00—Commerce
- G06Q30/02—Marketing; Price estimation or determination; Fundraising
- G06Q30/0283—Price estimation or determination
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T7/00—Image analysis
- G06T7/20—Analysis of motion
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V20/00—Scenes; Scene-specific elements
- G06V20/50—Context or environment of the image
- G06V20/56—Context or environment of the image exterior to a vehicle by using sensors mounted on the vehicle
- G—PHYSICS
- G08—SIGNALLING
- G08G—TRAFFIC CONTROL SYSTEMS
- G08G1/00—Traffic control systems for road vehicles
- G08G1/01—Detecting movement of traffic to be counted or controlled
- G08G1/017—Detecting movement of traffic to be counted or controlled identifying vehicles
- G08G1/0175—Detecting movement of traffic to be counted or controlled identifying vehicles by photographing vehicles, e.g. when violating traffic rules
- G—PHYSICS
- G08—SIGNALLING
- G08G—TRAFFIC CONTROL SYSTEMS
- G08G1/00—Traffic control systems for road vehicles
- G08G1/01—Detecting movement of traffic to be counted or controlled
- G08G1/04—Detecting movement of traffic to be counted or controlled using optical or ultrasonic detectors
- G—PHYSICS
- G08—SIGNALLING
- G08G—TRAFFIC CONTROL SYSTEMS
- G08G1/00—Traffic control systems for road vehicles
- G08G1/123—Traffic control systems for road vehicles indicating the position of vehicles, e.g. scheduled vehicles; Managing passenger vehicles circulating according to a fixed timetable, e.g. buses, trains, trams
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04L—TRANSMISSION OF DIGITAL INFORMATION, e.g. TELEGRAPHIC COMMUNICATION
- H04L67/00—Network arrangements or protocols for supporting network services or applications
- H04L67/01—Protocols
- H04L67/12—Protocols specially adapted for proprietary or special-purpose networking environments, e.g. medical networks, sensor networks, networks in vehicles or remote metering networks
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N7/00—Television systems
- H04N7/18—Closed-circuit television [CCTV] systems, i.e. systems in which the video signal is not broadcast
- G06K2209/23
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2207/00—Indexing scheme for image analysis or image enhancement
- G06T2207/30—Subject of image; Context of image processing
- G06T2207/30248—Vehicle exterior or interior
- G06T2207/30252—Vehicle exterior; Vicinity of vehicle
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V2201/00—Indexing scheme relating to image or video recognition or understanding
- G06V2201/08—Detecting or categorising vehicles
Definitions
- the present invention relates to an information processing apparatus, an image delivery system, an information processing method, and a computer-readable recording medium.
- in a known technique, an image taken (by photography) by a traveling vehicle is uploaded to a server, and a scene at the place where the image is taken is reported to a user who is remote from that place.
- for example, an image of the surroundings of the vehicle is uploaded to an image server that delivers the image to the user (see Japanese Laid-Open Patent Application No. 2007-207260).
- an information processing apparatus includes at least one processor configured to determine, on the basis of information concerning traveling of a vehicle obtained when an image is taken by an image pickup unit of the vehicle, a scene present when the image is taken; and transmit information indicating the determined scene and the image to a terminal connected to the information processing apparatus via a network.
- FIG. 1 illustrates a configuration example of an image delivery system according to an embodiment
- FIG. 2 illustrates a hardware configuration example of a server according to the embodiment
- FIG. 3 is one example of a functional block diagram of the server and a terminal according to the embodiment
- FIG. 4 is a sequence diagram illustrating one example of processes of the image delivery system according to the embodiment.
- FIG. 5 illustrates one example of image information according to the embodiment.
- FIG. 6 illustrates an example of a display screen of a terminal for a user to purchase an image or the like.
- An embodiment of the present invention has been devised in consideration of this point, and an object of the embodiment is to provide a technology enabling a relatively easy selection of an image to be delivered from among a plurality of images.
- An information processing apparatus includes at least one processor configured to determine, on the basis of information concerning traveling of a vehicle obtained when an image is taken (by photography) by an image pickup unit of the vehicle, a scene present when the image is taken; and transmit information indicating the determined scene and the image to a terminal connected to the information processing apparatus via a network.
- a user can select an image to be delivered on the basis of information indicating a scene. Accordingly, a user can relatively easily select an image to be delivered from among a plurality of images.
- the information concerning traveling of the vehicle includes at least one of a traveling speed of the vehicle, an acceleration of the vehicle, information concerning a driver's driving operation of the vehicle, and information indicating actuating of a predetermined traveling function of the vehicle.
- a user can select an image to be delivered on the basis of information indicating a scene determined on the basis of information concerning traveling of the vehicle. Accordingly, a user can relatively easily select an image to be delivered from among a plurality of images.
- the at least one processor is further configured to determine the scene present when the image is taken on the basis of a result of image recognition of the image and information of surroundings of the vehicle corresponding to a date and time when the image is taken and a position where the image is taken.
- a user can select an image to be delivered on the basis of information indicating a scene determined on the basis of information of a disaster or the like occurring in the surroundings of the vehicle and an object detected from the image. Accordingly, a user can relatively easily select an image to be delivered from among a plurality of images.
- the at least one processor is further configured to decide a sales price of the image on the basis of the determined scene, and transmit, to the terminal, the image with the information indicating the determined scene and information indicating the sales price attached to the image.
- a user can purchase an image to be delivered at a price decided on the basis of information indicating a scene. Accordingly, a user can relatively easily select an image to be delivered from among a plurality of images, and also, can pay a price corresponding to a scene of the image to a user or the like who has provided the image.
- the at least one processor is further configured to decide the sales price on the basis of an occurrence frequency of the determined scene.
- a user can purchase an image to be delivered at a price decided on the basis of an occurrence frequency of a scene of the image. Accordingly, a user can relatively easily select an image to be delivered from among a plurality of images, and also, can pay a price corresponding to an occurrence frequency of a scene of the image to a user or the like who has provided the image.
- the at least one processor is further configured to decide a higher sales price as a date and time when the image is taken is later or as a difference between the date and time when the image is taken and a date and time when a predetermined event occurs at a position where the image is taken is smaller.
- a user can purchase an image to be delivered at a price decided on the basis of date and time when the image is taken. Accordingly, a user can relatively easily select an image to be delivered from among a plurality of images, and also, can pay a price corresponding to date and time when the image is taken to a user or the like who has provided the image.
- a user can relatively easily select an image to be delivered from among a plurality of images.
- FIG. 1 illustrates a configuration example of an image delivery system according to the embodiment.
- the image delivery system 1 includes a server 10, terminals 20-1 and 20-2 (hereinafter, for a case where the terminals need not be distinguished therebetween, each of which will be simply referred to as a "terminal 20"), an in-vehicle apparatus 30 of a vehicle 301, and an external server 40.
- the number of the terminals 20 is not limited to two.
- the server 10 and each of the terminals 20, the in-vehicle apparatus 30, and the external server 40 are connected so as to communicate with one another through, for example, a network 50 such as the Internet, a mobile telephone network, a wireless LAN (Local Area Network), a LAN, or the like.
- the in-vehicle apparatus 30 is, for example, an ECU (Electronic Control Unit) or the like installed in the vehicle 301, and is connected to a drive recorder 60 (that is an example of an "image pickup unit"), a communication apparatus, and so forth.
- the in-vehicle apparatus 30 stores added information, such as the speed and the position of the vehicle 301, and a moving image taken (by photography) by the drive recorder 60 in a recording medium such as an SD card, and uploads the added information and the moving image to the server 10.
- the terminal 20 is, for example, an information processing apparatus (i.e., a computer) such as a smartphone, a tablet PC (Personal Computer), or a notebook-size PC.
- a terminal 20 transmits a frame selected by the user from among frames of a moving image taken (by photography) by the drive recorder 60 to the server 10 .
- a terminal 20 purchases from the server 10 an image retrieved on the basis of a tag specified by the user from among a plurality of images uploaded to the server 10.
- the server 10 is, for example, an information processing apparatus for a server, and provides a service such as image delivery to a terminal 20 .
- the server 10 attaches a tag to an image taken (by photography) by the vehicle 301 so that a terminal 20 can search for an image on the basis of a tag.
- the external server 40 responds to a request from the server 10 to deliver to the server 10 information of the climate at a predetermined date and time and place, information of a disaster (or an accident), or the like.
- FIG. 2 illustrates a hardware configuration example of the server 10 according to the embodiment.
- the server 10 of FIG. 2 includes a drive unit 100 , an auxiliary storage unit 102 , a memory unit 103 , a CPU 104 , an interface unit 105 , and so forth that are connected with each other by a bus B.
- An information processing program to implement processes of the server 10 is provided through, for example, a recording medium 101 .
- in response to the recording medium 101, in which the information processing program has been recorded, being set in the drive unit 100, the information processing program is installed in the auxiliary storage unit 102 from the recording medium 101 via the drive unit 100.
- the information processing program is not necessarily installed from the recording medium 101 and may instead be downloaded from another computer via a network.
- the auxiliary storage unit 102 stores the installed information processing program, and also stores necessary files, data, and so forth.
- the memory unit 103 is, for example, a RAM (Random Access Memory), and, in response to an input of an instruction to start a program, reads the program from the auxiliary storage unit 102 and stores the program.
- the CPU 104 implements functions of the server 10 according to the program stored in the memory unit 103 .
- the interface unit 105 is used as an interface to connect to a network.
- examples of the recording medium 101 include portable recording media such as a CD-ROM, a DVD, and a USB memory.
- Examples of the auxiliary storage unit 102 include an HDD (Hard Disk Drive), a flash memory, and so forth.
- Each of the recording medium 101 and the auxiliary storage unit 102 corresponds to a computer-readable recording medium.
- the hardware configurations of the terminals 20 , the in-vehicle apparatus 30 , and the external server 40 may be the same as or similar to the hardware configuration of the server 10 .
- FIG. 3 is one example of a functional block diagram of the server 10 and a terminal 20 according to the embodiment.
- the server 10 includes a storage part 11 .
- the storage part 11 is implemented by, for example, the auxiliary storage unit 102 .
- the storage part 11 stores image information 111 and so forth. Data included in the image information 111 will be described later.
- the server 10 further includes an obtaining part 12 , a determination part 13 , a decision part 14 , a provision part 15 , and a transmission and reception part 16 .
- the obtaining part 12 , the determination part 13 , the decision part 14 , the provision part 15 , and the transmission and reception part 16 represent functions implemented by processes performed by the CPU 104 of the server 10 according to one or more programs installed in the server 10 .
- the obtaining part 12 obtains an image that is taken by an image pickup unit (i.e., the drive recorder 60 ) of the in-vehicle apparatus 30 , and obtains the date and time, position, and information concerning traveling of the vehicle 301 , in which the in-vehicle apparatus 30 is installed, at a time when the image is taken.
- the determination part 13 determines, on the basis of information concerning traveling of the vehicle 301 , a scene present when an image is taken (by photography).
- examples of a scene present when an image is taken include scenes concerning driving and traveling of the vehicle 301 and scenes concerning traffic, such as an accident or a traffic jam occurring in the surroundings of the vehicle 301.
- the determination part 13 determines a scene present when an image is taken (by photography) on the basis of the image obtained by the obtaining part 12 and information of surroundings of the vehicle corresponding to the date and time and position at which the image is taken.
- the decision part 14 decides a sales price of an image on the basis of the scene determined by the determination part 13 and so forth.
- the decision part 14 determines the sales price of an image, for example, on the basis of the degree of rarity of the image, the degree of rarity of the scene, and so forth.
- the provision part 15 sells an image to the user of a terminal 20 at a sales price decided by the decision part 14 .
- the provision part 15 transmits an image and a tag corresponding to the image to the user of a terminal 20 who purchases the image.
- the transmission and reception part 16 performs communication with a terminal 20 , the in-vehicle apparatus 30 , or the external server 40 .
- the transmission and reception part 16 transmits, for example, an image with information indicating a scene determined by the determination part 13 attached to the image to a terminal 20 .
- Each of the terminals 20 includes a reception part 21 , a control part 22 , and a transmission and reception part 23 . These parts represent functions implemented by processes performed by a CPU of the terminal 20 according to one or more programs installed in the terminal 20 .
- the reception part 21 receives various operations performed by the user.
- the reception part 21 receives, for example, an operation performed by the user to perform an adjustment on an image to be sold via the server 10 , a tag, and so forth.
- the reception part 21 receives, for example, an operation performed by the user to search for an image sold via the server 10 , an operation performed by the user to purchase the image, and so forth.
- the control part 22 performs, for example, a process to display, on the basis of information received from the server 10 , the information on a display screen. In addition, the control part 22 performs various processes, for example, in response to the user's operations received via the reception part 21 .
- the transmission and reception part 23 performs communication with the server 10 according to an instruction that is input from the control part 22 .
- FIG. 4 is a sequence diagram illustrating one example of processes of the image delivery system 1 according to the embodiment.
- in step S1, the in-vehicle apparatus 30 stores an image (a moving image or a static image) taken (by photography) and added information obtained upon the taking of the image in a recording medium such as an SD card, and uploads the image and the added information to the server 10.
- the added information includes date and time information concerning the time at which the image is taken, position information concerning the position at which the image is taken, and information (i.e., vehicle information) concerning traveling of the vehicle 301 at the time when the image is taken.
- the position information may be, for example, information of a latitude and a longitude obtained from a GPS (Global Positioning System) or the like.
- Information concerning traveling of the vehicle 301 includes the speed of the vehicle 301 ; the acceleration of the vehicle 301 ; driving operation information such as information concerning a driver's brake operation, a driver's steering operation, a driver's accelerator operation, and so forth; and information concerning actuating of predetermined traveling functions such as actuating of a function of an ABS (Antilock Brake System) and a function of a TRC (TRaction Control) of the vehicle 301 .
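The added information described above can be sketched as a simple data structure. The following Python dataclasses are a hypothetical illustration only; the field names and units are assumptions, not part of the disclosed apparatus.

```python
from dataclasses import dataclass

@dataclass
class VehicleInfo:
    # information concerning traveling of the vehicle 301 (assumed field names)
    speed_kmh: float           # traveling speed of the vehicle
    acceleration_g: float      # acceleration of the vehicle
    brake_amount: float        # driver's brake operation (0.0 to 1.0)
    steering_amount: float     # driver's steering operation (0.0 to 1.0)
    accelerator_amount: float  # driver's accelerator operation (0.0 to 1.0)
    abs_active: bool           # ABS (Antilock Brake System) actuated
    trc_active: bool           # TRC (TRaction Control) actuated

@dataclass
class AddedInfo:
    taken_at: str              # date and time information (e.g. ISO 8601)
    latitude: float            # position information from a GPS or the like
    longitude: float
    vehicle: VehicleInfo       # information concerning traveling of the vehicle
```

A record like this would accompany each uploaded image so that the server 10 can later determine the scene and decide the sales price.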
- the in-vehicle apparatus 30 may start taking the moving image, for example, in response to turning on of the ACC (accessory) power supply, and may upload the moving image taken (by photography) to the server 10 in response to turning off of the ACC power supply.
- in response to a detection of a predetermined event or satisfaction of a condition provided by the server 10, the in-vehicle apparatus 30 may upload the image to the server 10.
- a detection of a predetermined event may be, for example, a detection of one or more events included in the vehicle information such as a detection of a predetermined driver's operation such as a sudden braking operation or an abrupt steering operation, or a detection of an acceleration greater than or equal to a predetermined threshold corresponding to a collision of the vehicle 301 or the like.
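The event check that triggers an upload can be sketched as follows; the thresholds and the helper name `detect_event` are illustrative assumptions, not values from the disclosure.

```python
# Assumed thresholds for the "predetermined event" detection; the actual
# values are not given in the disclosure.
COLLISION_ACCEL_G = 0.8   # acceleration corresponding to a collision or the like
SUDDEN_BRAKE = 0.9        # brake-operation amount treated as sudden braking
ABRUPT_STEER = 0.8        # steering-operation amount treated as abrupt steering

def detect_event(speed_kmh, accel_g, brake, steering):
    """Return the name of a detected event, or None when nothing triggers."""
    if abs(accel_g) >= COLLISION_ACCEL_G:
        return "possible collision"
    if brake >= SUDDEN_BRAKE:
        return "sudden braking"
    if steering >= ABRUPT_STEER and speed_kmh > 0:
        return "abrupt steering"
    return None
```

When `detect_event` returns a non-None value, the in-vehicle apparatus 30 would upload the corresponding image and added information to the server 10.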
- a condition provided by the server 10 may be, for example, a date and time, a position, or the like indicated by the server 10 .
- a terminal 20 may upload an image and added information of the image to the server 10 .
- the terminal 20 may read data of images from a recording medium such as an SD card in which images taken (by photography) by the in-vehicle apparatus 30 are recorded, display the read images on a display screen, and upload an image selected from the displayed images and the corresponding added information to the server 10.
- the determination part 13 of the server 10 performs an image recognition process on the image, obtained by the obtaining part 12 , to detect a predetermined object from the image (step S 2 ). More specifically, the determination part 13 of the server 10 detects from the image an object such as a pedestrian, a vehicle, a road cone, a bicycle, a motorcycle, a traffic light, or a traffic sign. In a case where the image is a moving image, the determination part 13 of the server 10 may perform the processes that will be described below on one or more frames included in the moving image (for example, each key frame not compressed among frames).
- the obtaining part 12 of the server 10 obtains from the external server 40 information of surroundings of the vehicle 301 (hereinafter referred to as "environmental information" as appropriate), obtained when the image is taken, corresponding to the date and time information and the position information included in the added information (step S3).
- the obtaining part 12 of the server 10 obtains, as the environmental information, climate information, traffic information, disaster information, and facility information obtained at the date and time and position at which the image is taken.
- the climate information may include information such as information of the ambient temperature, the humidity, the weather, or a typhoon.
- the traffic information may include information such as information of, for example, an accident, a traffic jam, or road construction work.
- the disaster information may include, for example, information of an earthquake, falling of a bluff, a fire, a tsunami, or a flood.
- the facility information may include information of, for example, a nearby store.
- the determination part 13 of the server 10 determines a scene (step S 4 ).
- the determination part 13 of the server 10 uses, for example, AI (Artificial Intelligence) or the like to determine a scene present when the image is taken on the basis of at least one of the result of image recognition of the image obtained from step S 2 , the environmental information of the image obtained from step S 3 , and the added information of the image.
- the determination part 13 of the server 10 may determine, for example, that the scene is a scene of an "accident of a rear-end collision during a standstill of the vehicle", for a case where, on the basis of the information concerning traveling of the vehicle 301 included in the added information, an acceleration greater than or equal to a predetermined threshold corresponding to a collision of the vehicle 301 or the like is detected while a brake operation has been performed by the driver of the vehicle 301 or the speed of the vehicle 301 has been zero.
- the determination part 13 of the server 10 may determine, for example, that the scene is a scene of "a skid of the vehicle during traveling of the vehicle at a high speed", for a case where, on the basis of the information concerning traveling of the vehicle 301, the speed of the vehicle 301 is greater than or equal to a predetermined threshold, the ABS is actuated, and a brake operation is performed by the driver of the vehicle 301 with a strength greater than or equal to a predetermined threshold.
- the determination part 13 of the server 10 may determine, for example, that the scene is a scene of "a collision accident due to a skid of the vehicle", for a case where the ABS is actuated and an acceleration greater than or equal to a predetermined threshold corresponding to a collision of the vehicle 301 or the like is detected.
- the determination part 13 of the server 10 may determine, for example, that the scene is a scene of "a rear-end collision accident due to drowsy driving or the like", for a case where the amount of a driver's steering operation is less than or equal to a predetermined threshold, the amount of a driver's brake operation is less than or equal to a predetermined threshold, and an acceleration greater than or equal to a predetermined threshold corresponding to a collision of the vehicle 301 is detected.
- the determination part 13 of the server 10 may determine, for example, that the scene is a scene of "an occurrence of an accident at the traffic intersection A", for a case where a traffic accident is detected from image recognition and the image is taken at the traffic intersection A.
- the determination part 13 of the server 10 may determine, for example, that the scene is a scene of "an occurrence of a skid due to road surface freezing at the traffic intersection A", for a case where the ambient temperature is minus 3 degrees according to the climate information included in the environmental information, the ABS is actuated according to the information concerning traveling of the vehicle 301 included in the added information, and a traffic accident is detected from image recognition.
- the determination part 13 of the server 10 may determine, for example, that the scene is a scene in which "there is a need to wait for 30 minutes to enter the shop B", for a case where a line of persons is detected from image recognition and the image is taken at the shop B.
- the determination part 13 of the server 10 may determine, for example, that the scene is a scene in which "a lane restriction is implemented at the address D due to construction work", for a case where construction work is detected from image recognition and the image is taken at the address D.
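The scene determinations exemplified above can be sketched as simple rules. The disclosure contemplates using AI for this step; the hand-written rules, dictionary keys, and thresholds below are stand-in assumptions for illustration only.

```python
# Assumed thresholds; the actual values are not given in the disclosure.
COLLISION_ACCEL_G = 0.8   # acceleration corresponding to a collision
HIGH_SPEED_KMH = 80.0     # "traveling at a high speed"
HARD_BRAKE = 0.9          # brake-operation strength threshold

def determine_scene(v, detected, ambient_temp_c=None):
    """v: vehicle-info dict; detected: objects/events found by image recognition;
    ambient_temp_c: climate information from the environmental information."""
    collision = v["accel_g"] >= COLLISION_ACCEL_G
    if ("traffic accident" in detected and ambient_temp_c is not None
            and ambient_temp_c < 0 and v["abs_active"]):
        return "an occurrence of a skid due to road surface freezing"
    if collision and v["abs_active"]:
        return "a collision accident due to a skid of the vehicle"
    if collision and (v["brake"] > 0 or v["speed_kmh"] == 0):
        return "accident of a rear-end collision during a standstill of the vehicle"
    if (v["speed_kmh"] >= HIGH_SPEED_KMH and v["abs_active"]
            and v["brake"] >= HARD_BRAKE):
        return "a skid of the vehicle during traveling of the vehicle at a high speed"
    return "ordinary traveling"
```

The string returned here would become the scene information attached to the image in the tagging step.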
- the determination part 13 of the server 10 attaches, as a tag, the added information of the image, information of the object obtained from step S 2 , the environmental information obtained from step S 3 , and information of the scene obtained from step S 4 , to the image (step S 5 ).
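The tagging in step S5 amounts to bundling the four kinds of information with the image; a minimal sketch follows, in which the function name and key names are assumptions.

```python
# Hypothetical sketch of the step-S5 tagging: the tag merges the added
# information, the objects detected in step S2, the environmental information
# obtained in step S3, and the scene determined in step S4.
def build_tag(added_info, objects, environment, scene):
    return {
        "taken_at": added_info["taken_at"],
        "position": added_info["position"],
        "objects": sorted(objects),     # deterministic order for searching
        "environment": environment,
        "scene": scene,
    }
```

A terminal 20 could then search for an image on the basis of any field of such a tag.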
- the decision part 14 of the server 10 decides an assessed sales price (or sales value) of the image (step S6).
- the decision part 14 of the server 10 may decide a higher assessed price as the degree of rarity of the image and the degree of freshness of the image are higher.
- a sales price may be a money amount, or points exchangeable for a predetermined service or a predetermined product via the server 10.
- the decision part 14 of the server 10 may determine, for example, that the degree of rarity of the image is higher as the number of images registered in the image information 111 that are similar to the image and correspond to positions included in a link (i.e., a road section) between two nodes (i.e., between two traffic intersections) in map data is smaller. In this case, the decision part 14 of the server 10 may determine that one image is similar to another image for a case where the difference between an object and a scene detected from the one image and an object and a scene detected from the other image is less than or equal to a predetermined threshold.
- the decision part 14 of the server 10 may determine the degree of rarity of the image according to, for example, the scene of the image and previously set degrees of rarity of various scenes. In this case, the decision part 14 of the server 10 determines, for example, that an image of an accident has the degree of rarity "10" in a case where the previously set degree of rarity of a scene "an accident" is "10".
- the decision part 14 of the server 10 may determine, for example, that the degree of freshness of the image is higher as the date and time at which the image is taken is later.
- the decision part 14 of the server 10 may determine, for example, that the degree of freshness of the image is higher as the difference between the date and time at which the image is taken and the date and time of an occurrence of an accident (that is an example of an event) or a disaster (that is another example of an event) such as an earthquake, falling of a bluff, a fire, a tsunami, or a flood, included in the environmental information obtained from step S 3 corresponding to the date and time at which the image is taken, is smaller.
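The assessed-price decision of step S6, combining the degree of rarity and the degree of freshness described above, can be sketched as follows. The base price, weights, and point scale are illustrative assumptions, not figures from the disclosure.

```python
from datetime import datetime

BASE_POINTS = 100  # assumed base sales price, expressed in points

def rarity_degree(similar_count, scene_rarity=0):
    """Fewer similar images registered for the same road link => higher rarity;
    a previously set per-scene rarity (e.g. 10 for "an accident") can override."""
    return max(max(0, 10 - similar_count), scene_rarity)

def freshness_degree(taken_at, now, event_at=None):
    """A later shooting date/time, or a smaller gap between the shooting date/time
    and the date/time of an event (an accident, a disaster), => higher freshness."""
    age_h = (now - taken_at).total_seconds() / 3600.0
    fresh = max(0.0, 10.0 - age_h)
    if event_at is not None:
        gap_h = abs((taken_at - event_at).total_seconds()) / 3600.0
        fresh = max(fresh, max(0.0, 10.0 - gap_h))
    return fresh

def assessed_price(similar_count, taken_at, now, event_at=None, scene_rarity=0):
    rarity = rarity_degree(similar_count, scene_rarity)
    fresh = freshness_degree(taken_at, now, event_at)
    return int(BASE_POINTS + 50 * rarity + 20 * fresh)
```

With this sketch, an image with few similar counterparts, taken recently or close in time to an event, is assessed at a higher price, matching the tendencies described above.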
- the decision part 14 of the server 10 may decide the assessed price of the sales price of the image according to, for example, the degree of credibility of a user who provides the image.
- the decision part 14 of the server 10 may decide the degree of credibility for the user on the basis of, for example, the contents of an adjustment performed by the user on an image of which the user had previously permitted selling and a corresponding tag.
- for example, the decision part 14 may set a higher degree of credibility for a user who performed an adjustment on an image to obscure personal information that had not been masked in the image as modified by the server 10, or for a user who performed an adjustment on a tag generated by the server 10 to improve the preciseness of the tag.
- the provision part 15 of the server 10 modifies the image for protecting personal information or the like (step S 7 ).
- the provision part 15 of the server 10 may perform, for example, a process to pixelate a face of a person, a license plate, or the like included in the image.
- the provision part 15 of the server 10 transmits the tag attached to the image in step S 5 , the assessed price of the sales price of the image determined in step S 6 , and the image modified in step S 7 to the terminal 20 - 1 of the user of the in-vehicle apparatus 30 (step S 8 ).
- control part 22 of the terminal 20 - 1 displays on the display screen, the tag of the image, the sales price of the image, and the modified image (step S 9 ).
- the reception part 21 of the terminal 20 - 1 receives the user's operation to perform an adjustment on the tag, the sales price, and the modified image (step S 10 ). Note that, in a case where the user of the terminal 20 - 1 determines not to perform such an adjustment, the user's operation to perform an adjustment on the tag, the sales price, and the modified image is not needed.
- control part 22 of the terminal 20 - 1 responds to the user's operation to perform an adjustment on the tag, the sales price, and the modified image; and transmits to the server 10 the tag, the sales price, and the modified image on each of which the adjustment has been performed (step S 11 ).
- the control part 22 of the terminal 20 - 1 sends information indicating the denial to the server 10 .
- the provision part 15 of the server 10 deletes the data concerning the modified image selling which is thus denied.
- FIG. 5 illustrates one example of image information 111 according to the present embodiment.
- an image, a tag, a sales price, and a user ID are stored in association with each image ID.
- An image ID is identification information to identify an image taken (by photography) by the in-vehicle apparatus 30 .
- An image is an image taken (by photography) by the in-vehicle apparatus 30 , a modified image obtained from step S 7 , or an image on which an adjustment is performed in step S 11 .
- a "user ID" is identification information to identify a user who has uploaded an image to the server 10 from the in-vehicle apparatus 30 or the like.
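The image information 111 described above (FIG. 5) can be sketched as a simple record keyed by image ID. The field names follow the description; the class, helper mapping, and sample values below are illustrative assumptions, not part of the embodiment.

```python
from dataclasses import dataclass

# Sketch of one record of the image information 111 (FIG. 5).
# Field names follow the description; the class itself is hypothetical.
@dataclass
class ImageRecord:
    image_id: str     # identifies an image taken by the in-vehicle apparatus 30
    image: bytes      # the taken image, the modified image, or the adjusted image
    tags: list        # tags attached in step S5 (scene, objects, environmental info)
    sales_price: int  # assessed price decided in step S6 (money amount or points)
    user_id: str      # identifies the user who uploaded the image

# The storage part 11 can then be modeled as a mapping from image ID to record.
storage: dict = {}

record = ImageRecord(
    image_id="IMG-0001",
    image=b"\x89PNG...",  # placeholder bytes
    tags=["accident", "intersection A"],
    sales_price=500,
    user_id="U-42",
)
storage[record.image_id] = record
print(storage["IMG-0001"].tags)
```

A terminal-side search (FIG. 6) would then filter this mapping by tag before displaying thumbnails and sales prices.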
- FIG. 6 illustrates an example of a display screen 601 of the terminal 20 - 2 with which the user of the terminal 20 - 2 can purchase an image or the like.
- a thumbnail of an image 602, a sales price 603, a tag 604, a "purchase" button 605, and so forth are displayed.
- the provision part 15 of the server 10 responds to a pressing operation of the user of the terminal 20 - 2 on the purchase button 605 to transmit the image concerning the thumbnail 602 and the tag of the image to the terminal 20 - 2 .
- the terminal 20-2 may previously register with the server 10 a search condition for an image.
- in a case where an uploaded image satisfies the registered search condition, the provision part 15 of the server 10 may send information concerning the image to the terminal 20-2.
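The registration-and-notification flow above can be sketched as tag matching against stored search conditions. The condition format (a set of required tag keywords per terminal) is an assumption for illustration only.

```python
# Hypothetical sketch: the server keeps search conditions registered by
# terminals and reports a newly uploaded image to every terminal whose
# registered condition is satisfied by the image's tags.
registered_conditions = {
    "terminal-20-2": {"accident", "intersection A"},  # all keywords must match
    "terminal-20-3": {"flood"},
}

def terminals_to_notify(image_tags, conditions):
    """Return terminals whose registered keywords are all present in the tags."""
    tags = set(image_tags)
    return [t for t, required in conditions.items() if required <= tags]

uploaded_tags = ["accident", "intersection A", "road surface freezing"]
print(terminals_to_notify(uploaded_tags, registered_conditions))
```

A real server would persist the conditions and push the notification over the network; here the subset test `required <= tags` stands in for the whole match step.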
- the server 10 may send, at a predetermined timing (for example, every month), information indicating a selling result of an image that has been uploaded by the user of the terminal 20-1 to the terminal 20-1.
- the information processing apparatus, the image delivery system, the information processing method, and the computer-readable recording medium have been described as the illustrative embodiments.
- the present invention is not limited to the specifically disclosed embodiments, and various modifications and/or changes may be made within the claimed scope.
- the functional parts of the terminals 20 and the server 10 may be implemented, for example, through cloud computing using one or more computers.
- at least some of the functions of a terminal 20 may be included in the server 10 .
- at least some of the functions of the server 10 may be included in a terminal 20 .
- the server 10 is one example of an "information processing apparatus"
- the provision part 15 is one example of a function "to transmit information and an image".
Landscapes
- Engineering & Computer Science (AREA)
- Business, Economics & Management (AREA)
- General Physics & Mathematics (AREA)
- Physics & Mathematics (AREA)
- Development Economics (AREA)
- Finance (AREA)
- Theoretical Computer Science (AREA)
- Accounting & Taxation (AREA)
- Strategic Management (AREA)
- Multimedia (AREA)
- Marketing (AREA)
- Game Theory and Decision Science (AREA)
- Entrepreneurship & Innovation (AREA)
- Economics (AREA)
- General Business, Economics & Management (AREA)
- Signal Processing (AREA)
- Mechanical Engineering (AREA)
- Computer Vision & Pattern Recognition (AREA)
- Health & Medical Sciences (AREA)
- Remote Sensing (AREA)
- Computing Systems (AREA)
- General Health & Medical Sciences (AREA)
- Medical Informatics (AREA)
- Computer Networks & Wireless Communication (AREA)
- Radar, Positioning & Navigation (AREA)
- Traffic Control Systems (AREA)
- Management, Administration, Business Operations System, And Electronic Commerce (AREA)
- Television Signal Processing For Recording (AREA)
- Closed-Circuit Television Systems (AREA)
Abstract
An information processing apparatus includes at least one processor configured to determine, on the basis of information concerning traveling of a vehicle obtained when an image is taken by an image pickup unit of the vehicle, a scene present when the image is taken; and transmit information indicating the determined scene and the image to a terminal connected to the information processing apparatus via a network.
Description
- This application is a continuation of U.S. application Ser. No. 16/266,473 filed Feb. 4, 2019, which is based on and claims priority under 35 U.S.C. 119 from Japanese Patent Application No. 2018-033527 filed on Feb. 27, 2018. The contents of the above applications are incorporated herein by reference.
- The present invention relates to an information processing apparatus, an image delivery system, an information processing method, and a computer-readable recording medium.
- According to the related art, an image taken (by photography) by a traveling vehicle is uploaded to a server, and a scene at the place where the image is taken is reported to a user who is remote from the place. In this regard, also according to the related art, in response to a request from a user who is outside a vehicle, an image of the surroundings of the vehicle is uploaded to an image server that delivers the image to the user (for example, see Japanese Laid-Open Patent Application No. 2007-207260).
- According to an embodiment of the present invention, an information processing apparatus includes at least one processor configured to determine, on the basis of information concerning traveling of a vehicle obtained when an image is taken by an image pickup unit of the vehicle, a scene present when the image is taken; and transmit information indicating the determined scene and the image to a terminal connected to the information processing apparatus via a network.
- Other objects, features and advantages of the present invention will become more apparent from the following detailed description when read in conjunction with the accompanying drawings.
- FIG. 1 illustrates a configuration example of an image delivery system according to an embodiment;
- FIG. 2 illustrates a hardware configuration example of a server according to the embodiment;
- FIG. 3 is one example of a functional block diagram of the server and a terminal according to the embodiment;
- FIG. 4 is a sequence diagram illustrating one example of processes of the image delivery system according to the embodiment;
- FIG. 5 illustrates one example of image information according to the embodiment; and
- FIG. 6 illustrates an example of a display screen of a terminal for a user to purchase an image or the like.
- In the related art described above, it may be difficult for a user to select an image to be delivered from among a plurality of images.
- An embodiment of the present invention has been devised in consideration of this point, and an object of the embodiment is to provide a technology enabling a relatively easy selection of an image to be delivered from among a plurality of images.
- An information processing apparatus according to an embodiment of the present invention includes at least one processor configured to determine, on the basis of information concerning traveling of a vehicle obtained when an image is taken (by photography) by an image pickup unit of the vehicle, a scene present when the image is taken; and transmit information indicating the determined scene and the image to a terminal connected to the information processing apparatus via a network.
- As a result, for example, a user can select an image to be delivered on the basis of information indicating a scene. Accordingly, a user can relatively easily select an image to be delivered from among a plurality of images.
- In an information processing apparatus according to another embodiment of the present invention, the information concerning traveling of the vehicle includes at least one of a traveling speed of the vehicle, an acceleration of the vehicle, information concerning a driver's driving operation of the vehicle, and information indicating actuating of a predetermined traveling function of the vehicle.
- As a result, for example, a user can select an image to be delivered on the basis of information indicating a scene determined on the basis of information concerning traveling of the vehicle. Accordingly, a user can relatively easily select an image to be delivered from among a plurality of images.
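The traveling information listed above lends itself to a simple record plus threshold checks, e.g. for the sudden braking or collision-level acceleration that the embodiment later mentions as upload triggers. All field names and threshold values here are illustrative assumptions, not values from the disclosure.

```python
from dataclasses import dataclass

# Illustrative record of the "information concerning traveling of the vehicle".
@dataclass
class VehicleInfo:
    speed_kmh: float    # traveling speed of the vehicle
    accel_ms2: float    # magnitude of the acceleration of the vehicle
    brake_level: float  # driver's brake operation, 0.0 (none) to 1.0 (full)
    steering_deg: float # driver's steering operation
    abs_active: bool    # actuation of a predetermined traveling function (ABS)

COLLISION_ACCEL = 9.8  # assumed threshold corresponding to a collision

def is_upload_trigger(info):
    """True when the info suggests a sudden braking operation or a collision."""
    sudden_brake = info.brake_level >= 0.9 and info.speed_kmh > 30
    collision = info.accel_ms2 >= COLLISION_ACCEL
    return sudden_brake or collision

print(is_upload_trigger(VehicleInfo(60, 12.0, 0.2, 0.0, False)))
```

The in-vehicle apparatus would evaluate such a predicate on each sample and upload the image together with this record as the added information.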
- In an information processing apparatus according to yet another embodiment of the present invention, the at least one processor is further configured to determine the scene present when the image is taken on the basis of a result of image recognition of the image and information of surroundings of the vehicle corresponding to a date and time when the image is taken and a position where the image is taken.
- As a result, for example, a user can select an image to be delivered on the basis of information indicating a scene determined on the basis of information of a disaster or the like occurring in the surroundings of the vehicle and an object detected from the image. Accordingly, a user can relatively easily select an image to be delivered from among a plurality of images.
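One minimal way to realize this determination is a rule table over the traveling information, the image-recognition result, and the environmental information, in the spirit of the concrete examples given later in the description. The embodiment may instead use AI; the rules, names, and thresholds below are illustrative assumptions only.

```python
# Illustrative rule-based sketch of the determination part 13: combine
# vehicle info, image-recognition results, and environmental info into a
# scene label. These rules are assumptions, not the claimed method.
def determine_scene(vehicle, detected_objects, environment):
    collision = vehicle.get("accel", 0.0) >= 9.8  # assumed collision threshold
    stopped = vehicle.get("speed", 0.0) == 0.0
    if collision and (vehicle.get("brake_on") or stopped):
        return "accident of a rear-end collision during a standstill of the vehicle"
    if vehicle.get("abs_active") and collision:
        return "a collision accident due to a skid of the vehicle"
    if "traffic accident" in detected_objects and environment.get("temperature_c", 20) <= -3:
        return "an occurrence of a skid due to road surface freezing"
    return "ordinary traveling"

scene = determine_scene(
    {"speed": 0.0, "accel": 10.5, "brake_on": True},
    ["vehicle", "traffic light"],
    {"temperature_c": 5},
)
print(scene)
```

The returned label corresponds to the information indicating the determined scene that is attached to the image as a tag and transmitted to the terminal.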
- In an information processing apparatus according to yet another embodiment of the present invention, the at least one processor is further configured to decide a sales price of the image on the basis of the determined scene, and transmit, to the terminal, the image with the information indicating the determined scene and information indicating the sales price attached to the image.
- As a result, for example, a user can purchase an image to be delivered at a price decided on the basis of information indicating a scene. Accordingly, a user can relatively easily select an image to be delivered from among a plurality of images, and also, can pay a price corresponding to a scene of the image to a user or the like who has provided the image.
- In an information processing apparatus according to yet another embodiment of the present invention, the at least one processor is further configured to decide the sales price on the basis of an occurrence frequency of the determined scene.
- As a result, for example, a user can purchase an image to be delivered at a price decided on the basis of an occurrence frequency of a scene of the image. Accordingly, a user can relatively easily select an image to be delivered from among a plurality of images, and also, can pay a price corresponding to an occurrence frequency of a scene of the image to a user or the like who has provided the image.
- In an information processing apparatus according to yet another embodiment of the present invention, the at least one processor is further configured to decide a higher sales price as a date and time when the image is taken is later or as a difference between the date and time when the image is taken and a date and time when a predetermined event occurs at a position where the image is taken is smaller.
- As a result, for example, a user can purchase an image to be delivered at a price decided on the basis of the date and time when the image is taken. Accordingly, a user can relatively easily select an image to be delivered from among a plurality of images, and also, can pay a price corresponding to the date and time when the image is taken to a user or the like who has provided the image.
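The pricing rules of these embodiments (a higher price for a rarer scene, and for a capture time closer to the occurrence of an event) can be combined into one assessed-price formula. The base price, the rarity table, and the decay over the time gap below are illustrative assumptions; the embodiment leaves the concrete formula open.

```python
from datetime import datetime

# Illustrative assessed-price sketch: scene rarity times a base price, plus a
# freshness bonus that grows as the capture time nears the event occurrence.
SCENE_RARITY = {"accident": 10, "traffic jam": 3, "ordinary traveling": 1}  # assumed table

def assessed_price(scene, taken_at, event_at=None, base=100):
    rarity = SCENE_RARITY.get(scene, 1)
    price = base * rarity
    if event_at is not None:
        gap_hours = abs((taken_at - event_at).total_seconds()) / 3600
        # smaller gap to the event -> fresher image (e.g. a first report) -> higher price
        price += int(500 / (1 + gap_hours))
    return price

p = assessed_price(
    "accident",
    taken_at=datetime(2018, 2, 27, 10, 0),
    event_at=datetime(2018, 2, 27, 9, 0),
)
print(p)
```

The decision part 14 would present such an assessed price to the terminal 20-1, where the user may still adjust it before the image is offered for sale.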
- Other embodiments are implemented as an image delivery system, an information processing method, and a computer-readable recording medium.
- According to the embodiments of the present invention, a user can relatively easily select an image to be delivered from among a plurality of images.
- Below, an embodiment of the present invention will be described on the basis of drawings.
- FIG. 1 illustrates a configuration example of an image delivery system according to the embodiment. In FIG. 1, the image delivery system 1 includes a server 10, terminals 20-1 and 20-2 (hereinafter, for a case where the terminals need not be distinguished therebetween, each of which will be simply referred to as a "terminal 20"), an in-vehicle apparatus 30 of a vehicle 301, and an external server 40. Note that the number of the terminals 20 is not limited to two.
- The server 10 and each of the terminals 20, the in-vehicle apparatus 30, and the external server 40 are connected to perform communication therebetween through, for example, a network 50 such as the Internet, a mobile telephone network, a wireless LAN (Local Area Network), a LAN, or the like.
- The in-vehicle apparatus 30 is, for example, an ECU (Electronic Control Unit) or the like installed in the vehicle 301, and is connected to a drive recorder 60 (that is an example of an "image pickup unit"), a communication apparatus, and so forth. The in-vehicle apparatus 30 stores added information such as the speed, the position, and so forth of the vehicle 301 and a moving image taken (by photography) by the drive recorder 60 in a recording medium such as an SD card, and uploads the added information and the moving image to the server 10.
- The terminal 20 is, for example, an information processing apparatus (i.e., a computer) such as a smartphone, a tablet PC (Personal Computer), or a notebook-size PC. A terminal 20 transmits a frame selected by the user from among frames of a moving image taken (by photography) by the drive recorder 60 to the server 10. In addition, a terminal 20 purchases from the server 10 an image that is searched for on the basis of a tag indicated by the user from among a plurality of images uploaded to the server 10.
- The server 10 is, for example, an information processing apparatus for a server, and provides a service such as image delivery to a terminal 20. The server 10 attaches a tag to an image taken (by photography) by the vehicle 301 so that a terminal 20 can search for an image on the basis of a tag.
- The external server 40 responds to a request from the server 10 to deliver to the server 10 information of the climate at a predetermined date and time and place, information of a disaster (or an accident), or the like. -
FIG. 2 illustrates a hardware configuration example of the server 10 according to the embodiment. The server 10 of FIG. 2 includes a drive unit 100, an auxiliary storage unit 102, a memory unit 103, a CPU 104, an interface unit 105, and so forth that are connected with each other by a bus B.
- An information processing program to implement processes of the server 10 is provided through, for example, a recording medium 101. In response to setting of the recording medium 101, in which the information processing program has been recorded, to the drive unit 100, the information processing program is installed in the auxiliary storage unit 102 from the recording medium 101 via the drive unit 100. However, installing of the information processing program is not necessarily implemented through the recording medium 101, and the information processing program may be downloaded from another computer via a network. The auxiliary storage unit 102 stores the installed information processing program, and also stores necessary files, data, and so forth.
- The memory unit 103 is, for example, a RAM (Random Access Memory), and, in response to an input of an instruction to start a program, reads the program from the auxiliary storage unit 102 and stores the program. The CPU 104 implements functions of the server 10 according to the program stored in the memory unit 103. The interface unit 105 is used as an interface to connect to a network.
- Note that examples of the recording medium 101 include portable recording media such as a CD-ROM, a DVD, and a USB memory. Examples of the auxiliary storage unit 102 include an HDD (Hard Disk Drive), a flash memory, and so forth. Each of the recording medium 101 and the auxiliary storage unit 102 corresponds to a computer-readable recording medium.
- Note that the hardware configurations of the terminals 20, the in-vehicle apparatus 30, and the external server 40 may be the same as or similar to the hardware configuration of the server 10. - Next, with reference to
FIG. 3, functional configurations of the terminals 20 and the server 10 according to the embodiment will be described. FIG. 3 is one example of a functional block diagram of the server 10 and a terminal 20 according to the embodiment.
- The server 10 includes a storage part 11. The storage part 11 is implemented by, for example, the auxiliary storage unit 102. The storage part 11 stores image information 111 and so forth. Data included in the image information 111 will be described later.
- The server 10 further includes an obtaining part 12, a determination part 13, a decision part 14, a provision part 15, and a transmission and reception part 16. The obtaining part 12, the determination part 13, the decision part 14, the provision part 15, and the transmission and reception part 16 represent functions implemented by processes performed by the CPU 104 of the server 10 according to one or more programs installed in the server 10.
- The obtaining part 12 obtains an image that is taken by an image pickup unit (i.e., the drive recorder 60) of the in-vehicle apparatus 30, and obtains the date and time, position, and information concerning traveling of the vehicle 301, in which the in-vehicle apparatus 30 is installed, at a time when the image is taken.
- The determination part 13 determines, on the basis of information concerning traveling of the vehicle 301, a scene present when an image is taken (by photography). In this regard, examples of a scene present when an image is taken (by photography) include, for example, scenes concerning driving and traveling of the vehicle 301 and scenes concerning traffic such as an accident and a traffic jam occurring in the surroundings of the vehicle 301. In addition, the determination part 13 determines a scene present when an image is taken (by photography) on the basis of the image obtained by the obtaining part 12 and information of surroundings of the vehicle corresponding to the date and time and position at which the image is taken.
- The decision part 14 decides the sales price of an image on the basis of a scene determined by the determination part 13 and so forth. The decision part 14 determines the sales price of an image, for example, on the basis of the degree of rarity of the image, the degree of rarity of the scene, and so forth.
- The provision part 15 sells an image to the user of a terminal 20 at a sales price decided by the decision part 14. The provision part 15 transmits an image and a tag corresponding to the image to the user of a terminal 20 who purchases the image.
- The transmission and reception part 16 performs communication with a terminal 20, the in-vehicle apparatus 30, or the external server 40. The transmission and reception part 16 transmits, for example, an image with information indicating a scene determined by the determination part 13 attached to the image to a terminal 20.
- «Terminal 20»
- Each of the terminals 20 includes a reception part 21, a control part 22, and a transmission and reception part 23. These parts represent functions implemented by processes performed by a CPU of the terminal 20 according to one or more programs installed in the terminal 20.
- The reception part 21 receives various operations performed by the user. The reception part 21 receives, for example, an operation performed by the user to perform an adjustment on an image to be sold via the server 10, a tag, and so forth. In addition, the reception part 21 receives, for example, an operation performed by the user to search for an image sold via the server 10, an operation performed by the user to purchase the image, and so forth.
- The control part 22 performs, for example, a process to display, on the basis of information received from the server 10, the information on a display screen. In addition, the control part 22 performs various processes, for example, in response to the user's operations received via the reception part 21.
- The transmission and reception part 23 performs communication with the server 10 according to an instruction that is input from the control part 22. - <Processes>
- Next, with reference to
FIGS. 4-6, processes of the image delivery system 1 according to the embodiment will be described. FIG. 4 is a sequence diagram illustrating one example of processes of the image delivery system 1 according to the embodiment.
- In step S1, the in-vehicle apparatus 30 stores an image (a moving image or a static image) taken (by photography) and added information obtained upon the taking of the image in a recording medium such as an SD card, and uploads the image and the added information to the server 10. The added information includes date and time information concerning the time at which the image is taken, position information concerning the position at which the image is taken, and information (i.e., vehicle information) concerning traveling of the vehicle 301 at the time when the image is taken. The position information may be, for example, information of a latitude and a longitude obtained from a GPS (Global Positioning System) or the like.
- Information concerning traveling of the vehicle 301 includes the speed of the vehicle 301; the acceleration of the vehicle 301; driving operation information such as information concerning a driver's brake operation, a driver's steering operation, a driver's accelerator operation, and so forth; and information concerning actuating of predetermined traveling functions such as actuating of a function of an ABS (Antilock Brake System) and a function of a TRC (TRaction Control) of the vehicle 301. The in-vehicle apparatus 30 may start taking the moving image, for example, in response to turning on of the ACC (accessory) power supply, and may upload the moving image taken (by photography) to the server 10 in response to turning off of the ACC power supply.
- Note that in response to a detection of a predetermined event or in response to a satisfaction of a condition provided by the server 10, the in-vehicle apparatus 30 may upload the image to the server 10. A detection of a predetermined event may be, for example, a detection of one or more events included in the vehicle information, such as a detection of a predetermined driver's operation such as a sudden braking operation or an abrupt steering operation, or a detection of an acceleration greater than or equal to a predetermined threshold corresponding to a collision of the vehicle 301 or the like. A condition provided by the server 10 may be, for example, a date and time, a position, or the like indicated by the server 10.
- Note that a terminal 20 may upload an image and added information of the image to the server 10. In this case, for example, the terminal 20 may read data of images from a recording medium such as an SD card where images taken (by photography) by the in-vehicle apparatus 30 are recorded, may display the read images on a display screen, and may upload an image selected from the displayed images and the corresponding added information to the server 10.
- Next, the determination part 13 of the server 10 performs an image recognition process on the image, obtained by the obtaining part 12, to detect a predetermined object from the image (step S2). More specifically, the determination part 13 of the server 10 detects from the image an object such as a pedestrian, a vehicle, a road cone, a bicycle, a motorcycle, a traffic light, or a traffic sign. In a case where the image is a moving image, the determination part 13 of the server 10 may perform the processes that will be described below on one or more frames included in the moving image (for example, each key frame not compressed among frames).
- Next, the obtaining part 12 of the server 10 obtains from the external server 40 information of surroundings of the vehicle 301 (hereinafter, referred to as "environmental information" as appropriate), obtained when the image is taken, corresponding to the date and time information and the position information included in the added information (step S3). The obtaining part 12 of the server 10 obtains, as the environmental information, climate information, traffic information, disaster information, and facility information obtained at the date and time and position at which the image is taken. The climate information may include, for example, information of the ambient temperature, the humidity, the weather, or a typhoon. The traffic information may include, for example, information of an accident, a traffic jam, or road construction work. The disaster information may include, for example, information of an earthquake, falling of a bluff, a fire, a tsunami, or a flood. The facility information may include, for example, information of a nearby store.
- Next, the determination part 13 of the server 10 determines a scene (step S4). In this regard, the determination part 13 of the server 10 uses, for example, AI (Artificial Intelligence) or the like to determine a scene present when the image is taken on the basis of at least one of the result of image recognition of the image obtained from step S2, the environmental information of the image obtained from step S3, and the added information of the image. - The
determination part 13 of the server 10 may determine, for example, that the scene is a scene of an "accident of a rear-end collision during a standstill of the vehicle", for a case where, on the basis of the information concerning traveling of the vehicle 301 included in the added information, an acceleration greater than or equal to a predetermined threshold corresponding to a collision of the vehicle 301 or the like is detected when a brake operation has been performed by the driver of the vehicle 301 or the speed of the vehicle 301 has been zero.
- The determination part 13 of the server 10 may determine, for example, that the scene is a scene of "a skid of the vehicle during traveling of the vehicle at a high speed", for a case where, on the basis of the information concerning traveling of the vehicle 301, the speed of the vehicle 301 is greater than or equal to a predetermined threshold, the ABS is actuated, and a brake operation is performed by the driver of the vehicle 301 with a strength greater than or equal to a predetermined threshold.
- The determination part 13 of the server 10 may determine, for example, that the scene is a scene of "a collision accident due to a skid of the vehicle", for a case where the ABS is actuated and an acceleration greater than or equal to a predetermined threshold corresponding to a collision of the vehicle 301 or the like is detected.
- The determination part 13 of the server 10 may determine, for example, that the scene is a scene of "a rear-end collision accident due to drowsy driving or the like", for a case where the amount of a driver's steering operation is less than or equal to a predetermined threshold, the amount of a driver's brake operation is less than or equal to a predetermined threshold, and an acceleration greater than or equal to a predetermined threshold corresponding to a collision of the vehicle 301 is detected.
- Moreover, the determination part 13 of the server 10 may determine, for example, that the scene is a scene of "an occurrence of an accident at the traffic intersection A", for a case where a traffic accident is detected from image recognition and the image is taken at the traffic intersection A.
- The determination part 13 of the server 10 may determine, for example, that the scene is a scene of "an occurrence of a skid due to road surface freezing at the traffic intersection A", for a case where the ambient temperature is minus 3 degrees according to the climate information included in the environmental information, the ABS is actuated according to the information concerning traveling of the vehicle 301 included in the added information, and a traffic accident is detected from image recognition.
- Moreover, the determination part 13 of the server 10 may determine, for example, that the scene is a scene in which "there is a need to wait for 30 minutes to enter the shop B", for a case where a line of persons is detected from image recognition and the image is taken at the shop B.
- The determination part 13 of the server 10 may determine, for example, that the scene is a scene in which "a lane restriction is implemented at the address D due to construction work", for a case where construction work is detected from image recognition and the image is taken at the address D. - Next, the
determination part 13 of theserver 10 attaches, as a tag, the added information of the image, information of the object obtained from step S2, the environmental information obtained from step S3, and information of the scene obtained from step S4, to the image (step S5). - Next, on the basis of the degree of rarity (the occurrence frequency) of the image and the degree of freshness of the image, the
decision part 14 of theserver 10 decides an assessed price of a sales price (or a sales value) (step S6). In this regard, for example, thedecision part 14 of theserver 10 may decide a higher assessed price as the degree of rarity of the image and the degree of freshness of the image are higher. Note that a sales price may be a money amount, or points exchangeable with a predetermined service or a predetermined product by theserver 10. - The
decision part 14 of theserver 10 may determine, for example, that the degree of rarity of the image is higher as the number of images registered asimage information 111 that are similar to the image and correspond to positions included in a link (i.e., a road section) between two nodes (i.e., between two traffic intersections) in map data is smaller. In this case, thedecision part 14 of theserver 10 may determine that one image is similar to an other image for a case where the difference between an object and a scene detected from the one image and an object and a scene detected from the other image is less than or equal to a predetermined threshold. - Moreover, the
decision part 14 of theserver 10 may determine the degree of rarity of the image according to, for example, the scene of the image and previously set degrees of rarity of various scenes. In this case, thedecision part 14 of theserver 10 determines, for example, that an image of an accident has the degree of rarity □10□ in a case where the previously set degree of rarity of a scene □an accident□ is □10□. - Moreover, the
decision part 14 of theserver 10 may determine, for example, that the degree of freshness of the image is higher as the date and time at which the image is taken is later. Thedecision part 14 of theserver 10 may determine, for example, that the degree of freshness of the image is higher as the difference between the date and time at which the image is taken and the date and time of an occurrence of an accident (that is an example of an event) or a disaster (that is another example of an event) such as an earthquake, falling of a bluff, a fire, a tsunami, or a flood, included in the environmental information obtained from step S3 corresponding to the date and time at which the image is taken, is smaller. As a result, for example, it is possible to set a higher sales price for an image that is the first report of an accident or the like. - Moreover, the
decision part 14 of the server 10 may decide the assessed sales price of the image according to, for example, the degree of credibility of the user who provides the image. In this case, the decision part 14 of the server 10 may decide the degree of credibility of the user on the basis of, for example, the adjustments the user had previously performed on an image whose sale the user had permitted and on the corresponding tag. For example, the decision part 14 may set a higher degree of credibility for a user who adjusted an image to further obscure personal information that had not been deleted in the image modified by the server 10, or for a user who adjusted a tag generated by the server 10 to improve the preciseness of the tag. - Next, the
provision part 15 of the server 10 modifies the image to protect personal information or the like (step S7). In this regard, the provision part 15 of the server 10 may, for example, pixelate a person's face, a license plate, or the like included in the image. - Next, the
provision part 15 of the server 10 transmits, to the terminal 20-1 of the user of the in-vehicle apparatus 30, the tag attached to the image in step S5, the assessed sales price of the image decided in step S6, and the image modified in step S7 (step S8). - Next, the
control part 22 of the terminal 20-1 displays, on the display screen, the tag of the image, the sales price of the image, and the modified image (step S9). - Next, the
reception part 21 of the terminal 20-1 receives the user's operation to adjust the tag, the sales price, and the modified image (step S10). Note that, in a case where the user of the terminal 20-1 decides not to perform such an adjustment, this operation is not needed. - Next, the
control part 22 of the terminal 20-1 responds to the user's operation to adjust the tag, the sales price, and the modified image, and transmits to the server 10 the tag, the sales price, and the modified image on each of which the adjustment has been performed (step S11). Note that, in response to receiving the user's operation to deny selling the modified image, if any, the control part 22 of the terminal 20-1 sends information indicating the denial to the server 10. Then, the provision part 15 of the server 10 deletes the data concerning the modified image whose sale has thus been denied. - Next, the
provision part 15 of the server 10 permits selling the modified image to the other terminal 20-2 or the like (step S12). FIG. 5 illustrates one example of image information 111 according to the present embodiment. In the example of the image information 111 illustrated in FIG. 5, for an image whose sale the user of a terminal 20 has permitted, the image, a tag, a sales price, and a user ID are stored in association with the image ID. An image ID is identification information identifying an image taken (by photography) by the in-vehicle apparatus 30. An image is an image taken (by photography) by the in-vehicle apparatus 30, a modified image obtained in step S7, or an image on which an adjustment has been performed in step S11. A "user ID" is identification information identifying the user who has uploaded the image to the server 10 from the in-vehicle apparatus 30 or the like. - Next, the
provision part 15 of the server 10 responds to an operation of the user of the terminal 20-2 by transmitting the image indicated by the user, together with the tag, the sales price, and so forth of the image, to the terminal 20-2 (step S13). FIG. 6 illustrates an example of a display screen 601 of the terminal 20-2 with which the user of the terminal 20-2 can purchase an image or the like. In the example of FIG. 6, on the display screen 601 of the terminal 20-2, a thumbnail of an image 602, a sales price 603, a tag 604, a "purchase" button 605, and so forth are displayed. The provision part 15 of the server 10 responds to a pressing operation of the user of the terminal 20-2 on the purchase button 605 by transmitting the image concerning the thumbnail 602 and the tag of the image to the terminal 20-2. Note that the terminal 20-2 may previously register, with the server 10, a search condition for an image. In this case, in response to an image that satisfies the registered condition becoming purchasable, the provision part 15 of the server 10 may send this information to the terminal 20-2. - Note that the
server 10 may send, to the terminal 20-1 at a predetermined timing (for example, every month), information indicating the selling results of images uploaded by the user of the terminal 20-1.
- Thus, the information processing apparatus, the image delivery system, the information processing method, and the computer-readable recording medium have been described as the illustrative embodiments. In this regard, the present invention is not limited to the specifically disclosed embodiments, and various modifications and/or changes may be made within the claimed scope.
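For illustration only, the price-assessment behavior described in step S6 above — higher prices for rarer scenes, for images with few similar registrations on the same road link, and for images taken close in time to an event — can be sketched as follows. All names, weights, and thresholds here (`SCENE_RARITY`, `base`, the 0.2 similarity threshold, the scalar feature value) are assumptions for exposition, not values disclosed in the embodiment.

```python
from datetime import datetime

# Preset per-scene degrees of rarity (illustrative values only).
SCENE_RARITY = {"accident": 10, "congestion": 2, "normal": 1}

def count_similar_on_link(feature, registered_features, threshold=0.2):
    """Count registered images on the same road link (between two nodes)
    whose detected object/scene feature differs by at most the threshold,
    i.e. images treated as 'similar'."""
    return sum(1 for other in registered_features if abs(feature - other) <= threshold)

def freshness(taken_at, event_at):
    """Degree of freshness: larger as the image is taken closer in time
    to an event (accident, disaster, ...) at the same position."""
    hours = abs((taken_at - event_at).total_seconds()) / 3600.0
    return 1.0 / (1.0 + hours)

def assessed_price(scene, feature, registered_features, taken_at, event_at, base=100.0):
    """Higher price for rarer scenes, for fewer similar registered images
    on the link, and for a smaller time difference to the event
    (e.g. a first report of an accident)."""
    rarity = SCENE_RARITY.get(scene, 1)
    rarity /= 1 + count_similar_on_link(feature, registered_features)
    return round(base * rarity * (1.0 + freshness(taken_at, event_at)), 2)
```

For example, an accident image taken a few minutes after the event, with no similar image yet registered for the road link, is assessed well above a later image of the same scene for which similar images already exist.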
- The functional parts of the
terminals 20 and the server 10 may be implemented, for example, through cloud computing using one or more computers. In addition, at least some of the functions of a terminal 20 may be included in the server 10. In addition, at least some of the functions of the server 10 may be included in a terminal 20. Note that, in the embodiment, the server 10 is one example of an "information processing apparatus", and the provision part 15 is one example of a function "to transmit information and an image".
- 1: image delivery system
- 10: server
- 11: storage part
- 12: obtaining part
- 13: determination part
- 14: decision part
- 15: provision part
- 16: transmission and reception part
- 20: terminal
- 21: reception part
- 22: control part
- 23: transmission and reception part
- 30: in-vehicle apparatus
- 40: external server
- 301: vehicle
- The present application is based on and claims priority to Japanese patent application No. 2018-033527, filed Feb. 27, 2018, the entire contents of which are hereby incorporated herein by reference.
Claims (20)
1. An information processing apparatus comprising:
at least one processor configured to:
determine, based on information concerning traveling of a vehicle obtained when an image including surroundings of the vehicle is taken, a scene present in the surroundings of the vehicle;
set a sales price of the image based on the determined scene; and
transmit, to a terminal, the image,
information indicating the determined scene, and the sales price.
2. The information processing apparatus as claimed in claim 1, wherein
the information concerning traveling of the vehicle includes at least one of a traveling speed of the vehicle, an acceleration of the vehicle, information concerning a driver's driving operation of the vehicle, and information indicating actuating of a predetermined traveling function of the vehicle.
3. The information processing apparatus as claimed in claim 1, wherein the at least one processor is further configured to:
determine that the scene is present when the image is taken, based on a result of image recognition of the image and information of the surroundings of the vehicle corresponding to a date and time when the image is taken and a position where the image is taken.
4. The information processing apparatus as claimed in claim 2, wherein the at least one processor is further configured to:
determine that the scene is present when the image is taken, based on a result of image recognition of the image and information of the surroundings of the vehicle corresponding to a date and time when the image is taken and a position where the image is taken.
5. The information processing apparatus as claimed in claim 1, wherein the at least one processor is further configured to:
set the sales price based on an occurrence frequency of the determined scene.
6. The information processing apparatus as claimed in claim 2, wherein the at least one processor is further configured to:
set the sales price based on an occurrence frequency of the determined scene.
7. The information processing apparatus as claimed in claim 3, wherein the at least one processor is further configured to:
set the sales price based on an occurrence frequency of the determined scene.
8. The information processing apparatus as claimed in claim 4, wherein the at least one processor is further configured to:
set the sales price based on an occurrence frequency of the determined scene.
9. The information processing apparatus as claimed in claim 1, wherein the at least one processor is further configured to:
determine a difference between the date and time when the image is taken and a date and time when a predetermined event occurs at a position where the image is taken; and
set the sales price such that the sales price increases as the determined difference decreases.
10. The information processing apparatus as claimed in claim 2, wherein the at least one processor is further configured to:
determine a difference between the date and time when the image is taken and a date and time when a predetermined event occurs at a position where the image is taken; and
set the sales price such that the sales price increases as the determined difference decreases.
11. The information processing apparatus as claimed in claim 3, wherein the at least one processor is further configured to:
determine a difference between the date and time when the image is taken and a date and time when a predetermined event occurs at a position where the image is taken; and
set the sales price such that the sales price increases as the determined difference decreases.
12. The information processing apparatus as claimed in claim 4, wherein the at least one processor is further configured to:
determine a difference between the date and time when the image is taken and a date and time when a predetermined event occurs at a position where the image is taken; and
set the sales price such that the sales price increases as the determined difference decreases.
13. An image delivery system comprising:
a vehicle; and
an information processing apparatus, wherein
the vehicle transmits an image including surroundings of the vehicle and information concerning traveling of the vehicle obtained when the image is taken to the information processing apparatus, and
the information processing apparatus includes at least one processor configured to:
determine, based on the information concerning traveling of the vehicle obtained when the image is taken, a scene present in the surroundings of the vehicle;
set a sales price of the image based on the determined scene; and
transmit, to a terminal, the image, information indicating the determined scene, and the sales price.
14. The image delivery system as claimed in claim 13, wherein
the information concerning traveling of the vehicle includes at least one of a traveling speed of the vehicle, an acceleration of the vehicle, information concerning a driver's driving operation of the vehicle, and information indicating actuating of a predetermined traveling function of the vehicle.
15. The image delivery system as claimed in claim 13, wherein the at least one processor is further configured to:
determine that the scene is present when the image is taken, based on a result of image recognition of the image and information of the surroundings of the vehicle corresponding to a date and time when the image is taken and a position where the image is taken.
16. The image delivery system as claimed in claim 14, wherein the at least one processor is further configured to:
determine that the scene is present when the image is taken, based on a result of image recognition of the image and information of the surroundings of the vehicle corresponding to a date and time when the image is taken and a position where the image is taken.
17. An information processing method implemented by an information processing apparatus, the information processing method comprising:
determining, by at least one processor of the information processing apparatus, based on information concerning traveling of a vehicle obtained when an image including surroundings of the vehicle is taken, a scene present in the surroundings of the vehicle;
setting a sales price of the image based on the determined scene; and
transmitting, to a terminal, the image, information indicating the determined scene, and the sales price.
18. The information processing method as claimed in claim 17, wherein
the information concerning traveling of the vehicle includes at least one of a traveling speed of the vehicle, an acceleration of the vehicle, information concerning a driver's driving operation of the vehicle, and information indicating actuating of a predetermined traveling function of the vehicle.
19. The information processing method as claimed in claim 17, wherein the information processing method further comprises:
determining that the scene is present when the image is taken, based on a result of image recognition of the image and information of the surroundings of the vehicle corresponding to a date and time when the image is taken and a position where the image is taken.
20. The information processing method as claimed in claim 18, wherein the information processing method further comprises:
determining that the scene is present when the image is taken, based on a result of image recognition of the image and information of the surroundings of the vehicle corresponding to a date and time when the image is taken and a position where the image is taken.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US17/160,796 US20210150233A1 (en) | 2018-02-27 | 2021-01-28 | Information processing apparatus, image delivery system, information processing method, and computer-readable recording medium |
Applications Claiming Priority (4)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
JP2018033527A JP6981305B2 (en) | 2018-02-27 | 2018-02-27 | Information processing equipment, image distribution system, information processing method, and program |
JP2018-033527 | 2018-02-27 | ||
US16/266,473 US10943135B2 (en) | 2018-02-27 | 2019-02-04 | Information processing apparatus, image delivery system, information processing method, and computer-readable recording medium |
US17/160,796 US20210150233A1 (en) | 2018-02-27 | 2021-01-28 | Information processing apparatus, image delivery system, information processing method, and computer-readable recording medium |
Related Parent Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US16/266,473 Continuation US10943135B2 (en) | 2018-02-27 | 2019-02-04 | Information processing apparatus, image delivery system, information processing method, and computer-readable recording medium |
Publications (1)
Publication Number | Publication Date |
---|---|
US20210150233A1 true US20210150233A1 (en) | 2021-05-20 |
Family
ID=67685123
Family Applications (2)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US16/266,473 Active 2039-04-26 US10943135B2 (en) | 2018-02-27 | 2019-02-04 | Information processing apparatus, image delivery system, information processing method, and computer-readable recording medium |
US17/160,796 Abandoned US20210150233A1 (en) | 2018-02-27 | 2021-01-28 | Information processing apparatus, image delivery system, information processing method, and computer-readable recording medium |
Family Applications Before (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US16/266,473 Active 2039-04-26 US10943135B2 (en) | 2018-02-27 | 2019-02-04 | Information processing apparatus, image delivery system, information processing method, and computer-readable recording medium |
Country Status (3)
Country | Link |
---|---|
US (2) | US10943135B2 (en) |
JP (1) | JP6981305B2 (en) |
CN (1) | CN110197590B (en) |
Cited By (1)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US12028648B2 (en) | 2020-08-20 | 2024-07-02 | Honda Motor Co., Ltd. | Information processing apparatus, information processing method therefor, and computer-readable storage medium |
Families Citing this family (9)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP7020215B2 (en) * | 2018-03-19 | 2022-02-16 | 日本電気株式会社 | Extra findings determination device, extra findings determination system, extra findings determination method, program |
JP6844568B2 (en) * | 2018-03-27 | 2021-03-17 | 日本電気株式会社 | Extra findings determination device, extra findings determination system, extra findings determination method, program |
JP7264028B2 (en) * | 2019-12-05 | 2023-04-25 | トヨタ自動車株式会社 | Information providing system, information providing method, information terminal and information display method |
JP7506483B2 (en) * | 2020-02-10 | 2024-06-26 | キヤノン株式会社 | Management device, management method, communication device, control method, and program |
JP6997471B2 (en) * | 2020-03-09 | 2022-01-17 | ニューラルポケット株式会社 | Information processing system, information processing device, terminal device, server device, program, or method |
JP7438892B2 (en) * | 2020-08-20 | 2024-02-27 | 本田技研工業株式会社 | Information processing device, information processing method, and program |
EP4273828A4 (en) * | 2021-03-15 | 2024-02-14 | NEC Corporation | Driving information output device, driving information output system, driving information output method, and recording medium |
JP2022157556A (en) * | 2021-03-31 | 2022-10-14 | トヨタ自動車株式会社 | Information processing device, program, and information processing method |
JP7486110B2 (en) * | 2021-04-16 | 2024-05-17 | パナソニックIpマネジメント株式会社 | Image display system and image display method |
Family Cites Families (21)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US8611919B2 (en) * | 2002-05-23 | 2013-12-17 | Wounder Gmbh., Llc | System, method, and computer program product for providing location based services and mobile e-commerce |
JP2004062364A (en) * | 2002-07-26 | 2004-02-26 | Hitachi Ltd | Accident information processing system |
JP2005032027A (en) * | 2003-07-07 | 2005-02-03 | Nec Fielding Ltd | Traffic accident early solution system, accident detection system, and accident analysis server |
JP4765503B2 (en) | 2005-09-16 | 2011-09-07 | 株式会社日立製作所 | Communication terminal and navigation system |
JP2007193487A (en) * | 2006-01-18 | 2007-08-02 | Fujifilm Corp | Image sales price setting system, order receiving server, image sales price setting program and image sales price setting method |
US9067565B2 (en) * | 2006-05-22 | 2015-06-30 | Inthinc Technology Solutions, Inc. | System and method for evaluating driver behavior |
JP4434219B2 (en) | 2007-02-26 | 2010-03-17 | 株式会社デンソー | Image server |
JP5142809B2 (en) * | 2008-05-08 | 2013-02-13 | 株式会社東芝 | In-vehicle image recording apparatus and method |
US9558520B2 (en) * | 2009-12-31 | 2017-01-31 | Hartford Fire Insurance Company | System and method for geocoded insurance processing using mobile devices |
JP5370334B2 (en) | 2010-10-25 | 2013-12-18 | 株式会社デンソー | In-vehicle camera system |
JP5633494B2 (en) | 2011-09-20 | 2014-12-03 | トヨタ自動車株式会社 | Information processing system, information processing apparatus, and center server |
US8515824B2 (en) * | 2011-10-26 | 2013-08-20 | International Business Machines Corporation | Negotiation of product purchase with an electronic device |
JP6043094B2 (en) * | 2012-05-30 | 2016-12-14 | 辛東主 | Product display information aggregation system |
JP2015049598A (en) * | 2013-08-30 | 2015-03-16 | 株式会社ニコン | Price setting device, image providing system, and price setting program |
JP6394402B2 (en) * | 2015-01-14 | 2018-09-26 | オムロン株式会社 | Traffic violation management system and traffic violation management method |
KR101656808B1 (en) * | 2015-03-20 | 2016-09-22 | 현대자동차주식회사 | Accident information manage apparatus, vehicle having the same and method for managing accident information |
KR101623946B1 (en) * | 2015-03-25 | 2016-05-25 | 주식회사 인터파크 | Method, and computer program for virtual wearing |
US11157689B2 (en) * | 2015-11-02 | 2021-10-26 | Microsoft Technology Licensing, Llc | Operations on dynamic data associated with cells in spreadsheets |
JP2017116998A (en) * | 2015-12-21 | 2017-06-29 | セゾン自動車火災保険株式会社 | Information processing device, information processing system, information processing method, and information processing program |
JP6751882B2 (en) * | 2016-03-31 | 2020-09-09 | パナソニックIpマネジメント株式会社 | Product monitoring device, product monitoring system and product monitoring method |
CN205584337U (en) * | 2016-04-29 | 2016-09-14 | 李健 | Vehicle event data recorder intelligence video identification system |
- 2018
- 2018-02-27 JP JP2018033527A patent/JP6981305B2/en active Active
- 2019
- 2019-02-04 US US16/266,473 patent/US10943135B2/en active Active
- 2019-02-14 CN CN201910113992.1A patent/CN110197590B/en active Active
- 2021
- 2021-01-28 US US17/160,796 patent/US20210150233A1/en not_active Abandoned
Also Published As
Publication number | Publication date |
---|---|
CN110197590A (en) | 2019-09-03 |
US20190266424A1 (en) | 2019-08-29 |
US10943135B2 (en) | 2021-03-09 |
JP2019149016A (en) | 2019-09-05 |
JP6981305B2 (en) | 2021-12-15 |
CN110197590B (en) | 2021-09-10 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
US20210150233A1 (en) | Information processing apparatus, image delivery system, information processing method, and computer-readable recording medium | |
US11798099B2 (en) | Systems and methods for automated accident analysis | |
US11212491B2 (en) | Data management of connected cars cameras for homeland security and smart cities | |
US11138876B2 (en) | Information system, information processing method, and non-transitory storage medium | |
KR101836990B1 (en) | Method for gathering of car accident, apparatus and system for the same | |
CN111145383A (en) | Alarm method, alarm device and computer storage medium | |
CN115004269A (en) | Monitoring device, monitoring method, and program | |
JP7274840B2 (en) | Data collection device, data collection system and data collection method | |
US11445143B2 (en) | Method and system for cooperatively collecting video data from driving recorders | |
JP7207916B2 (en) | In-vehicle device | |
CN113496213A (en) | Method, device and system for determining target perception data and storage medium | |
US11335136B2 (en) | Method for ascertaining illegal driving behavior by a vehicle | |
JP2015210713A (en) | Driving recorder and cloud road-information operation system using the same | |
JP7152299B2 (en) | Analysis system and server device | |
KR20170018699A (en) | Accident information collecting apparatus and control method for the same | |
JP6982875B2 (en) | Information provision system | |
JP2021124633A (en) | Map generation system and map generation program | |
KR102680326B1 (en) | Black box apparatus for vehicle with communication function and control method | |
CN111862607A (en) | Responsibility division method, device, equipment and storage medium | |
KR102709356B1 (en) | Vehicle accident analysis system and method, and user terminal | |
US20240331465A1 (en) | In-vehicle capability determining system and method of using | |
US20240330304A1 (en) | Data collection optimization system and method of using | |
JP7405795B2 (en) | In-vehicle information processing device, information processing device, and information processing method | |
JP7563445B2 (en) | DATA PROCESSING APPARATUS, DATA PROCESSING METHOD, AND PROGRAM | |
US20240330063A1 (en) | Rule prioritization system and method of using |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
STPP | Information on status: patent application and granting procedure in general |
Free format text: APPLICATION DISPATCHED FROM PREEXAM, NOT YET DOCKETED |
STPP | Information on status: patent application and granting procedure in general |
Free format text: DOCKETED NEW CASE - READY FOR EXAMINATION |
STPP | Information on status: patent application and granting procedure in general |
Free format text: NON FINAL ACTION MAILED |
STPP | Information on status: patent application and granting procedure in general |
Free format text: RESPONSE TO NON-FINAL OFFICE ACTION ENTERED AND FORWARDED TO EXAMINER |
STPP | Information on status: patent application and granting procedure in general |
Free format text: FINAL REJECTION MAILED |
STCB | Information on status: application discontinuation |
Free format text: ABANDONED -- FAILURE TO RESPOND TO AN OFFICE ACTION |