
Image Processing Server

Started by Quat, September 20, 2017 07:28 PM
11 comments, last by alnite 7 years, 1 month ago
Quote

Use a message queue

No durable message queue is optimized for low latency. The original request was for setting up a system that spreads load across GPUs for real-time processing.

Presumably it's better to lose a few frames than to have to process old and outdated frames when the system hiccups.
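
A minimal sketch of that trade-off, in Python: a small in-memory buffer that silently evicts the oldest frames rather than queueing them durably, so the consumer always works on something recent. The class and method names here are made up for illustration.

import collections
import threading

class LatestFramesBuffer:
    # Bounded, in-memory buffer: when full, the oldest frame is dropped automatically.

    def __init__(self, max_frames=4):
        self._frames = collections.deque(maxlen=max_frames)  # oldest entry evicted on overflow
        self._lock = threading.Lock()

    def push(self, frame):
        # Producer side: never blocks; an old frame may be evicted to make room.
        with self._lock:
            self._frames.append(frame)

    def pop_newest(self):
        # Consumer side: take the most recent frame and discard everything older.
        with self._lock:
            if not self._frames:
                return None
            frame = self._frames.pop()
            self._frames.clear()
            return frame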

 

Quote

If a frame has multiple stages of processing, it might be wiser to do all of the stages on a single machine.

The original poster was not saying that multiple steps are spread across machines. The stated requirement was that the results of the computer-vision-type analysis would be exported to some other machine.

 

enum Bool { True, False, FileNotFound };

I'll take a stab at this.

On 9/20/2017 at 12:28 PM, Quat said:

A product I am working on will have a computer with a camera attached that will be capturing real-time video in grayscale. The frames will need to be sent over a fast network to an "image processing server" that will have multiple GPUs. Each frame will be assigned to the next available GPU for processing. Then the output will be sent to another computer on the network. The idea is for the frames to be processed in real time for a computer vision type application.
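
To illustrate the "next available GPU" dispatch described there, here is a rough Python sketch assuming one worker thread per GPU pulling from a shared bounded queue; process_on_gpu() and send_result() are placeholders for whatever CV library and transport end up being used.

import queue
import threading

NUM_GPUS = 4
frame_queue = queue.Queue(maxsize=NUM_GPUS * 2)  # small bound keeps latency low

def process_on_gpu(gpu_id, frame):
    # Placeholder: run the computer-vision work for one frame on the given GPU.
    return frame

def send_result(result):
    # Placeholder: forward the processed output to the downstream machine.
    pass

def gpu_worker(gpu_id):
    # Each worker is pinned to one GPU; whichever worker is idle takes the next frame.
    while True:
        frame = frame_queue.get()
        try:
            send_result(process_on_gpu(gpu_id, frame))
        finally:
            frame_queue.task_done()

def submit_frame(frame):
    # Producer side: drop the frame rather than block if every GPU is busy.
    try:
        frame_queue.put_nowait(frame)
    except queue.Full:
        pass  # losing a frame beats building up a backlog

for gpu_id in range(NUM_GPUS):
    threading.Thread(target=gpu_worker, args=(gpu_id,), daemon=True).start()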

The client-side application that captures the video should be responsible for breaking the video feed into individual frames, to reduce server load. The client app then sends those frames to your GPU-loaded server over UDP. Do you need to recompose the frames back into videos? If you do, you'd need to tag each frame with some sort of video/frame identifier so they can be recomposed later by another background job after the live feed ends.
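
As a concrete illustration of the tagging idea, here is a minimal Python sketch of a client-side sender that prefixes each frame with a video/frame identifier before pushing it over UDP. The header layout and server address are assumptions, and a real sender would also have to split frames larger than a single datagram.

import socket
import struct

SERVER_ADDR = ("192.0.2.10", 9000)  # hypothetical address of the image processing server
HEADER_FMT = "!IQI"                 # video_id, frame_number, payload length (network byte order)

def send_frame(sock, video_id, frame_number, pixels):
    # Prefix the raw grayscale bytes with an identifier so frames can be regrouped later.
    header = struct.pack(HEADER_FMT, video_id, frame_number, len(pixels))
    sock.sendto(header + pixels, SERVER_ADDR)

sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
send_frame(sock, video_id=1, frame_number=0, pixels=b"\x00" * (160 * 120))  # small test frame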

This topic is closed to new replies.
