
opengl webserver

Started by November 02, 2000 04:37 PM
5 comments, last by rohar 24 years ago
I've been mucking around with an idea to render models with OpenGL inside a webserver and return the image to the browser as a JPEG. The goal is a better open-source GIS mapserver: one that uses Oracle Spatial for data storage, does hardware OpenGL rendering from an AOLserver C module, and returns the image back to the browser. There is no reason it would have to be used only for GIS data; any vector data, large or small, would work just as well (models, maps, CAD... whatever).

I've gotten this much to work:
- a basic AOLserver module rendering a demo scene with OpenGL
- Linux/GLX based, running on a Voodoo Banshee card
- creates the JPEG on the fly and dumps it straight down the connection (no temp file)
- fully multithreaded code (AOLserver is fully multithreaded)

I have the demo running @ http://www.caferegina.com:8000

Is there anyone out there interested in this project?

Edited by - rohar on 11/2/00 4:41:11 PM
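For anyone curious about the no-temp-file part: the pipeline is basically read the framebuffer back, then hand the rows to an in-memory JPEG encoder. One wrinkle is that glReadPixels returns rows bottom-up (OpenGL's origin is the lower-left corner) while JPEG scanlines go top-down, so the buffer has to be flipped first. A minimal sketch of just that step, assuming an RGB readback (`flip_rows` is a name I made up; the GL context setup and the encoder calls around it are omitted):

```c
#include <stdlib.h>
#include <string.h>

/* glReadPixels(0, 0, w, h, GL_RGB, GL_UNSIGNED_BYTE, pixels) fills
 * `pixels` bottom row first; JPEG scanlines are written top row first.
 * Swap rows in place before feeding the buffer to the encoder. */
void flip_rows(unsigned char *pixels, int width, int height, int bpp)
{
    int stride = width * bpp;               /* bytes per row */
    unsigned char *tmp = malloc(stride);
    for (int top = 0, bot = height - 1; top < bot; top++, bot--) {
        memcpy(tmp, pixels + top * stride, stride);
        memcpy(pixels + top * stride, pixels + bot * stride, stride);
        memcpy(pixels + bot * stride, tmp, stride);
    }
    free(tmp);
}
```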
The URL you gave times out. I'd love to see a demo of this, though. It doesn't seem scalable from an architectural point of view: a webserver must serve hundreds or thousands of simultaneous HTTP requests. Most sites have problems with dynamic HTML, never mind OpenGL images, and break that work off into an app server. They also cache TONS of stuff. Unless you have a decent caching strategy, this won't scale. Instead of dumping the image down the connection, you should tag it with the input parameters that generated it and save it to disk on the webserver. I'd also architect it as an NSAPI/ISAPI plugin that brokers the request out to another machine, to take the load off the webserver CPU. At least that way you can add machines to scale transactions.
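The "tag it with the input parameters" idea can be as simple as folding every parameter that affects the image into a deterministic cache path, so identical requests map to the same file on disk. A rough sketch, assuming a map-style request (the `cache_path` name and the particular parameter set are mine, not from the project):

```c
#include <stdio.h>

/* Build a cache filename from the parameters that generated the image.
 * Any request with identical parameters maps to the same file, so the
 * webserver can serve the cached JPEG instead of re-rendering.
 * Fixed-precision float formatting keeps the key deterministic. */
int cache_path(char *buf, size_t bufsize, const char *layer,
               double lon, double lat, double zoom, int w, int h)
{
    return snprintf(buf, bufsize, "/cache/%s_%.6f_%.6f_%.3f_%dx%d.jpg",
                    layer, lon, lat, zoom, w, h);
}
```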

http://www.CornflakeZone.com
//-- Modelling and animation in every dimension --//
Thanks for your input.

Caching doesn't work very well when your parameters are floats; the odds of generating exactly the same image twice get pretty small.

That is the whole reason I am trying a real-time rendered approach, and skipping the temp file is a performance thing.

There is a great document on scalability vs reliability here.

Caching the rendered images is probably something I would add as an API option, but it only works if the data isn't dynamic and the interface doesn't allow free movement. If the parameters vary widely, a cache hit is unlikely; and if the data isn't static, then even when the parameters match you can't be sure the data hasn't changed, so verifying a cache entry would cost about as much as just re-rendering it.

If the data is static and movement is limited, you could just pre-render a ton of images (hard drives are cheap), and then this whole project doesn't have much value.

The demo code will kick out almost 500 JPEGs/minute on my LAN with a PIII 450 and a Voodoo Banshee. That would do for a lot of applications. What I am going for is a fast, stable solution that works for most sites on a mid-range machine. A pro video card, or several cards, would go a long way toward a scalable web service.

I am only interested in the project as a platform-independent open-source thing. AOLserver is open-source, multithreaded, and probably about the best C code I have ever seen. It has been ported to most of the platforms OpenGL runs on. I would imagine an SGI or SPARC box with a couple of pro cards would scale pretty well.

Take another shot at the link above. Other people have been reaching it OK, but it is on a cable modem, which is kind of flaky sometimes.
Sorry for my last post; I didn't mean to shoot down your idea. In my real job I design and build large sites that need to scale to thousands of transactions a second, not per minute, and to keep scaling after that. If you're shooting at mid-range sites, I think you'll still want to revisit the architecture, as it will bottom out at some point. Why not move the processing to the browser, like Flash does? I'm just very wary of anything in the web world that centralises processing. Good luck with it.
fs

http://www.CornflakeZone.com
//-- Modelling and animation in every dimension --//
I had a similar idea to this a while ago, but never had the tech knowledge to code it.

My idea was based around a portal/cell engine, with a server hosting each cell. When the client detects that a viewport will have a portal within its clip region, it sends a request to the server running that cell. The server then renders the view through that portal and returns a JPEG, which the client machine renders as a texture on the appropriate surface (e.g. a window that is the portal into the server's cell).
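The request the client sends would only need to carry the cell, the eye, and the texture size wanted back. A sketch of what that wire format might look like, assuming a simple one-line text protocol (every field name here is invented for illustration):

```c
#include <stdio.h>

/* What a client might send to the server hosting a cell: which cell,
 * where the camera is, and the size of the JPEG texture it wants. */
struct portal_request {
    int cell_id;
    double eye_x, eye_y, eye_z;   /* camera position */
    double yaw, pitch;            /* view direction, degrees */
    int tex_w, tex_h;             /* requested texture size */
};

/* Encode the request as a single text line for the wire. */
int encode_request(char *buf, size_t n, const struct portal_request *r)
{
    return snprintf(buf, n, "RENDER %d %.3f %.3f %.3f %.3f %.3f %d %d",
                    r->cell_id, r->eye_x, r->eye_y, r->eye_z,
                    r->yaw, r->pitch, r->tex_w, r->tex_h);
}

/* Decode a line; returns 1 on success, 0 on a malformed request. */
int decode_request(const char *buf, struct portal_request *r)
{
    return sscanf(buf, "RENDER %d %lf %lf %lf %lf %lf %d %d",
                  &r->cell_id, &r->eye_x, &r->eye_y, &r->eye_z,
                  &r->yaw, &r->pitch, &r->tex_w, &r->tex_h) == 8;
}
```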

It would maybe be possible to do this peer-to-peer, with people on slow machines/connections hosting their own small home worlds and powerful machines/connections hosting large areas. That way an indoor map such as a castle could be lots of interconnected user portals, while (e.g.) a complex outdoor scene could live on a powerful machine.

I originally thought that a central server would be needed to direct requests, but given that most cells would only connect to a few other cells (again, outdoor scenes may be an exception), this could be done peer to peer(?).

Another possibility would be to have a few powerful servers with copies of each world cached. If a low-bandwidth host is unable to deal with the number of requests it's getting, it could redirect the requesting client to the server with the cached version.

Does any of this make sense? Does it sound possible?

Dan
Hello;
Here is an idea: you could use a Java applet to stream the JPEGs created by your program to the web browser, so the user doesn't have to refresh the browser each time to see a new image. Just an idea.
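An applet isn't strictly required for this, for what it's worth: Netscape-era browsers also support server push, where the server keeps the connection open and sends each new JPEG as one part of a multipart/x-mixed-replace response, and the browser replaces the image in place. A sketch of the per-frame MIME framing the server would write before each JPEG (the boundary string is an arbitrary choice of mine):

```c
#include <stdio.h>

/* Write the MIME framing that precedes each pushed JPEG frame.
 * The response's initial headers must already have declared
 * Content-Type: multipart/x-mixed-replace;boundary=frame. */
int frame_header(char *buf, size_t n, long jpeg_bytes)
{
    return snprintf(buf, n,
                    "--frame\r\n"
                    "Content-Type: image/jpeg\r\n"
                    "Content-Length: %ld\r\n\r\n",
                    jpeg_bytes);
}
```

After each header the server writes the JPEG bytes themselves, then loops; the connection stays open for as long as frames keep coming.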

By the way, your program is pretty cool.
JP.

==============================================
I feel like a kid in some kind of store...
==============================================
www.thejpsystem.com
The Java client idea is good. They use something like that at www.scorecard.org, a site by the Environmental Defense Fund. It uses an open-source Java-based map server, and the Java client does a few neat tricks.
I kinda favour not having anything that needs to run on the client machines, though.

fshana: Your comments were really appreciated, and the scalability thing IS the main issue. It is a tough problem to solve when requests are dynamic and driven by float parameters. IMO, the best way to go is a central datastore (Oracle) and a peer-based pool of OpenGL-to-JPEG image servers. Letting Oracle handle persistence across machines is a proven, reliable, scalable method. The main application would load-balance the image servers by writing out the image tags in round-robin fashion, which means a single page could return images from more than one server. It could also do a quick aliveness test on the image machines to make sure they are up just before it spits out the HTML. That would give both reliability and serviceability: you could take down any of the image servers at any time.
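The round-robin-with-aliveness scheme described above can be sketched in a few lines: walk the server list from a rotating cursor and skip anything that failed its last aliveness check (the `next_server` name and the flag-array representation are my own):

```c
/* Pick the next image server round-robin, skipping any that failed
 * the aliveness test. `alive` holds one up/down flag per server;
 * `cursor` rotates across calls. Returns the chosen server index,
 * or -1 if no server is up. */
int next_server(const int *alive, int nservers, int *cursor)
{
    for (int tried = 0; tried < nservers; tried++) {
        int i = *cursor;
        *cursor = (*cursor + 1) % nservers;
        if (alive[i])
            return i;
    }
    return -1;
}
```

Each `<img>` tag the main application emits would then point at the host `next_server` picked, so one page can pull images from several machines.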

The reason for my original post was to find out whether there are people interested in working on a project like this. Whatever I do will be open-sourced; I don't have the resources to do this on my own.
Please don't take offense at my response to your comments... I asked for them.

This topic is closed to new replies.
