[Stackless] Stackless WSGI HTTP server

Alec Flett alecf at flett.org
Tue Apr 10 23:27:16 CEST 2007

Ok, another update:

It turns out I thought the server was blocking but it was just Firefox
trying to reuse http sockets for keepalive...so when I reloaded 2 tabs at
once, one tab was waiting on the other. Doh! :)

I was actually able to just derive the Stackless HTTP Server class from the
WSGIServer class and then it behaves as expected... and it flies! I'm really
excited about this solution - I've already put together a little mini
appserver for what I'm trying to accomplish.

Also, there's a typo in stacklesssocket.py - probably originally from a
comment I posted here a while back...

The code for handle_connect currently says:
    [snippet lost when the HTML attachment was scrubbed]
and should say:
    [snippet lost when the HTML attachment was scrubbed]

Now the next problem - how to use stacklesssocket to get some kind of
asynchat-like behavior (a chatty read/write protocol terminated by linefeeds)
on a separate socket connection I'm using. stacklesssocket seems to
constantly block my code on the read if the client calls socket.recv() with
a bigger buffer than the data that is immediately available.


On 4/6/07, Alec Flett <alecf at flett.org> wrote:
> So I uncovered the Google Code wiki (http://code.google.com/p/stacklessexamples/wiki/StacklessExamples
> ) - it seems like a slightly newer version of the wiki on stackless.com.
> Which should we be using?
> Anyhow, on there I discovered basicWebserver.py which supposedly fires off
> a tasklet for each connection. I mean the code there looks right, but I'm
> finding that it's still blocking on every request - meaning that if one
> request is running, no more requests can be served.
> I'm also not sure I see any specific advantage to using stacklesssocket.py
> here - is that just an extra optimization to get some scheduling done in the
> socket code? Because it seems like the server should be managing most of
> this pretty well.
> But I think it will be a good starting point for the StacklessWSGIServer I
> have in mind. I'll keep poking at it.
> Alec
> On 4/6/07, Alec Flett <alecf at flett.org> wrote:
> >
> > Hey folks -
> > I've scoured the web, the wiki, and anywhere else I can think of. I'm
> > wondering if anyone has written a Stackless-based, WSGI-compliant HTTP
> > server.
> >
> > I've got a bunch of WSGI apps and middleware that I currently run under
> > mod_python with prefork apache. Apache is driving me nuts and I'd like to
> > switch to an asynchronous model. It's pretty CPU hungry stuff but I think
> > I've figured out a good way to break up the work. Stackless seems ideal for
> > this -- just dispatch each http connection to a new tasklet. Plus stackless
> > would allow me to easily share common work between the tasklets.
> >
> > (I looked at Twisted, but since WSGI is pretty callstack-oriented, WSGI
> > and Twisted are kind of a nasty combination.)
> >
> > I don't need anything special from the server itself beyond the WSGI
> > capability because I'm using formencode, selector, and other WSGI-based
> > middleware that does all the work of your classic app server.
> >
> > I started down the road of building on the WSGIServer/WSGIRequestHandler
> > stuff in wsgiref, (which is based on BaseHTTPServer) but as it started to
> > get complicated I thought, surely someone has dealt with this stuff before?
> >
> > As an aside, this stuff will be eventually running on multi-core
> > machines. Two cores today, and I'm sure more tomorrow. One thought I had was
> > to run one long-running stackless python for each core, and let lighttpd do
> > the work. In that case, maybe what I really need is a fastcgi-wsgi gateway
> > that's stackless friendly. Has anyone explored that avenue?
> >
> > But hey, if nobody else has done this stuff, I'll be happy to share my
> > work....
> >
> > Alec
Stackless mailing list
Stackless at stackless.com