[Stackless] Beta testing a Stackless-backed web application server

Minor Gordon Minor.Gordon at cl.cam.ac.uk
Sat Jun 28 01:13:41 CEST 2008

Hello all,

I'm looking for beta testers, preferably hackers, for a highly
concurrent, multithreaded web server I've developed in C++ that uses
Stackless/greenlets with CPython as a back end for web applications. The
key difference between my server and others in pure Python is that
CPU-intensive (compression, serialization) and blocking (disk reads,
writes) calls are offloaded ("synchronously", using Stackless channels
to send and receive messages) from the Python interpreter to
multithreaded worker pools. All network I/O and HTTP parsing is
done in C++, as well, with non-blocking and async OS primitives (IOCP,
epoll, etc.) and a no-copy hand-written parser.
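The offload pattern can be sketched in plain CPython using a queue in place of a Stackless channel: the calling code blocks on a reply queue while a small pool of worker threads performs the CPU-intensive call. This is only an illustrative analogue of the mechanism described above, not the server's actual API; all names here are made up.

```python
import queue
import threading
import zlib

# Work items go to a shared queue; each item carries its own reply
# queue, standing in for the Stackless channel the server would use.
_work = queue.Queue()

def _worker():
    while True:
        func, args, reply = _work.get()
        reply.put(func(*args))  # send the result back "synchronously"

# A small worker pool, as in the server's thread pools.
for _ in range(4):
    threading.Thread(target=_worker, daemon=True).start()

def offload(func, *args):
    """Block the caller until the pool returns the result."""
    reply = queue.Queue(maxsize=1)
    _work.put((func, args, reply))
    return reply.get()

# e.g. offloading compression, one of the CPU-intensive calls mentioned:
compressed = offload(zlib.compress, b"hello world" * 100)
```

In the real server the caller is a tasklet/greenlet rather than an OS thread, so only that tasklet blocks while the interpreter keeps running others.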

The server is very fast -- on an in-memory SPECweb99-like HTTP file
workload it can compete with lighttpd and nginx until the single Python
thread becomes CPU bound. It can leave Twisted, Medusa, or any other
pure Python server in the dust, as well as Apache, of course.

Some other features of the Python-side API:
- the web app interface is WSGI only, with a setup interface identical
to wsgiref's make_server
- the async disk I/O interface is encapsulated in a file-like object
- applications can send and receive IDL-generated RPCs, e.g. result =
call( ... params ..., target="jsonrpc://somehost/JSONRPC" )
- there is also an experimental DB API 2.0 interface that communicates
via C++ with a pool of C connections (MySQL, PostgreSQL, and SQLite are
supported)
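Since the setup interface is described as identical to wsgiref's make_server, a WSGI app written against the stdlib should plug straight in. A minimal example (the host/port are arbitrary):

```python
from wsgiref.simple_server import make_server

def app(environ, start_response):
    # A minimal WSGI application, per PEP 333: take environ and
    # start_response, return an iterable of byte strings.
    body = b"Hello from WSGI\n"
    start_response("200 OK", [("Content-Type", "text/plain"),
                              ("Content-Length", str(len(body)))])
    return [body]

# With wsgiref (the Stackless-backed server would take the app the
# same way):
# httpd = make_server("localhost", 8000, app)
# httpd.serve_forever()
```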
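For the experimental database interface, DB API 2.0 fixes the connect/cursor/execute/fetch surface, so usage should look like any conforming driver; here is that surface shown with the stdlib sqlite3 module (the schema and data are just for illustration), with the pooled C connections hidden behind it:

```python
import sqlite3

# Standard DB API 2.0 flow: connect, get a cursor, execute with
# parameter binding, fetch, commit, close.
conn = sqlite3.connect(":memory:")
cur = conn.cursor()
cur.execute("CREATE TABLE hits (path TEXT, count INTEGER)")
cur.execute("INSERT INTO hits VALUES (?, ?)", ("/index.html", 3))
conn.commit()
cur.execute("SELECT count FROM hits WHERE path = ?", ("/index.html",))
row = cur.fetchone()
conn.close()
```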

If you are interested in beta testing and/or hacking and don't mind
using the source as documentation, please get in touch. The code is


P.S. I am also very interested in using generated PyPy VMs as
thread-safe web app back ends, if any of the PyPy guys are reading this.
