[Stackless] Stackless WSGI server

Arnar Birgisson arnarbi at gmail.com
Thu Aug 23 11:38:21 CEST 2007


On 8/23/07, Seun Osewa <seun.osewa at gmail.com> wrote:
> There are workarounds, but that's not really good enough.
> One ends up losing the desired performance advantage.
>
> I wrote a tiny web server using threads and using stackless,
> and the threaded server was 20-25% faster for a hello-world app.

Under "normal" web server loads/concurrency that is normal.

> A threaded server is more vulnerable to being overloaded by slow
> clients, though the same methods used to tune Apache can be used
> to tune a threaded (or multiprocess) Python server.
>
> For something like a chat server I guess Stackless will win,
> because raw performance is not the issue there but the ability
> to multiplex thousands of clients simultaneously.  Perhaps
> Stackless is doomed to occupy a rather small server niche?

Stackless, and asynchronous methods in general, are especially suited
to high concurrency, at levels several orders of magnitude beyond what
a threaded approach can handle.  Stackless also helps when pre-emption
is causing problems.
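
Not to turn this into a benchmark thread, but to make the "thousands
of clients" point concrete, here is a minimal, illustrative sketch of
a chat-style broadcast where ten thousand blocked "clients" coexist in
a single OS thread. It assumes a Stackless Python interpreter and uses
nothing but its tasklet and channel primitives (no real sockets):

import stackless

received = []

def client(name, inbox):
    # Each "client" is a tasklet blocked on its own channel; a blocked
    # tasklet costs very little, so thousands can coexist in one thread.
    while True:
        msg = inbox.receive()
        if msg is None:              # sentinel: disconnect
            break
        received.append((name, msg))

def broadcaster(inboxes, messages):
    # Send every message to every client's channel; the default channel
    # preference hands control straight to the blocked receiver.
    for msg in messages:
        for inbox in inboxes:
            inbox.send(msg)
    for inbox in inboxes:            # tell everyone to hang up
        inbox.send(None)

inboxes = []
for i in range(10000):
    ch = stackless.channel()
    inboxes.append(ch)
    stackless.tasklet(client)("client-%d" % i, ch)

stackless.tasklet(broadcaster)(inboxes, ["hello", "world"])
stackless.run()
print("%d deliveries" % len(received))

A real server would of course pair this with non-blocking I/O
(something like the stacklesssocket wrapper), but the tiny per-idle-
client cost is what makes the multiplexing cheap.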

Of course, it's always about using tools that fit the problem, and
Stackless solves some common problems very nicely, but it's not the
answer for everything.

Arnar



