[Stackless] Multi-CPU Actor Based Python
senn at maya.com
Wed Nov 19 19:41:02 CET 2008
On Nov 19, 2008, at 1:01 PM, Larry Dickson wrote:
> Is there a non-object-oriented flavor of Stackless Python? I've run
> into this sort of thing before. OO techniques seem to require
> extreme centralization, which kills "fast", makes "safe" impossible
> in the real world, and I'm not even clear on "nice-looking"...
Yes, it's called Stackless Python.
You just have to exercise a lot of restraint in which features you
use! :-) :-)
>> However there does seem to be a fundamental issue here that probably
>> goes to the basis of how the universe works.
>> Locality is scarce. You make things fast by making them fit in a
>> small space so that the speed of light does not matter.
>> You decouple their behavior from other things that are "far"
> True. But Unix piped chains of commands are a simple example of this.
Hm. I would say that unix pipes are "slow" (meaning you have to
serialize data and write it to IO buffers that are then read and
de-serialized by the next process) compared to the alternative, which
is to simply pass a reference to a structure within the same process.
(Imagine trying to pass a complex digraph structure over a pipe.)
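To make the cost concrete, here is a rough sketch contrasting the two
transfer styles, using pickle as a stand-in for pipe serialization
(the graph shape and size below are invented for illustration):

```python
import pickle

# A small digraph as adjacency lists; pickling must walk the whole
# structure, node by node, edge by edge.
graph = {n: list(range(n)) for n in range(500)}

# "Over a pipe": flatten to bytes, then rebuild every object on the
# other side -- two full traversals and a complete copy.
wire = pickle.dumps(graph)
copy = pickle.loads(wire)

# "Within one process": just hand over the reference -- no copy at all.
ref = graph

assert copy == graph and copy is not graph  # the pipe made a full copy
assert ref is graph                         # reference passing made none
```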
>> You make things robust and architectural ("componentized") by making
>> them "big"... with well-defined boundaries that take up space
>> and well-defined interactions that require synchronous coupling
>> at the edges.
> Architectural, maybe, but robust I think not! Things that are big
> are never robust, because their behavior is too complex to
> understand. Big insides and small boundaries are possible, but only
> if you ABSOLUTELY eliminate side effects, including spec
> ambiguities, which means inheritance and especially polymorphism are out.
I agree about inheritance and polymorphism.
Perhaps I should have hyphenated: "robust-and-architectural", meaning to
disqualify architecture that does not promote robust solutions.
What I mean by robust is (as you discovered) a more complicated and
subtle argument - but I'm pretty sure it is correct.
Robustness depends on the ability to verify the behavior for a set of
conditions -- in real life (and perhaps even in physics generally!)
this set of conditions is never fully specified, and architecture is a
way to keep verification tractable as specifications and implementations
change. Since architecture has "size costs", so does robustness,
for any system that has any fundamental complexity.
(A complex argument hopefully quickly summarized...)
>> So you want-your-cake-and-to-eat-it-too... you're not the first one...
>> and perhaps you shouldn't be discouraged by nay-sayers... you might
>> just invent something wonderful... However there are many issues
>> you are not considering (even in your simple example):
>> -- notice that both incrementing and decrementing the refcnt
>> have to involve some sort of interlock. (Not to mention GC
>> and heap structure management!)
>> -- notice that you are starting to change the very nature of
>> python. If, for example, I want several processes co-operating
>> to add results to a search list, I can't just pop them into
>> the same object, I now need to invent a whole structure to
>> "re-combine" things again. How much more memory am I going
>> to use to do that? How "pythonic" is it going to look when
>> I'm done? Or will it look more like an Erlang program? :-)
> Well, you have n processes working and an n+1-st process managing
> the list... but they cannot all be accessing the same object... if
> you free yourself from OO, I think it makes this sort of thing a lot
> easier.
Yes, and also "free" yourself from mutable container types, and then
you wind up with something a lot more like Erlang than Python -- which
is exactly my point.
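For what it's worth, the "n workers plus an n+1-st combiner" shape can
be sketched with standard-library queues standing in for Stackless
channels (the worker function and payloads here are invented for
illustration, not any particular API):

```python
import queue
import threading

def worker(n, out):
    # Each worker sends its result over a channel-like queue
    # instead of mutating a shared list.
    out.put(n * n)

results_q = queue.Queue()
workers = [threading.Thread(target=worker, args=(n, results_q))
           for n in range(4)]
for t in workers:
    t.start()
for t in workers:
    t.join()

# The n+1-st "combiner" is the only place the list is ever built.
combined = sorted(results_q.get() for _ in range(4))
print(combined)   # → [0, 1, 4, 9]
```

With real Stackless you would use tasklets and channels rather than
threads, but the re-combining structure is the same.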
> The other absolute killer is exceptions. They are an inadmissible
> design shortcut when asynchronous workers are cooperating. All
> outcomes have to be designed as normal.
Yep. But again: this is a very "functional programming" mindset...
and not very Pythonic.
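A tiny sketch of the "all outcomes are designed as normal" style --
the function and the ('ok', ...) / ('err', ...) tags are hypothetical,
not any particular library's convention:

```python
def safe_div(a, b):
    # Convert the exceptional outcome into an ordinary return value,
    # so a cooperating worker never throws across the boundary.
    try:
        return ('ok', a / b)
    except ZeroDivisionError as e:
        return ('err', repr(e))

outcomes = [safe_div(10, n) for n in (2, 0, 5)]
oks = [v for tag, v in outcomes if tag == 'ok']
errs = [v for tag, v in outcomes if tag == 'err']
print(oks)        # → [5.0, 2.0]
print(len(errs))  # → 1
```

The consumer now has to handle every case explicitly, which is exactly
the Erlang-flavored, not-very-Pythonic discipline in question.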