RE: [Zope] Zope needs this (and Dynamo has it)
From: Martijn Faassen [mailto:faassen@vet.uu.nl] Sent: Tuesday, March 07, 2000 10:40 PM To: Zope Mailing List (E-mail) Subject: Re: [Zope] Zope needs this (and Dynamo has it)
Alexander Staubo wrote:
I definitely believe that Python is Zope's largest problem right now. It's also, from a technical point of view, one of its biggest assets -- Zope is becoming the killer application of Python, and might eventually become the driving force to evolve Python, which is, I believe, suffering from a drought in the contributor department. Skilled developers aren't flocking to Python to make it faster. Not right now, anyway.
Hm, skilled developers do hack on Python! There are the Python people at CNRI, there's Christian Tismer in Germany doing Stackless Python (which offers interesting arcane features and a mild performance improvement), there's the whole JPython group, and there's John Skaller with his Vyper project (a Python interpreter/compiler in OCaml, one of whose goals is more speed). Then there's the Python Types-SIG, which I urge you to look at if you haven't already.
People are indeed working on Python, but not to the extent of more popular projects like Linux or even, I believe, Perl. Compiler/language projects have, historically, not thrived in the collaborative world of open source software -- for very specific reasons inherent in the very concept of languages and compilers. Still, I'm very impressed with the evolution of next-gen GCC (formerly EGCS). I consider Skaller's Vyper project a fringe project with limited applicability, because OCaml itself is a fringe project -- it's certainly not what you might call a mainstream, popular language. There's something icky about an OCaml Python. However, I'd love for it to succeed. (Unfortunately, Skaller is already out on a limb, implementing all sorts of stuff he deems interesting -- so true CPython compliance isn't necessarily on his agenda, and fragmenting Python like this is, imho, not a good thing.) Anyway, while I have not been monitoring the Python type/interface SIG, it seems to be moving quite slowly.
My hope is that Zope will change this, because Python has possibly one of the slowest interpreters on earth. Python zealots -- and there is such a beast -- will tell you that Python's performance is less of a matter because what you sacrifice in speed you gain in power, ie. computational expressiveness through a high-level, dynamic language.
It is _less_ of a problem, but most Pythoneers are indeed interested in speeding up the language. I haven't heard of zealots who want to hold back speedups, as long as they don't do away with the power.
I'm not saying they're holding back, but rather that they're dismissing speed and scalability as a problem. I firmly believe that speed is Python's number one problem today, and it's keeping Python from being used in more applications. Python has plenty of problems, but they're tiny by comparison, and imho they're mostly artifacts of Python's currently somewhat anemic syntax (e.g., only two kinds of loops, no "switch", no protected members, no constants, no syntax for declaring mutability). Imho Python's second big problem is what you might call, for lack of a better term (I'm not wearing my compiler-writer hat right now, sorry), contractual semantics -- anything from static typing (pass a string to a function that takes an int and you get either a compilation error or an exception) to interfaces (the ability to enforce semantic compatibility) to design-by-contract constructs.
For an example of a good, Python-like, dynamic language that is also very fast, take a look at Dylan. In current implementations, Dylan is not interpreted but rather compiled to native code, but in theory there is nothing stopping anyone from writing a highly efficient Dylan interpreter. Dylan's magic comes from its use of type hints, a system which I had recognized as likely pivotal to improving Python's interpreter long before I knew of Dylan.
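To make the type-hint idea concrete, here is a toy Python sketch (all the names below are invented for illustration; this is not how Dylan or any real optimizer works): a hint lets a dispatcher hand back a specialized code path, so the per-call type discovery can be skipped.

```python
def add_generic(a, b):
    # Dynamic dispatch: every call re-discovers what '+' means for a and b.
    return a + b

def add_int(a, b):
    # Specialized path a compiler could emit once a hint guarantees ints.
    # Here we merely assert the guarantee instead of exploiting it.
    assert isinstance(a, int) and isinstance(b, int)
    return a + b

def specialize(func, hint):
    # Hypothetical dispatcher: given an 'int' hint, return the fast path;
    # otherwise fall back to the fully dynamic version.
    return add_int if hint is int else func

fast_add = specialize(add_generic, int)
print(fast_add(2, 3))  # 5
```

The point of the sketch is only that a hint turns a per-call question ("what types am I dealing with?") into a one-time decision.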
It'll be pretty difficult to make a beast like Zope exploit type-hints, though, I think.
I disagree, but there's only one way to find out -- implement it (or simulate it very carefully).
By the way, the Types-SIG is working on type extensions to Python, partially for performance (OPT), partially for better error checking (ERR) and partially for better documentation (DOC). See:
http://www.python.org/sigs/types-sig/
And also see the new compiler SIG:
Very interesting, thanks.
Just imagine -- while Zope is running, thousands and thousands of the same little checks are performed continuously, over and over again, amounting to no uncertain overhead.
That's of course true. I heard Christian Tismer talk about making a register machine interpreter for Python, which could potentially speed things up quite a bit. I shall ask him about this idea again sometime. :)
Not potentially. Definitely. I thought about this yesterday, and I got to thinking that it would probably be pretty simple to create a dynamic native-code generator that translates bits of bytecode on the fly, with the necessary stubs implemented in a small helper assembler written in C. Something like:

    def Halibut():
        ...

    import codegen
    c = codegen.Translate(Halibut)
    c()  # Call native-code version of Halibut

and then with whole modules:

    import pickle, codegen
    pickle = codegen.Translate(pickle)

Internally, Translate() would disassemble the bytecode, generate native assembly language code, and pass it to the helper C module, which would store it in an executable buffer and fix up the necessary references to the Py* API symbols. Just a thought. It would be an interim solution for certain CPU-intensive modules, in anticipation of such built-in functionality in CPython itself. Possibly.
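As an aside, the bytecode such a Translate() would have to walk can already be inspected with the standard dis module (codegen and Translate are hypothetical; halibut below is just a stand-in function):

```python
import dis

def halibut(x):
    return x * 2 + 1

# Pretty-print the bytecode a hypothetical translator would consume:
dis.dis(halibut)

# The instruction stream is also available programmatically, which is
# the form a code generator would actually iterate over:
for instr in dis.get_instructions(halibut):
    print(instr.opname, instr.argval)
```

Each instruction carries an opcode name and argument, which is exactly the level at which a bytecode-to-native translator would pattern-match.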
Python's biggest overhead comes from its dynamism -- the dynamic typing, as explained, combined with the object allocation system: most things are objects, so lots of objects are constantly spawned and torn down, often in rapid succession, because the compiler/interpreter can make few hard analyses of when things are needed. Python also has high function-call overhead.
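A crude way to observe the function-call overhead directly (f and N are invented for the example; absolute timings are machine-dependent, only the ratio is interesting):

```python
import time

def f(x):
    return x + 1

N = 1_000_000

# Same arithmetic performed inline...
t0 = time.perf_counter()
total = 0
for i in range(N):
    total = total + 1
t_inline = time.perf_counter() - t0

# ...and behind a function call, paying frame setup/teardown each time.
t0 = time.perf_counter()
total = 0
for i in range(N):
    total = f(total)
t_call = time.perf_counter() - t0

print(f"inline: {t_inline:.3f}s  call: {t_call:.3f}s")
```

On most interpreters the call loop is noticeably slower, which is the overhead being discussed.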
All true. Anyway, the only thing you're wrong about is that Pythoneers aren't interested in improving performance. They are, but I grant you they're not performance nuts like users of C or C++. :)
Well, I guess they're too busy arguing about whether Python should allow assignments as expressions so you could do "while line = f.readline():". ;-)
Regards,
Martijn
-- Alexander Staubo http://alex.mop.no/ "Do not go gentle into that good night/old age should rave and burn against the close of day/Rage, rage against the dying of the light." --Dylan Thomas
Alexander Staubo wrote:
From: Martijn Faassen [mailto:faassen@vet.uu.nl] [I list a number of projects about improving Python]
People are indeed working on Python, but not to the extent of more popular projects like Linux or even, I believe, Perl.
I don't know enough about Perl to judge this, but to me there seems to be plenty of activity out there. Of course there could always be more, but lots is happening. We have _three_ independent implementations of Python, for instance (Python, JPython and Vyper), and then there are projects like Python2C.
Compiler/language projects have, historically, not thrived in the collaborative world of open source software -- for very specific reasons inherent in the very concept of languages and compilers. Still, I'm very impressed with the evolution of next-gen GCC (formerly EGCS).
Sure, it's not easy to do. I would dispute that compilers/languages have not thrived in the open source world, though. There seem to be quite a few successful open source languages around. [assessment of Vyper with which I agree]
Anyway, while I have not been monitoring the Python type/interface SIG, it seems to be moving quite slowly.
That's the nature of discussion lists, almost. There are some prototype implementations of type-checking systems for Python, though (I think two or three independent ones, in fact). [Python people say speed is generally not the problem]
It is _less_ of a problem, but most Pythoneers are indeed interested in speeding up the language. I haven't heard of zealots who want to hold back speedups, as long as they don't do away with the power.
I'm not saying they're holding back, but rather that they're dismissing speed and scalability as a problem. I firmly believe that speed is Python's number one problem today, and it's keeping Python from being used in more applications.
Hm, but in my experience Python has been shown to be plenty fast enough for the majority of applications. You're probably right though, and I agree completely that a faster Python would be very nice.
Python has plenty of problems, but they're tiny by comparison, and imho they're mostly artifacts of Python's currently somewhat anemic syntax (e.g., only two kinds of loops, no "switch", no protected members, no constants, no syntax for declaring mutability).
I suggest you go discuss these things on comp.lang.python. :) I don't miss switch. Functions are first-class objects, and dictionaries can be used instead of switch when necessary. Also, good object-oriented design tends to kill switch statements anyway. I think the two loops are fine, though it would be nice for the current constructs to know about iterators. I believe Guido is taking that into consideration. Also there's the list-comprehension proposal, which may help here. I don't think you'll see protected members any time soon. I'm not sure what you mean by syntax for declaring mutability, though it sounds interesting.
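For instance, a dictionary of first-class functions gives you a switch substitute, default case included (the verbs and handlers below are made up for the example):

```python
# Dictionary dispatch as a switch substitute: because functions are
# first-class objects, a dict can map case labels to handlers.

def on_get():
    return "fetching"

def on_put():
    return "storing"

handlers = {"GET": on_get, "PUT": on_put}

def dispatch(verb):
    # dict.get provides the default case for free
    return handlers.get(verb, lambda: "unsupported")()

print(dispatch("GET"))   # fetching
print(dispatch("POST"))  # unsupported
```

Unlike a switch statement, the dispatch table is itself a value, so it can be built, extended, or swapped at runtime.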
Imho Python's second big problem is what you might call, for lack of a better term (I'm not wearing my compiler-writer hat right now, sorry), contractual semantics -- anything from static typing (pass a string to a function that takes an int and you get either a compilation error or an exception) to interfaces (the ability to enforce semantic compatibility) to design-by-contract constructs.
The Types-SIG is reaching consensus about optional static typing, and Guido does want to include that in the language in the future (beginnings in 1.7, full implementation in Python 3000, aka Python 2). Interfaces are being considered too, and Jim Fulton in fact has an interfaces proposal out there (which may get used in Zope's code base). I think using a unit testing framework can substitute for design-by-contract constructs myself. [snip]
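A sketch of the unit-tests-as-contracts idea (isqrt and its checks are invented for illustration): the precondition becomes a raised exception, the postcondition becomes an assertion, and the test exercises both in the way a design-by-contract system would enforce automatically.

```python
def isqrt(n):
    """Integer square root with contract-style checks."""
    if n < 0:
        raise ValueError("n must be non-negative")   # precondition
    r = int(n ** 0.5)
    assert r * r <= n < (r + 1) * (r + 1)            # postcondition
    return r

def test_isqrt():
    # Exercise the same guarantees a contract system would check per call.
    for n in range(1000):
        r = isqrt(n)
        assert r * r <= n < (r + 1) * (r + 1)
    try:
        isqrt(-1)
    except ValueError:
        pass
    else:
        raise AssertionError("precondition not enforced")

test_isqrt()
print("contract checks passed")
```

The difference is that contracts are checked on every real call, while tests check a sampled set of calls up front; for many programs the latter is enough.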
It'll be pretty difficult to make a beast like Zope exploit type-hints, though, I think.
I disagree, but there's only one way to find out -- implement it (or simulate it very carefully).
Okay. [snip code generator extension idea] Something like that would definitely be neat. The hard part would be in the translation process, of course. The other part is in the interfacing with regular Python; do you allow your translated function to refer to other Python objects? If you do, you're opening a can of worms. If you don't, then it starts to look like the 'Swallow' subset of Python I proposed once; a strict subset of Python with added full type annotations, which is translatable into C or native machine code.
It would be an interim solution for certain CPU-intensive modules, in anticipation of such built-in functionality in CPython itself. Possibly.
If you're starting such a project I want to hear more about it! [snip]
All true. Anyway, the only thing you're wrong about is that Pythoneers aren't interested in improving performance. They are, but I grant you they're not performance nuts like users of C or C++. :)
Well, I guess they're too busy arguing about whether Python should allow assignments as expressions so you could do "while line = f.readline():". ;-)
Not to mention indentation! Anyway, the people who keep bringing these things up don't tend to be the Python nuts (mostly they're the newbies). Regards, Martijn