most of the time I don't encounter performance problems at all. Should that happen, I
think simple caching will be good enough. I have worked on XML storage and can save and
load many modules without issues. Saving the state of modules before global
optimization, or even afterwards, should save some time.
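To illustrate the kind of caching I mean, here is a very rough sketch (the function
names and cache layout are made up for illustration, and I use pickle here where the
actual XML storage would go):

    import hashlib
    import os
    import pickle

    CACHE_DIR = os.path.expanduser("~/.nuitka-module-cache")  # hypothetical location

    def _cache_path(source_filename):
        # Key the cache on the module source contents, so that any edit
        # invalidates the cached state automatically.
        with open(source_filename, "rb") as f:
            digest = hashlib.sha256(f.read()).hexdigest()
        return os.path.join(CACHE_DIR, digest)

    def load_module_state(source_filename):
        # Return the saved module state, or None on a cache miss.
        path = _cache_path(source_filename)
        if os.path.exists(path):
            with open(path, "rb") as f:
                return pickle.load(f)
        return None

    def save_module_state(source_filename, module_state):
        # Persist the module state, e.g. before or after global optimization.
        if not os.path.isdir(CACHE_DIR):
            os.makedirs(CACHE_DIR)
        with open(_cache_path(source_filename), "wb") as f:
            pickle.dump(module_state, f)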
But generally, performance is not the issue I am trying to address there, merely memory
usage. Keeping all modules in memory at the same time, without need, causes a lot of
memory to be used.
Generally I have found Nuitka to be reasonably fast. Of course, for performance the
solution ought to be to compile Nuitka with Nuitka, which I do as a test, but it has not
seemed necessary so far.
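That self-compilation is essentially a matter of pointing Nuitka at its own entry
script from a source checkout, roughly like this (exact flags differ between versions,
so treat it as a sketch):

    python -m nuitka --follow-imports bin/nuitka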
One other thing to note is that for cross compilation, say Linux amd64 to ARM, or Linux
to Windows, it would be nice to be able to use a slave interpreter that Nuitka connects
to. This slave would have to answer compile time questions, like 2**33, whose result (on
Python2) might be int or long, depending on the architecture. By caching the replies of
those slave Pythons for later rounds, asking again would become unnecessary. Obviously,
this could easily hurt performance.
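A minimal sketch of what such a slave interpreter could look like (everything here is
invented for illustration; the binary name, the JSON protocol and the class are not
existing Nuitka code):

    import json
    import subprocess

    # Program run by the slave Python: evaluate compile time questions on the
    # target architecture and send the result and its type name back.
    SLAVE_PROGRAM = r"""
    import json, sys
    while True:
        line = sys.stdin.readline()
        if not line:
            break
        value = eval(json.loads(line))
        sys.stdout.write(json.dumps(
            {"repr": repr(value), "type": type(value).__name__}) + "\n")
        sys.stdout.flush()
    """

    class SlaveInterpreter:
        def __init__(self, python_binary):
            # python_binary would be a Python for the target, e.g. a
            # qemu-wrapped ARM Python or a Wine-run Windows Python.
            self.process = subprocess.Popen(
                [python_binary, "-c", SLAVE_PROGRAM],
                stdin=subprocess.PIPE, stdout=subprocess.PIPE,
                universal_newlines=True,
            )
            self.cache = {}

        def ask(self, expression):
            # Cache the reply, so later compilation rounds need not ask again.
            if expression not in self.cache:
                self.process.stdin.write(json.dumps(expression) + "\n")
                self.process.stdin.flush()
                self.cache[expression] = json.loads(
                    self.process.stdout.readline())
            return self.cache[expression]

    # Hypothetical usage; on a 32 bit target the answer would be a long:
    # slave = SlaveInterpreter("arm-linux-python2")
    # slave.ask("2**33")  -> {"repr": "8589934592L", "type": "long"}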
But in principle, running Nuitka on PyPy with a slave CPython could then work, as the
host Python would not actually be used for anything.
For me, it is always the C compilation that takes the most time. Other issues are
scalability problems that we should address in Nuitka. Many attribute assignments or
many global variable assignments used to break down performance, and I am sure there
are more cases like that. I am pretty sure that the optimization of Nuitka has not seen
a lot of optimization itself. :)
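To give an idea of the kind of scalability test I mean, here is a toy harness
(hypothetical, not part of Nuitka's test suite) that generates modules consisting of
nothing but global variable assignments and times their compilation:

    import subprocess
    import time

    def generate_module(path, count):
        # A module that is nothing but `count` global variable assignments.
        with open(path, "w") as f:
            for i in range(count):
                f.write("g_%d = %d\n" % (i, i))

    for count in (1000, 5000, 10000):
        generate_module("stress.py", count)
        start = time.time()
        subprocess.check_call(["python", "-m", "nuitka", "--module", "stress.py"])
        print("%5d assignments: %.1f seconds" % (count, time.time() - start))

When the total time grows much faster than the assignment count, that is the kind of
breakdown I mean.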
I know Nuitka aims to make Python as fast as possible, and in an ideal world this
wouldn't be needed, but... It would be nice if Nuitka could run on one interpreter
(say, PyPy) but target another (CPython).
Compilation times for even small projects can be pretty long, and much of the time is
spent running through optimizations. Being able to use PyPy for the heavy lifting (at
least until Nuitka can be bootstrapped) would help speed up development.