On Thu, Nov 09, 2006 at 11:40:26PM +0100, Kjetil Torgrim Homme wrote:
> > Not to me, I'm afraid. To me, the concept of ${} as boundaries for
> > "magic inside" asks for strict inside-out evaluation of all magic
> > enabled by extensions. Should we ever get new ${} extensions, do we
> > really want to specify a total order for all of them?
> yes, I think so. being able to parse this without backtracking is a big
> plus in my book. I don't see a compelling use case for nesting these
> constructs. if we introduce functions in a similar syntax, I don't
> think it will be a big problem that variables expand before the
> functions are evaluated. supporting non-constant variable names is just
> asking for problems, IMHO.
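For concreteness, the single-pass property being argued for might look like the sketch below (a hypothetical expander in Python; the exact token grammar and the space-separated hex-octet form are my assumptions, not anything agreed in the draft). Each ${...} is resolved in one left-to-right scan, and substituted text is never re-scanned, so no backtracking is needed:

```python
import re

# Hypothetical token grammar: either an encoded-character form
# ${hex: 40 41} or a plain variable reference ${name}. One regex,
# one pass; the replacement text is not re-expanded.
TOKEN = re.compile(r"\$\{(?:hex:\s*([0-9a-fA-F ]+)|([A-Za-z_][A-Za-z0-9_]*))\}")

def expand(template, variables):
    def repl(m):
        hexbody, name = m.group(1), m.group(2)
        if hexbody is not None:
            # ${hex: 40} -> the byte 0x40, decoded at expansion time
            return bytes(int(b, 16) for b in hexbody.split()).decode("latin-1")
        return variables.get(name, "")
    return TOKEN.sub(repl, template)
```

Under this ordering, a variable whose value happens to contain "${" is left alone, which is exactly why non-constant variable names fall out of the picture.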
On evaluating variables before anything else: of course that forbids
functions from ever opening a new scope with new variables, or from
having side effects on variables. Exim uses the latter to store the
results of lookups. I don't particularly like that, but I use it and
must admit I don't have a better idea.
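For reference, the Exim idiom in question is roughly the following (from memory of Exim's string-expansion syntax; the file path is invented). A successful lookup stores its result in the $value variable as a side effect, and the success branch then reads it:

```
${lookup{$local_part}lsearch{/etc/aliases}{$value}{unknown}}
```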
To me, it does not feel like a good idea to forbid such extensions at
this point.
> note that with this ordering, the encoded-character extension can be
> byte-compiled away entirely, a very nice property for a language which
> is meant to be lightweight and simple.
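The byte-compilation point can be sketched as follows (again a hypothetical compiler in Python, under the same assumed token grammar as above): the encoded-character forms are folded into string literals at compile time, and only variable references survive to run time.

```python
import re

TOKEN = re.compile(r"\$\{(?:hex:\s*([0-9a-fA-F ]+)|([A-Za-z_][A-Za-z0-9_]*))\}")

def compile_template(template):
    """Compile a template into a list of parts: plain strings (with any
    ${hex: ...} already decoded into literal characters) and
    ('var', name) tuples for the references left to resolve later."""
    parts, pos = [], 0
    for m in TOKEN.finditer(template):
        if m.start() > pos:
            parts.append(template[pos:m.start()])
        hexbody, name = m.group(1), m.group(2)
        if hexbody is not None:
            # Encoded characters become ordinary literal text here,
            # so the extension leaves no trace in the compiled form.
            parts.append(bytes(int(b, 16) for b in hexbody.split()).decode("latin-1"))
        else:
            parts.append(("var", name))
        pos = m.end()
    if pos < len(template):
        parts.append(template[pos:])
    return parts
```

A compiled template like this can be evaluated by concatenating the literal parts and the current variable values, with no parsing at run time.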
It also means you can never ever add an extension to have ${hex}
work on variable expansions, i.e. one that adds true functions.
I think we are looking at ${hex} from two perspectives: I think of it
as a function of (in the absence of other extensions, constant)
arguments, and you think of it purely as quoting. In case we don't find
a unification we are all happy with, I suggest finding a new name that
does not look like a function at all. Like ${0xdeadbeef}, or
${\xdeadbeef} and ${\u0040}, or just anything that is not an
identifier.
Michael