Now that I think about it, this complaint has been lodged
in this forum many times, and I've turned a deaf ear to some extent.
But it has finally sunk in: the stuff about "tokenization"
needs to be expanded to be as detailed as a lex specification.
So, barring objections from this working group, I'm going to make
another revision to address this issue. I expect it will take a week
to write and a week to review. So the target would be June 26 or so.
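To illustrate the level of precision a lex-style specification gives, here is a minimal sketch (purely illustrative, not drawn from the spec itself) of lex-like token rules for markup, written in Python; the token names (STAGO, ETAGO, TAGC) echo SGML terminology, and the rules themselves are assumptions for the sake of example:

```python
import re

# Illustrative only: a few lex-style token rules for markup, showing
# the kind of rule-per-token precision a real specification would need.
TOKEN_RULES = [
    ("ETAGO",  r"</[A-Za-z][A-Za-z0-9.-]*"),  # end-tag open, e.g. "</title"
    ("STAGO",  r"<[A-Za-z][A-Za-z0-9.-]*"),   # start-tag open, e.g. "<title"
    ("TAGC",   r">"),                         # tag close
    ("ENTITY", r"&[A-Za-z][A-Za-z0-9]*;?"),   # entity reference, e.g. "&amp;"
    ("TEXT",   r"[^<&]+"),                    # character data
]
MASTER = re.compile("|".join(f"(?P<{name}>{pat})" for name, pat in TOKEN_RULES))

def tokenize(markup):
    """Yield (token-name, lexeme) pairs in document order."""
    for m in MASTER.finditer(markup):
        yield m.lastgroup, m.group()
```

For example, `tokenize("<title>A &amp; B</title>")` yields a STAGO, a TAGC, TEXT, an ENTITY, more TEXT, an ETAGO, and a final TAGC; a real specification would additionally pin down corner cases (unterminated entities, stray "<", whitespace handling) that prose descriptions tend to leave ambiguous.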
Daniel W. Connolly "We believe in the interconnectedness of all things"
Research Technical Staff, MIT/W3C
<connolly@w3.org> http://www.w3.org/hypertext/WWW/People/Connolly