Re: Globalizing URIs

Gavin Nicol (gtn@ebt.com)
Wed, 2 Aug 95 23:46:25 EDT

>I don't like this model, but prefer another one:
>
>Let me explain this via an 'ftp' example.
>
>The FTP protocol doesn't care what character set your file system
>uses. You open an 8-bit connection and send US-ASCII characters to the
>server. If you want to retrieve a file, you send 'RETR xxxx' and when
>you want to store a file, you send 'STOR xxxx', where 'xxxx' are
>characters *NOT* in the native character set of the file system, but
>rather in whatever transcription of that character set is made
>available by the FTP server.

I don't understand how this works, especially the "transcription"
part. How is the receiving server supposed to know what name to store
the file under? Is "%B0%F5%BA%FE.html" translated back to insatsu.html
for storage purposes?
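
To make the question concrete, here is a minimal sketch (in Python,
not part of either message) of what the receiving side would have to
do if the transcription is simply %-escaped EUC-JP octets. The
encoding name and the decoding step are my assumptions; the model
above doesn't specify them.

    import urllib.parse

    # Assumption: the URL name is %-escaped EUC-JP octets for
    # "insatsu" (printing), as in the example above.
    url_name = "%B0%F5%BA%FE.html"

    raw = urllib.parse.unquote_to_bytes(url_name)  # b'\xb0\xf5\xba\xfe.html'

    # The server can only recover the native spelling of insatsu.html
    # if it also knows (or guesses) which character set the octets
    # were taken from.
    native_name = raw.decode("euc-jp")
    print(native_name)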

I have three problems with this model:

1) It would probably require some kind of database to map each local
filename to its HTTP representation, because transcription collisions
are possible and because HTTP is stateless.
2) Without a standard mapping it seems difficult for a browser to
decide what to send to the server. Yes, I know people will say that
the server decides, because it makes the URLs available in the first
place, but what happens if a server sends me an EUC URL and I send
an SJIS one back? (See the sketch after this list.)
3) URLs are *not* used solely in HTTP transactions.
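
Here is the sketch promised in point 2 (the encodings and the
filename are assumptions for illustration): the same name re-encoded
by a client in Shift-JIS yields a different octet sequence, so a
server that compares URLs byte-for-byte will never match it against
the EUC form it published.

    import urllib.parse

    # The EUC-JP form of the name, as the server might publish it.
    server_url = "%B0%F5%BA%FE.html"
    name = urllib.parse.unquote_to_bytes(server_url).decode("euc-jp")

    # A client working in Shift-JIS re-encodes the same characters and
    # %-escapes the result: a different octet string for the same name.
    client_url = urllib.parse.quote(name.encode("shift_jis"))

    print(server_url)   # %B0%F5%BA%FE.html
    print(client_url)   # different %-escapes; byte-wise comparison fails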