[42] in Coldmud discussion meeting
Re: Regular expressions
daemon@ATHENA.MIT.EDU (Wed Nov 17 04:30:40 1993)
To: coldstuff@MIT.EDU
In-Reply-To: Your message of "Tue, 16 Nov 1993 17:31:55 EST."
Date: Wed, 17 Nov 1993 03:23:29 -0600
From: Adam Harris <harris@cs.uchicago.edu>
I'd like to cast a vote for in-server regexp routines.
It's true that they are "unreadable" to the average, untutored user.
On the other hand, it seems like there's a huge base of people
who know and use "regular expressions". It's implemented all over
the place (aside from emacs, already mentioned, there's perl, grep,
MOO implementations, for a start). And to those users---like myself---
grep descriptions are *decodable*, if not entirely readable.
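To illustrate what I mean by "decodable": a grep-style pattern can be read off piece by piece even if it looks opaque at first glance. A small sketch in Python (the pattern and strings here are my own hypothetical example, not anything from Coldmud):

```python
import re

# A grep-style pattern for an email header field such as
# "From: Adam Harris <harris@cs.uchicago.edu>".
# Decoded piece by piece:
#   ^            anchor at start of line
#   ([A-Za-z-]+) capture the field name (letters and hyphens)
#   :            a literal colon
#   \s*          optional whitespace
#   (.+)         capture the rest of the line as the field body
#   $            anchor at end of line
header = re.compile(r"^([A-Za-z-]+):\s*(.+)$")

m = header.match("From: Adam Harris <harris@cs.uchicago.edu>")
print(m.group(1))  # field name: "From"
print(m.group(2))  # field body: "Adam Harris <harris@cs.uchicago.edu>"
```

Unreadable at a glance, perhaps, but every character has a fixed meaning you can look up, which is more than can be said for most ad hoc pattern syntaxes.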
This is not to say that, in general, we should restrict ourselves
to what is currently the standard. That would be a mistake. But in
this case I feel the standard is a reasonably good one. At least,
I haven't really seen a better one.
(How many times have I wished MS Word had reg-exps! Ah, serves me
right.)
> > Fifth, I'm kind of wary of totally in-server robots. I think that
> > decision mechanisms in robots should generally be done client-side
> > (there are plenty of client tools for this).
>
> In my experience, in-server robots are the way to go. Client robots just
> can't get at the information as expediently or easily using text-only
> protocols. Client robots also require the implementation in a different
> language than the system which they are interfacing with, which deters
> otherwise creative users.
>
Could someone give me a clue on what a "client robot" is?
.......................................................Adam Harris
............................................harris@cs.uchicago.edu