If you add a user_agent: config attribute, that string is used for HTTP GETs. However, it is NOT used when parsing the returned robots.txt.
If I set "user_agent: agent-foo" in htdig.conf, and set robots.txt to:
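(The original robots.txt was not included; a hypothetical version that would reproduce the behavior might be:)

```
# Hypothetical robots.txt: allow only agent-foo, deny everyone else.
User-agent: agent-foo
Disallow:

User-agent: *
Disallow: /
```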
...then htdig will fetch robots.txt via HTTP GET using the User-Agent string "agent-foo". However, it will parse robots.txt as if its name were "htdig" (i.e. myname = htdig). It therefore concludes it is not allowed to dig.
I realize that robots.txt doesn't provide real security. But it does provide a means of allowing indexing only locally, since a unique User-Agent string can be created for the local indexer.
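To illustrate the mismatch outside of htdig: Python's urllib.robotparser shows how the same robots.txt rules answer differently depending on which agent name is used at parse/check time. The agent names and rules below are the hypothetical ones from this report, not anything shipped with htdig.

```python
# Sketch (not htdig code): demonstrate that robots.txt permission depends
# on the agent name used when *checking* rules, not the one used to fetch.
from urllib.robotparser import RobotFileParser

robots_txt = """\
User-agent: agent-foo
Disallow:

User-agent: *
Disallow: /
"""

rp = RobotFileParser()
rp.parse(robots_txt.splitlines())

# Checking with the configured agent name grants access...
print(rp.can_fetch("agent-foo", "/index.html"))  # True

# ...but checking with the compiled-in name "htdig" matches the
# "User-agent: *" block and is denied -- the effect described above.
print(rp.can_fetch("htdig", "/index.html"))      # False
```

This is why setting user_agent: alone is not enough: the fetch happens as "agent-foo", but the permission check still happens as "htdig".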