
Simulating processing delay/jitter with the In-Memory LDAP server

2013-12-12
  • Oscar Golden L. - 2013-12-12

    Hi,
    I am currently in the process of fine-tuning various time-sensitive parameters (serverSet and LDAP connection pool). I am basically trying to see how my application behaves when connecting to an LDAP server with varying processing delays and jitter.
    I was wondering if there is a simple way to control the processing delay of the In-memory LDAP server.
    Thanks.

     
  • Neil Wilson - 2013-12-12

    If you want a uniform delay for all operations in a server instance, then you could simply use the InMemoryDirectoryServer.setProcessingDelayMillis method.
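
    For example, a minimal sketch along these lines (untested; the base DN, the 250 ms delay, and the timed search are just illustrative choices) should be all you need:

      import com.unboundid.ldap.listener.InMemoryDirectoryServer;
      import com.unboundid.ldap.listener.InMemoryDirectoryServerConfig;
      import com.unboundid.ldap.sdk.LDAPConnection;
      import com.unboundid.ldap.sdk.SearchResult;
      import com.unboundid.ldap.sdk.SearchScope;

      public class UniformDelayTest
      {
        public static void main(final String[] args) throws Exception
        {
          // Create and start an in-memory server with a single base entry.
          final InMemoryDirectoryServerConfig config =
               new InMemoryDirectoryServerConfig("dc=example,dc=com");
          final InMemoryDirectoryServer ds = new InMemoryDirectoryServer(config);
          ds.add("dn: dc=example,dc=com",
               "objectClass: top",
               "objectClass: domain",
               "dc: example");
          ds.startListening();

          // Every operation the server processes will now be delayed by
          // 250 ms before a response is returned.
          ds.setProcessingDelayMillis(250L);

          // Time a simple search to confirm the delay is applied.
          final LDAPConnection conn = ds.getConnection();
          final long startTime = System.currentTimeMillis();
          final SearchResult result = conn.search("dc=example,dc=com",
               SearchScope.BASE, "(objectClass=*)");
          System.out.println("Search returned " + result.getResultCode() +
               " after " + (System.currentTimeMillis() - startTime) + " ms");

          conn.close();
          ds.shutDown(true);
        }
      }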

    If you want something more complex, then you could create a custom LDAPListenerRequestHandler that intercepts a request, examines it to determine whether and how long to sleep, and then passes the request on to the InMemoryRequestHandler. Note that if you go this route, you'll have to use the generic LDAPListener framework rather than using the InMemoryDirectoryServer class itself, but that should be pretty straightforward (you can look at the source for the InMemoryDirectoryServer constructor to see how to do that).
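
    A rough sketch of that kind of handler might look like the following (untested; the class name and the base-delay/jitter parameters are illustrative, not something the SDK provides). It wraps another LDAPListenerRequestHandler and sleeps for a randomized amount of time before delegating each request:

      import java.util.List;
      import java.util.Random;
      import com.unboundid.ldap.listener.LDAPListenerClientConnection;
      import com.unboundid.ldap.listener.LDAPListenerRequestHandler;
      import com.unboundid.ldap.protocol.AddRequestProtocolOp;
      import com.unboundid.ldap.protocol.BindRequestProtocolOp;
      import com.unboundid.ldap.protocol.CompareRequestProtocolOp;
      import com.unboundid.ldap.protocol.DeleteRequestProtocolOp;
      import com.unboundid.ldap.protocol.ExtendedRequestProtocolOp;
      import com.unboundid.ldap.protocol.LDAPMessage;
      import com.unboundid.ldap.protocol.ModifyDNRequestProtocolOp;
      import com.unboundid.ldap.protocol.ModifyRequestProtocolOp;
      import com.unboundid.ldap.protocol.SearchRequestProtocolOp;
      import com.unboundid.ldap.sdk.Control;
      import com.unboundid.ldap.sdk.LDAPException;

      // Sleeps for a base delay plus random jitter before passing each
      // request on to the wrapped handler (e.g. an InMemoryRequestHandler).
      public final class DelayAndJitterRequestHandler extends LDAPListenerRequestHandler
      {
        private final LDAPListenerRequestHandler delegate;
        private final long baseDelayMillis;
        private final int maxJitterMillis;
        private final Random random = new Random();

        public DelayAndJitterRequestHandler(final LDAPListenerRequestHandler delegate,
             final long baseDelayMillis, final int maxJitterMillis)
        {
          this.delegate = delegate;
          this.baseDelayMillis = baseDelayMillis;
          this.maxJitterMillis = maxJitterMillis;
        }

        // Called once per client connection; wrap the per-connection handler
        // that the delegate creates so its requests are delayed too.
        @Override public LDAPListenerRequestHandler newInstance(
             final LDAPListenerClientConnection connection) throws LDAPException
        {
          return new DelayAndJitterRequestHandler(delegate.newInstance(connection),
               baseDelayMillis, maxJitterMillis);
        }

        @Override public void closeInstance() { delegate.closeInstance(); }

        // Sleep for the base delay plus up to maxJitterMillis of random jitter.
        private void delay()
        {
          final long sleepMillis = baseDelayMillis +
               ((maxJitterMillis > 0) ? random.nextInt(maxJitterMillis) : 0);
          try { Thread.sleep(sleepMillis); }
          catch (final InterruptedException e) { Thread.currentThread().interrupt(); }
        }

        @Override public LDAPMessage processAddRequest(final int id,
             final AddRequestProtocolOp op, final List<Control> c)
        { delay(); return delegate.processAddRequest(id, op, c); }

        @Override public LDAPMessage processBindRequest(final int id,
             final BindRequestProtocolOp op, final List<Control> c)
        { delay(); return delegate.processBindRequest(id, op, c); }

        @Override public LDAPMessage processCompareRequest(final int id,
             final CompareRequestProtocolOp op, final List<Control> c)
        { delay(); return delegate.processCompareRequest(id, op, c); }

        @Override public LDAPMessage processDeleteRequest(final int id,
             final DeleteRequestProtocolOp op, final List<Control> c)
        { delay(); return delegate.processDeleteRequest(id, op, c); }

        @Override public LDAPMessage processExtendedRequest(final int id,
             final ExtendedRequestProtocolOp op, final List<Control> c)
        { delay(); return delegate.processExtendedRequest(id, op, c); }

        @Override public LDAPMessage processModifyRequest(final int id,
             final ModifyRequestProtocolOp op, final List<Control> c)
        { delay(); return delegate.processModifyRequest(id, op, c); }

        @Override public LDAPMessage processModifyDNRequest(final int id,
             final ModifyDNRequestProtocolOp op, final List<Control> c)
        { delay(); return delegate.processModifyDNRequest(id, op, c); }

        @Override public LDAPMessage processSearchRequest(final int id,
             final SearchRequestProtocolOp op, final List<Control> c)
        { delay(); return delegate.processSearchRequest(id, op, c); }
      }

    You could then wire it up along these lines instead of creating an InMemoryDirectoryServer (again untested; passing 0 as the listen port just asks the listener to pick a free port, and the 100 ms / 50 ms values are arbitrary):

      import com.unboundid.ldap.listener.InMemoryDirectoryServerConfig;
      import com.unboundid.ldap.listener.InMemoryRequestHandler;
      import com.unboundid.ldap.listener.LDAPListener;
      import com.unboundid.ldap.listener.LDAPListenerConfig;

      public class DelayedInMemoryServer
      {
        public static void main(final String[] args) throws Exception
        {
          // The stock in-memory request handler, built from the usual config.
          final InMemoryDirectoryServerConfig dsConfig =
               new InMemoryDirectoryServerConfig("dc=example,dc=com");
          final InMemoryRequestHandler inMemoryHandler =
               new InMemoryRequestHandler(dsConfig);

          // Wrap it so every request sleeps ~100 ms plus up to 50 ms of jitter.
          final DelayAndJitterRequestHandler delayHandler =
               new DelayAndJitterRequestHandler(inMemoryHandler, 100L, 50);

          // Use the generic LDAPListener framework with the wrapped handler.
          final LDAPListenerConfig listenerConfig =
               new LDAPListenerConfig(0, delayHandler);
          final LDAPListener listener = new LDAPListener(listenerConfig);
          listener.startListening();
          System.out.println("Listening on port " + listener.getListenPort());
        }
      }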

     
