More surprising than necessarily wrong. "sink" is in fact operating as three different inputs to different parts of the model.

1: The part of the model that sets the features.
2: The part of the model that knows the length of the target stimulus (even if that target is yet to be presented).
3: The part of the model that knows the length of the longest string on that trial (even if that string is yet to be presented).
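To make the fan-out concrete, here is a minimal hypothetical sketch (this is not lazyNut's actual API; the function and field names are invented for illustration) of a single "sink" marker being routed to those three components:

```python
def route_sink(trial):
    """Hypothetical: fan one 'sink' event out to the three components that read it.

    `trial` is an invented dict holding the strings that will appear on the
    trial; note the target / longest string may not have been presented yet,
    which is exactly why the model needs this look-ahead.
    """
    target = trial["target"]
    strings = trial["strings"]  # every string scheduled for this trial
    return {
        # 1: the component that sets the features
        "feature_setter": "sink",
        # 2: the component that needs the target's length in advance
        "target_length": len(target),
        # 3: the component that needs the length of the longest string in advance
        "max_length": max(len(s) for s in strings),
    }

routed = route_sink({"target": "cat", "strings": ["cat", "prime", "mask"]})
```

The point of the sketch is only that the three readers consume the same marker for different purposes, two of which require information about stimuli that are still in the future.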

You should see these three distinct occurrences listed in the scm_ldt (or whatever) table but not in the standard (e.g., IA) ldt table. A related (but probably desirable) phenomenon can be seen if you use sandwich priming with IA.

Basically it happens because, for lazyNut to do real SCM (and the current setup is much closer to real SCM than the one you had available originally for the simulations in this paper), there need to be hacks that allow the model to see the future.