#154 order of super class mapping is wrong

Dozer v.4.3

I found a very nasty bug which took me a long time to isolate, as it didn't always make
itself visible.

The bug: the super classes determined in MappingProcessor.checkForSuperTypeMapping are
put into an unordered HashSet, so its contents are processed in an unpredictable order.
This can result in nasty bugs when a super class isn't processed/mapped
before its subclass.

In my case it's important that the super class is mapped before its subclass, because
the values in the super class are used to calculate the hashCode.
Why? Because the mapped object is put into a HashSet while its hashCode is still incorrect
(the super classes aren't mapped yet), so the item is placed in an incorrect position
in the HashSet. That is: after it has been put into the HashSet, the super classes are mapped,
the hashCode changes, and the item can never be found in the HashSet again.
That is, HashSet.contains(item) returns false even though that very item was put into
the same HashSet during the mapping. (Reminder: a HashSet is backed by a HashMap, and a key object in the map is located through its hashCode.)
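This can be reproduced in isolation with plain Java collections; here is a minimal sketch (the Item class below is just a stand-in for illustration, not the Dozer code), showing that mutating a field that feeds hashCode after insertion makes the set lose the element:

```java
import java.util.HashSet;
import java.util.Objects;
import java.util.Set;

public class HashCodeMutationDemo {

    // Stand-in for a mapped DTO whose hashCode depends on a field
    // that is only filled in once the super class is mapped.
    static class Item {
        String id;

        @Override
        public int hashCode() {
            return id == null ? 0 : id.hashCode();
        }

        @Override
        public boolean equals(Object o) {
            return o instanceof Item && Objects.equals(((Item) o).id, id);
        }
    }

    static boolean containsAfterMutation() {
        Set<Item> childs = new HashSet<>();
        Item item = new Item();       // id still null, hashCode() == 0
        childs.add(item);             // stored in the bucket for hash 0
        item.id = "42";               // "super class mapped afterwards": hash changes
        return childs.contains(item); // lookup uses the new hash -> not found
    }

    public static void main(String[] args) {
        System.out.println(containsAfterMutation()); // prints "false"
    }
}
```

Even though the very same instance is in the set, contains() returns false because the stored entry was bucketed under the old hash value.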

BTW: the patch below also solved bug 1865775

Let me explain this with an example that shows when this happens:
Suppose the following domain model: a class Item extends Base, implements an interface Intf, and contains a collection
of child Items:

class Item extends Base implements Intf {
    private Item parent;
    private Set<Intf> childs;

    // ... getters and setters for childs and parent ...
}

interface Intf {
    // ... getters and setters for childs and parent ...
}

The Base class:

class Base {
    private String id;
    // ... getters/setters for id ...

    public int hashCode() {
        return this.id.hashCode();
    }

    public boolean equals(Object other) {
        // ... compares by id ...
    }
}
We will map this to an equivalent DTO object: ItemDto, which looks the same.
The Dozer mapping contains an entry from Item to ItemDto with a hint for the types contained
in the Set.
Note further in the above that the item is contained in its parent as a child, so that:
Item item;
item.getParent().getChilds().contains(item) == true...

If we map Item to ItemDto, the following happens in the current Dozer code:
We map an instance A of Item to ItemDto.
When we start mapping A in MappingProcessor.map(...) we first create an instance of ItemDto, let's say B,
and put that in the mappedFields field so we can reuse the same instance B later.
In MappingProcessor.map(...) we determine the super class mappings through the
method checkForSuperTypeMapping and store them in the local variable superClasses. superClasses then contains
only the Base mapping. Then the interface mappings are added, so that superClasses contains the
Base and Intf mappings (also, if the Intf mapping isn't contained in the Dozer mapping, it will create a default
mapping and add it). It will then first map the content of superClasses before mapping the actual
class. BTW: this is something I don't really understand well, as I think it could be optimized:
the superClasses will perform the same mapping, that of the interface Intf, which does exactly the
same as the mapping from Item to ItemDto. But anyway, there are probably reasons for doing it this way, let
me continue. The superClasses local is a HashSet and its content is now mapped first. If the
Intf mapping is mapped first, it goes wrong... If the Base class is mapped first, so that hashCode
returns the correct value, the mapping goes right.
If Intf is mapped first, it will map the parent field of A and will then map the childs of the parent. The childs contain A,
the one we started with. It will then map A again, but because it is stored in the mappedFields field, it will use
instance B from there and store it directly in the childs Set.
This all looks good, no?... NO... it doesn't... Note that the Base class of instance A wasn't mapped yet, so
the id of B is null, the method hashCode returns the incorrect value, and the item is stored
incorrectly in the HashSet.
When we continue and map Base of A, the id field is filled. If I then call childs.contains(B) it will return false
for sure!!
I hope you can follow me here, otherwise, just email me...
It's a very nasty bug.
What I did in the patch below: I replaced the method checkForSuperTypeMapping with my own version, which makes sure
that the superClasses local always maps the super classes first (using an ArrayList and putting the super classes first).
Like mentioned above, it also solves the other bug, 1865775, which concerned the mapping of more than one super class.

The patch works well for me, but you should add an extra JUnit test that presents the above situation.

Then there are still a few open issues that I can't really understand well, but this has already cost me way too much time, so
please have a close look at them:
1) Like mentioned above, I don't really understand why the superClasses field will contain a mapping for the interface class
when mapping. It seems that it creates some duplicate mapping and as such does too much work.
2) Suppose that the Dozer mapping doesn't contain a mapping for Intf but for Item to ItemDto instead. Like mentioned
above, the mapping is added to hint the content of the childs Set. In this case, when Dozer tries to map the field "Intf parent"
it will look for mapping information for Intf, which it will not find, because we defined mapping information between Item and ItemDto.
It will then add default mapping information to the field customMappings in the method getClassMap. I don't really understand this and it
smells, as we already defined this mapping in Dozer between Item and ItemDto, which should be the same... (my tests work, however)

I think the interface implementation/design needs a closer look because of the above and the other bug(s) about interfaces that I reported
(maybe others as well). For example: currently it's not possible to define field mappings with "is-accessible="true""
(not through getters/setters) between interface mapping definitions. To me, the interface mapping is all a bit confusing :(..



  • ed

    ed - 2008-11-15
    • priority: 5 --> 9
  • ed

    ed - 2008-11-15

    Like already noted in the above text: please have a good look at the interface mapping and how that is used in the code.
    I just noticed that an array with items contained duplicates after mapping it, when I describe the mapping in the Dozer mapping file with interfaces. However, if I describe the mapping between the items in the ArrayList as concrete classes, no duplicates appear :(

  • dmitry (lv)

    dmitry (lv) - 2008-11-18

    First of all thanks for your help! It would be more difficult to evolve an open source project without guys like you :)

    The reason I write is to ask whether switching from Set to List is the only change you had in your patch. The MappingProcessor.checkForSuperTypeMapping method has changed a little bit, so it is difficult to apply the patch as is. And your patch also seems to have indentation changes, which makes things even more scary. If the List conversion is the only addition I will apply the change and include it in the upcoming release.

    Regarding interface mapping, I think it is a separate topic and worth opening a ticket to review the whole thing.

  • ed

    ed - 2008-11-19

    Hi Dmitry,

    I understand that it's quite a big issue that I posted above, but please take your time to fully understand it. I did my best to explain it as well as possible, which isn't very easy for this nasty bug.
    I am sure you can easily merge it with your current changes if you fully understand it.

    The short answer: no, it's not a matter of changing a Set to a List. If you do that you will still have the deep interface bug, like mentioned above and posted in an earlier issue, and you would still have the problem that the root super classes are located at the end of the list, whereas they should be located at the very beginning of the list (as they must be mapped first!).

    Sorry about the indentation, apparently I pressed my format shortcut, which wasn't my intention :(...
    To make it easier, below you find the replacement of the method checkForSuperTypeMapping and its helper methods (if you replace the method you also have to replace several method calls that expect a Set instead of a Collection, but you will notice that automatically).

    Let me know if this helps; otherwise let me know your indentation and I will make a new patch.
    private List checkForSuperTypeMapping(Class srcClass, Class destClass) {
      // Check cache first
      Object cacheKey = CacheKeyFactory.createKey(new Object[] { destClass, srcClass });
      CacheEntry cacheEntry = superTypeCache.get(cacheKey);
      if (cacheEntry != null) {
        return (List) cacheEntry.getValue();
      }

      // If no existing cache entry is found, determine super type mappings.
      // Recursively walk the inheritance hierarchy.
      List superClasses = new ArrayList();
      // Need to call getRealSuperclass because proxied data objects will not return the
      // correct superclass when using basic reflection
      Class superSrcClass = MappingUtils.getRealSuperclass(srcClass);
      Class superDestClass = MappingUtils.getRealSuperclass(destClass);
      if (superSrcClass != null && superDestClass != null) {
        addSuperClassMapping(superSrcClass, superDestClass, superClasses);
      }
      if (superSrcClass != null) {
        addMapping(superSrcClass, destClass, superClasses);
      }
      if (superDestClass != null) {
        addMapping(srcClass, superDestClass, superClasses);
      }

      // Add result to cache
      cacheEntry = new CacheEntry(cacheKey, superClasses);
      superTypeCache.put(cacheEntry);

      return superClasses;
    }

    private void addSuperClassMapping(final Class srcClass, final Class destClass, final List superClasses) {
      addSuperClassMapping(srcClass, destClass, superClasses, true);
    }

    /**
     * Recursive method that walks through all super classes of the specified source and destination class to find a
     * mapping between all combinations of the source and destination classes. The combinations are added to the
     * specified super class list. When calling this method for the first time, call it with
     * <code>changeDestination</code> equal to true, so that all possible combinations between the destination/source
     * classes are determined. It will then walk up to the root super destination class, followed by walking all the
     * way up to the source root super class, so that any mappings between the root/upper classes are added first to
     * the list of super classes. We then walk down by changing the destination/source class, to search for any
     * mapping between the source and destination classes.
     * @param changeDestination if true it will determine the super class of the destination class and use that
     */
    private void addSuperClassMapping(final Class srcClass, final Class destClass, final List superClasses, final boolean changeDestination) {
      Class superClass = null;
      if (changeDestination) { // change destination
        superClass = getValidSuperClass(destClass);
        if (superClass != null) {
          addSuperClassMapping(srcClass, superClass, superClasses, true);
          superClass = getValidSuperClass(srcClass);
          if (superClass != null) {
            addSuperClassMapping(superClass, destClass, superClasses, false);
          }
        } else { // we are at the end, now iterate the other way
          superClass = getValidSuperClass(srcClass);
          if (superClass != null) {
            addSuperClassMapping(superClass, destClass, superClasses, false);
          }
        }
      } else { // change source
        superClass = getValidSuperClass(srcClass);
        if (superClass != null) {
          addSuperClassMapping(superClass, destClass, superClasses, false);
        }
      }
      // No further iteration, so check whether we have to add a mapping between the source and target.
      addMapping(srcClass, destClass, superClasses);
    }

    private void addMapping(Class srcClass, Class destClass, List superClasses) {
      ClassMap classMap = (ClassMap) customMappings.get(ClassMapKeyFactory.createKey(srcClass, destClass));
      if (classMap != null) {
        log.debug("Adding a mapping between source and target super classes. Source: " + srcClass + ". Target: " + destClass);
        superClasses.add(classMap);
      }
    }

    /**
     * @return null if no valid super class was found
     */
    private Class getValidSuperClass(Class cls) {
      if (cls == null) {
        return null;
      }
      Class superClass = MappingUtils.getRealSuperclass(cls);
      if (superClass == null || superClass.getName().equals("java.lang.Object")) {
        return null;
      }
      return superClass;
    }
  • ed

    ed - 2008-11-19

    Hi Again,

    Because I am working with a patched version I changed your pom a bit so it suits my needs.
    Anyway, you may want to add the following pom snippet, to make sure that the sources are packaged in a jar as well.

    This is very convenient, especially when using the Maven Eclipse plugin and you run something like mvn eclipse:eclipse -DdownloadSources=true. It will then automatically create the Eclipse .classpath files for you and attach the *-sources.jar, so that you can directly debug into an attached library without Eclipse giving the annoying message "can't find the source, do you want to attach it?"

    Pom snippet (in your build/plugins section):

  • dmitry (lv)

    dmitry (lv) - 2008-11-19


    I am trying to reproduce the problem with the following test case:

    public void testInheritanceDirection() {
      Item source = new Item();
      source.setId("A");

      HashSet children = new HashSet();
      Item child = new Item();
      child.setId("B");
      child.setParent(source);
      children.add(child);
      source.setChildren(children);

      ItemDTO dest = (ItemDTO) mapper.map(source, ItemDTO.class);
      assertEquals("A", dest.getId());
      assertEquals(1, dest.getChildren().size());

      ItemDTO resultingChild = (ItemDTO) dest.getChildren().iterator().next();
      assertEquals("B", resultingChild.getId());
      assertEquals(dest, resultingChild.getParent());
    }

    Note that ItemDTO does not extend any BaseDTO and contains fields: id, children and parent.
    Am I missing something? My test case works as expected. I tested it on the Trunk.

  • ed

    ed - 2008-11-19

    Yep, like that it works just fine, but you are forgetting the interface of Item (like in the pseudo code above).
    Because you didn't post the classes, I can't see if Item also extends a base class.

    Let me help you a bit with some concrete code of mine where it goes wrong (I modified the code a bit, but this should reproduce the bug).
    I added a file below with some classes:
    ContentItemGroup: the interface.
    ContentItemGroupDefault: the implementation.
    Entity: super interface
    EntityBase: super class.

    And some mapping snippets.
    You only have to put the correct mapping packages in the mapping snippets and map an instance of ContentItemGroupDefault to ContentItemGroupDTO... or you could even use the same classes as DTOs I guess (for the test)....

    I also added the code I posted earlier in another file.
    BTW: I just noticed in the code that checkForSuperTypeMapping should never return null, otherwise it will result in a NullPointerException... Maybe you could improve this a bit.

    When running the test, put a breakpoint at the following line in the method MappingProcessor.map(...):
    superClasses.addAll(ClassMapFinder.findInterfaceMappings(this.customMappings, srcClass, destClass));

    Step over this and the content of the superClasses variable should contain at least two mappings, one for the interface ContentItemGroup and one for EntityBase.
    At least then you know your superClasses variable is filled with the correct possible-bug data.
    Your test can still run OK, depending on how these mappings are processed (like explained above)...
    If the ContentItemGroup mapping is processed first, your test will fail. See above for further details.

    Hope this helps.

  • dmitry (lv)

    dmitry (lv) - 2008-11-19

    In my example Item did have an interface Base with id attribute.
    Please help me out to reproduce this. One question - what is on DTO side?

    My guess is:
    public class EntityDTO {
      private String id;
      // get/set methods
    }

    public class ContentItemGroupDTO extends EntityDTO {
      private ContentItemGroup parentGroup;
      private Set childGroups;
      // get/set methods
    }

  • ed

    ed - 2008-11-19

    Hi Dmitry,

    Don't worry and hang in there. We will get there and get the bug out of the shadows :)
    I just came back from jogging and realized that I forgot a few things, sorry for that.
    What I forgot:
    - The hashCode() part of EntityBase, so I uploaded a new version of the code.txt file below.

    - Some test code so you can see the bug...
    First of all, please use my code (later I will explain why). You could even commit your test in svn, so that I can see it and help you a bit.

    We have to map an instance of ContentItemGroup that itself also contains a ContentItemGroup as parent. This parent also contains the original ContentItemGroup as a child, which is important!
    So the test code should be something like this:

    ContentItemGroup childItem = new ContentItemGroupDefault();
    ContentItemGroup parentItem = new ContentItemGroupDefault();
    Set<ContentItemGroup> childs = new HashSet<ContentItemGroup>();
    childs.add(childItem);
    parentItem.setChildGroups(childs);
    childItem.setParentGroup(parentItem);

    Then map the childItem: map(childItem, childItemDto);

    Okkkk, now we have this, let's continue where we left off, just after the breakpoint.
    Just after the breakpoint you will see that the superClasses variable contains two mappings: one for EntityBase (super class mapping) and one for ContentItemGroup (interface mapping).
    It will then process these superClasses mappings FIRST before continuing to map the childItem fields.
    If it then processes the ContentItemGroup mapping first, your test will FAIL!...

    Why?... quite simple: if it first processes the ContentItemGroup mapping, it will encounter the field parentGroup and will try to map it. This is again just another ContentItemGroup mapping, which contains childGroups, which it will try to map as well. When it maps the childs (of the parent) it will encounter the childItem again (the one we started with) and will find it in the field mappedFields of the MappingProcessor instance, because it already mapped this instance. As such it will use this instance and add it to the childs Set: BOEM.......
    Why BOEM?... Because the EntityBase mapping wasn't processed yet, so the id that is used in the hashCode wasn't set properly....
    Do you understand this?

    Note: of course there is a little chance that it will first process the EntityBase mapping, in which case your test will not fail... In that case, you are very lucky :)... But from the above you will understand that the order of processing is important.
    Anyway: just make sure that the ContentItemGroup mapping is processed first by making sure that its hashCode, returned from the mapping object, results in a lower table index (in the HashSet) than the EntityBase mapping. ....But let's not get there... Let's hope that the ContentItemGroup mapping appears in the Map (of the HashSet) before the EntityBase mapping....

    Let me know how it goes, please?

    -- Ed

    BTW: The DTOs can just be copies of the original classes. Just copy them and rename them, to make sure it's about the same as in my test case.

  • ed

    ed - 2008-11-19

    Improved version

  • dmitry (lv)

    dmitry (lv) - 2008-11-20

    Hi :)

    Let's go on.

    I have made a test, please make yourself familiar with that:

    public void testInheritanceDirection() {
      ContentItemGroupDefault parentItem = new ContentItemGroupDefault();
      parentItem.setId("A");

      ContentItemGroupDefault childItem = new ContentItemGroupDefault();
      childItem.setId("B");
      childItem.setParentGroup(parentItem);

      Set childGroups = new HashSet();
      childGroups.add(childItem);
      parentItem.setChildGroups(childGroups);

      ContentItemGroupDTO result = (ContentItemGroupDTO) mapper.map(parentItem, ContentItemGroupDTO.class);

      assertEquals("A", result.getId());
      assertEquals(1, result.getChildGroups().size());

      ContentItemGroupDTO childDTO = (ContentItemGroupDTO) result.getChildGroups().iterator().next();

      assertEquals("B", childDTO.getId());
      assertEquals(result, childDTO.getParentGroup());
    }

    And from the DTO:

    public Set getChildGroups() {
      return childGroups;
    }

    public void setChildGroups(Set childGroups) {
      this.childGroups = childGroups;
    }

    I have changed your code so there would be a setter for id too. But I tried it without a setter as well, with a public field. The test is passing. In my case EntityBase is processed first, but the superClasses Set contains both EntityBase and ContentItemGroup, you are right.

    Now I see that in theory ordering does matter and ordering will be different on different JVM versions. But still I would very much like to reproduce it.

    I have submitted InheritanceDirectionTest in Trunk, please take a look. I need to reproduce the problem before fixing.

  • ed

    ed - 2008-11-20

    Hi Dmitry,

    I looked at your test. A few notes:
    1) Copy the hashCode from EntityBase to EntityDTO.
    2) Make the setId in the EntityDTO protected.
    3) Map the child and NOT the parent, so it must be:
    ContentItemGroupDTO result = (ContentItemGroupDTO) mapper.map(childItem, ContentItemGroupDTO.class);

    4) Put your test in a loop, as the ContentItemGroup mapping will not always appear before the EntityBase mapping. That depends on the hashCode calculation (the Sun mystery).

    Anyway: I was a bit surprised that the test didn't fail, even when the ContentItemGroup mapping appeared before the EntityBase mapping. When I dug further I discovered that the id mapping is included in the ContentItemGroup mapping (not in my test). That is strange and I think incorrect, as it's already included in the EntityBase mapping, so you get a duplicate mapping!! That is, the id FieldMap is included in the ClassMap of ContentItemGroup. I don't understand how it gets there, as it wasn't in my own test.
    Anyway: I found out that it comes from the setId() in the EntityDTO. I don't really understand why this setId() results in an id mapping in the ContentItemGroup. You might want to have a look at that, as it smells. If you make the setId() protected it will disappear..

    Anyway: when I made the setId protected like indicated in note 2) I did get the bug (see below). Just sit and wait and run it several times. Like mentioned before, it highly depends on the hashCode assigned by the JVM. If you can't reproduce the bug, then just add an extra method to the ContentItemGroup or EntityBase/EntityDTO and save it. The JVM uses the class bytecode for its hash calculation, so this works if you are lucky....
    I also noticed that if I put a breakpoint in the hashCode method of EntityDTO it shows itself more often....so you might want to try that...

    Let me know if you see it?

    My EntityDTO:
    public class EntityDTO {

      private String id;

      public String getId() {
        return id;
      }

      protected void setId(String id) {
        this.id = id;
      }

      public int hashCode() {
        if (this.id == null) {
          throw new IllegalStateException("Dto Id not mapped yet: BOEM.");
        }
        return this.id.hashCode();
      }
    }

    Dest field type: net.sf.dozer.util.mapping.vo.direction.ContentItemGroupDTO
    java.lang.IllegalStateException: Dto Id not mapped yet: BOEM.
    at net.sf.dozer.util.mapping.vo.direction.EntityDTO.hashCode(EntityDTO.java:21)
    at java.util.HashMap.getEntry(HashMap.java:344)
    at java.util.HashMap.containsKey(HashMap.java:335)
    at java.util.HashSet.contains(HashSet.java:184)
    at org.apache.commons.collections.set.ListOrderedSet.add(ListOrderedSet.java:173)
    at net.sf.dozer.util.mapping.MappingProcessor.addToSet(MappingProcessor.java:665)
    at net.sf.dozer.util.mapping.MappingProcessor.mapCollection(MappingProcessor.java:513)
    at net.sf.dozer.util.mapping.MappingProcessor.mapOrRecurseObject(MappingProcessor.java:398)
    at net.sf.dozer.util.mapping.MappingProcessor.mapFromFieldMap(MappingProcessor.java:304)
    at net.sf.dozer.util.mapping.MappingProcessor.mapField(MappingProcessor.java:250)
    at net.sf.dozer.util.mapping.MappingProcessor.map(MappingProcessor.java:219)
    at ....................

  • dmitry (lv)

    dmitry (lv) - 2008-11-21

    Hi Ed,

    I have modified superClasses to be an ArrayList. I have also made all the adjustments to the test case that you proposed.

    Please take a look. Now the order is guaranteed and the test case is still passing.


  • ed

    ed - 2008-11-21

    Hi Dmitry,

    From your text below, I was afraid it would end up with just changing this :).
    But then I think you don't completely understand the problems described in this bug and the other bug that was mentioned, which is still there: 1865775.
    I am getting the impression that you are very busy or something like that, as it takes me a lotttt of "words" to make you understand what goes wrong.

    Don't get me wrong, I am very happy that Dozer has been picked up and isn't dead anymore. (I was debugging and patching Dozer myself for the last 2 years in the meantime.) And the Dozer code isn't very easy to understand (I don't like the code design).

    - Like mentioned before, the order of the super classes in the superClasses variable is important, which is still wrong in your code with the ArrayList.
    Why? Just add another super class that extends EntityBase and put something in it that depends on a variable from EntityBase, like the id.
    Because the method checkForSuperTypeMapping adds the super classes in the wrong order, it will always stay tricky and will fail depending on the dependency (the EntityChildBase will appear before the EntityBase and will be mapped first as such).
    You have to add them in the reverse order, which is actually what my patch does, nothing less, nothing more... that's it, and it's tested...

    - Bug 1865775 isn't solved with your solution.

    - Of course it's nice to be able to reproduce this mistake in a test, but that's not easy with this bug. But many things in Java are just common-sense thinking. For example: a thread-safe object is always hard to test, as you have to fire at least two threads at exactly the same time so that they interleave and you get rubbish... This is very hard to test, but is common sense if you look at it. Just like that, if you think about what I am writing in this post, it shouldn't be so hard to see that it's incorrect:
    1) Usage of a HashSet (order not guaranteed)
    2) Order in which the Collection is filled.
    ( 3) bug 1865775)

    - Why not use the patch code that I submitted? It's very easy to understand and it does very little, and so is its impact. At least it will save us both time. Dozer contains many tests, so you can check the code by reviewing it and then by testing it with the Dozer tests...

    good luck,

  • dmitry (lv)

    dmitry (lv) - 2008-11-21


    Bug 1865775 is fixed by my previous modifications, not this one, and I do have a test case to prove it. Switching to an ArrayList was done purely to make the test case 100% reproducible, because then we cannot blame Sun for the HashSet algorithm or whatever.

    From my practice I am pretty sure that if we cannot reproduce a bug and fix it just like that, it is a pretty bad thing. I think you will agree with me, because the same situation can happen again in the next release after refactoring, especially when there is more than one person working with the code.

    I very much appreciate your involvement, but I need to double-check the stuff. I understand what you are telling me, and I realize that even if my test case succeeds, it does not mean anything, because the order in a HashSet is undetermined and classes can have different hierarchies. I am not claiming that I fixed something by switching to an ArrayList. All I say is that now the problem is localized and can be reproduced in all cases, not just sometimes under a certain JDK version.

    I do also believe your patch fixes the problem and leaves all existing tests intact, but the point is that it is necessary to write an automated test before fixing something. Then I will look at your fix carefully and apply it, maybe with some modifications. Now I will try to spend some time to reproduce it myself, because I thought you had a good test case for it. I understand the direction in which to dig, so thank you for explaining all these things.

    And you are right, right now I am at work, doing some other things in parallel. So I was hoping that someone would provide me with a nice test case to analyze. I guess I will have some time on the weekend. So I am sorry if my replies seemed not so clear. Hope we can collaborate on forthcoming bugs in a more pragmatic fashion :)


  • ed

    ed - 2008-11-21

    Hi Dmitry,

    Thanks for the explanation, now I understand why you switched to an ArrayList (and that you already solved the other bug... damn, you're fast... :) )......
    Yep, you're right, it's a good thing, so that you aren't dependent on Sun's hashCode.

    My own test isn't usable, as it contains about 3 super classes and, because I use rich domain objects, they contain references like a Dao reference. Also, Dozer is hidden behind a general Dto converter... etc. So it's not a copy/paste thing..

    Still I think it's a good habit to map the root super classes first and then work your way down. Java also creates the super classes first and then the subclasses... and not the other way around, even though that would be correct in 90% of the cases.

    Anyway: I was thinking about how to make your test fail. Not so difficult on second thought.
    The mapping processor will first (because of the ArrayList) map the super classes and then the ContentItemGroup. Just shift the contained Set and one-to-one mapping in the ContentItemGroup one level up, to a new super class that is the parent of ContentItemGroup and extends EntityBase, that's it... Because of the wrong order it will first map the new super class EntityChildBase (and you get the same problem that we had without the ArrayList): it contains a parent EntityChildBase and a set with EntityChildBase objects, so you get the same problem when the set is mapped while the root class isn't mapped yet.

    It's the same problem, we just shifted it to a super class, as they are mapped first now....

    Do you understand this?
    Let me know how it goes and when the changes are in svn...?


    BTW: I have been working for about a year now, around 6/7 days a week, 12 hours a day (my girlfriend is going crazy). Why?... I (we) have a big delay in setting up our new company.

  • dmitry (lv)

    dmitry (lv) - 2008-11-21

    I have finally got it failing :)

    The test is in SVN. To fix it I do a Collections.reverse() at the end of the method. It looks to me like this fixes the problem. I have also noticed that the same approach is employed in ClassMapFinder.findInterfaceMappings(). Not exactly as in your patch, but it seems to be working.

    My guess is that we can close this ticket finally. What do you think?

    P.S. Guess I can not really help you with your company business in return, but if you have any idea please let me know.

  • ed

    ed - 2008-11-21

    Hi Dmitry,

    Good to hear.
    I thought about the reverse() method as well (when I was solving the bug), but I don't think it's the correct option. Of course it fixes your bug, but it costs unnecessary performance, which for me is important.

    The reverse() method should be used to reverse the order of a "received" List. I mean: you receive the list from some third-party component and then you want it in reverse order.
    You can't change the third-party code to put it in the correct order, so you have to do it like this (calling the reverse method).
    In our case it's different and you are abusing the method to solve the bug.

    This method shouldn't be used if you can do it correctly in the first place. That is: we have the option to store the items in the correct order, so let's do that then, or are you afraid of a recursive method :)... (I already did all the work for you ;) )

    Don't forget that there are developers out there that use Dozer to map dozens of objects with a few super classes. Such a reverse() call will certainly have an unnecessary, and unwanted, performance impact...
    Even though we are only talking about a few super classes, the total number of super class mappings can be the permutation of the DTO and non-DTO super classes.
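To make the difference concrete, here is a little sketch (the helper names and the toy hierarchy are made up for illustration, they are not Dozer's actual methods): building the list root-first directly via recursion versus appending leaf-first and reversing afterwards; both give the same order, the debate is only about the extra pass:

```java
import java.util.ArrayList;
import java.util.Collections;
import java.util.List;

public class SuperClassOrderSketch {

    // Toy hierarchy standing in for EntityBase -> ... -> ContentItemGroupDefault.
    static class Base {}
    static class Mid extends Base {}
    static class Leaf extends Mid {}

    // Recurse to the root first, so classes are appended
    // root-first and no reverse pass is needed.
    static void collectRootFirst(Class<?> cls, List<Class<?>> out) {
        Class<?> superClass = cls.getSuperclass();
        if (superClass != null && superClass != Object.class) {
            collectRootFirst(superClass, out); // root lands in the list first
        }
        out.add(cls);
    }

    // Append leaf-first while walking up, then flip the list once at the end.
    static List<Class<?>> collectThenReverse(Class<?> cls) {
        List<Class<?>> out = new ArrayList<>();
        for (Class<?> c = cls; c != null && c != Object.class; c = c.getSuperclass()) {
            out.add(c);
        }
        Collections.reverse(out);
        return out;
    }
}
```

Both produce [Base, Mid, Leaf] for Leaf; the reverse() variant just pays one extra O(n) pass per mapping, which is the cost being discussed here.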

    Just my 50 cents.... but I am happy that you constructed a test that reproduces the bug.
    My advice: use the patch!

    -- Ed

  • dmitry (lv)

    dmitry (lv) - 2008-11-21

    Then let's do the following: I am closing the bug and will start evaluating the performance.
    I am pretty sure that there are far bigger bottlenecks in Dozer than this particular call.

    The premature optimization stuff also comes to mind :)

    I will profile a couple of our applications that use the framework.
    Yes, I am afraid of recursion, because it is complex and should be avoided for such a simple task. I am thinking of caching Class metadata using DozerClass or something else. In that case we can simply go through a precalculated set of the hierarchy, sorted in the correct order, and no recursion is needed.
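Roughly what I have in mind, sketched in modern Java syntax (the class and method names below are just illustration, not real Dozer code): compute the hierarchy once per class with a simple loop, cache it root-first, and no recursion is needed on the hot path:

```java
import java.util.ArrayList;
import java.util.Collections;
import java.util.List;
import java.util.Map;
import java.util.concurrent.ConcurrentHashMap;

public class HierarchyCache {

    private static final Map<Class<?>, List<Class<?>>> CACHE = new ConcurrentHashMap<>();

    // Returns the super class chain root-first (Object excluded),
    // computed once per class without recursion.
    static List<Class<?>> hierarchyRootFirst(Class<?> cls) {
        return CACHE.computeIfAbsent(cls, c -> {
            List<Class<?>> chain = new ArrayList<>();
            for (Class<?> cur = c; cur != null && cur != Object.class; cur = cur.getSuperclass()) {
                chain.add(cur); // leaf first while walking up
            }
            Collections.reverse(chain); // one flip, at cache-fill time only
            return Collections.unmodifiableList(chain);
        });
    }
}
```

After the first call the reverse() cost disappears behind the cache, which addresses the performance concern without recursion.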

  • dmitry (lv)

    dmitry (lv) - 2008-11-21
    • milestone: --> Dozer v.4.3
    • assigned_to: nobody --> buzdin
    • status: open --> closed-fixed
  • ed

    ed - 2008-11-21

    Hi Dmitry,

    Too bad..... a strange way of solving things...

    I was afraid of recursion too, but have used it so much now that I love it... Probably I am just getting old ;)

    You can't avoid recursion if you want to walk through trees/graphs and stuff like that efficiently... Too bad you don't share the same experience, as it's very powerful (my basic patch code is only about 10 lines and works well).

    OK then, I will keep using my own version of Dozer (merged with our changes) for the moment, and will keep monitoring the Dozer release notes for performance improvements...

    Maybe it's an idea to let somebody else on the Dozer team, with more experience with recursion and such, have a look at this?
    Maybe Franz, the previous lead developer, as Dozer is currently full of recursion... I am curious how you're gonna deal with those...

    Good luck,

