From: Roy S. <roy...@ic...> - 2007-02-15 16:30:26
On Thu, 15 Feb 2007, John Peterson wrote:

> These functions are O(1) "fast" (they just return the sizes of the
> underlying std::vectors of Nodes and Elements) but they are not always
> 100% correct.

I don't know - we already have the slightly confusing standard of
n_elem > n_active_elem because the former includes ancestor elements
and possibly subactive elements - why can't it include NULL elements
as well? ;-)

> An alternative is to replace these implementations with something
> similar to the n_active_elem() functions (defined in mesh_base.C)
> which use the Mesh iterators but are O(N) complexity.
>
> Another possible try would be to keep a "running total" of the number
> of nodes/elements that would be incremented/decremented by the
> Mesh::add_elem(), Mesh::delete_elem(), Mesh::add_point(), and
> Mesh::delete_node() functions. However, I see this as potentially
> error-prone as well.
>
> Thoughts?

O(N) is too expensive when it's in a function that often gets tested
after every iteration of a for loop.

I'd try to keep a running total, but then check it with the O(N)
algorithm when DEBUG is defined. The element and node vectors are
private, so the potential for adding bugs to the running total should
at least be limited to the Mesh class and friends.

This might also be a good time to go through the code and try to get
rid of any n_elem() calls we can. Just grepping through the code, I see
a few places where we ought to be using iterators instead.

---
Roy
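For what it's worth, here is a minimal sketch of the running-total idea: a cached count that add_elem()/delete_elem() keep in sync, backed by an O(N) recount that only runs when DEBUG is defined. This is not the actual libMesh implementation - the class, the Elem stand-in, and the debug_check() helper are all hypothetical illustrations of the scheme discussed above:

```cpp
#include <cassert>
#include <cstddef>
#include <vector>

// Hypothetical, stripped-down stand-in for the real classes; Elem is
// reduced to an opaque payload.  Only a sketch of the counting scheme.
struct Elem { int id; };

class Mesh
{
public:
  // O(1): return the cached running total rather than _elements.size(),
  // so deleted (NULL) slots are not counted.
  std::size_t n_elem() const { return _n_elem; }

  Elem* add_elem(Elem* e)
  {
    _elements.push_back(e);
    ++_n_elem;                  // keep the running total in sync
    debug_check();
    return e;
  }

  void delete_elem(std::size_t i)
  {
    delete _elements[i];
    _elements[i] = nullptr;     // leave a NULL hole in the vector
    --_n_elem;
    debug_check();
  }

  ~Mesh() { for (Elem* e : _elements) delete e; }

private:
  // O(N) recount over the element vector, skipping NULL entries.
  // Compiled in only when DEBUG is defined, to catch any drift
  // between the running total and reality.
  void debug_check() const
  {
#ifdef DEBUG
    std::size_t count = 0;
    for (const Elem* e : _elements)
      if (e != nullptr)
        ++count;
    assert(count == _n_elem);
#endif
  }

  std::vector<Elem*> _elements;
  std::size_t _n_elem = 0;
};
```

Since _elements and _n_elem are both private, the only code that can put them out of sync is the Mesh class itself (and friends), which is the containment argument above.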