[libdnet-devel] intf.get_dst() problem on OSX
Status: Abandoned
Brought to you by: dugsong
From: Darryl R. <da...@oe...> - 2006-01-18 03:03:15
Hey All,

Trying to write a program that is cross-platform. So far I've got it working fine on Win32. The next step is OSX. I'm having a problem with the intf.get_dst() call, though:

    Python 2.3.5 (#1, Mar 20 2005, 20:38:20)
    [GCC 3.3 20030304 (Apple Computer, Inc. build 1809)] on darwin
    Type "help", "copyright", "credits" or "license" for more information.
    >>> import dnet
    >>> it = dnet.intf()
    >>> it.get_dst(dnet.addr('0.0.0.0'))
    {'addr': 127.0.0.1/8, 'mtu': 16384, 'flags': 35, 'type': 24,
     'alias_addrs': [::1, fe80:1::1/64], 'name': 'lo0'}

As you can see, it's returning the loopback interface as the interface for the default route. This is definitely not correct; it should be en1. If I use a real IP address, then it returns the correct information.

Anyone have any ideas?

Regards
Darryl

--
Darryl Ross
Senior Network Engineer
OEG Australia
Email: da...@oe...
Phone: 61 8 81228361

If you want to live up to the whole "There is more than one way to do it" slogan, you have to give someone a swiss army chainsaw ...
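Since the poster notes that get_dst() returns correct information when given a real IP address, one possible workaround (not from the original thread, just a sketch) is to first ask the kernel for a concrete local address using a connected UDP socket, then pass a real address to dnet instead of 0.0.0.0. The probe address 8.8.8.8 and the dnet usage in the trailing comment are assumptions, not part of the original post:

```python
import socket

def probe_local_addr(probe_ip="8.8.8.8", probe_port=53):
    """Ask the kernel which local address would be used to reach
    probe_ip. A UDP connect() sends no packets; it only records
    the route, so this is safe even without traffic."""
    s = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    try:
        s.connect((probe_ip, probe_port))
        return s.getsockname()[0]
    finally:
        s.close()

# Hypothetical follow-up using libdnet's Python binding (assumes
# dnet is installed) -- this mirrors the "real IP address" case
# that the poster says returns the correct interface:
#
#   import dnet
#   iface = dnet.intf().get_dst(dnet.addr(probe_local_addr()))
#   # iface['name'] should then be the real default-route
#   # interface (en1 on the poster's machine), not lo0.
```

Probing with a loopback address should always return the loopback address itself, which makes the helper easy to sanity-check without network access.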