Never mind... I had a bug in my setup (duh). Thanks anyway.
> -----Original Message-----
> From: Haskins, Gregory
> Sent: Thursday, January 15, 2004 5:05 PM
> To: 'user-mode-linux-user@...'
> Subject: Default signal handling in UML applications?
> Hi all,
> I am in the process of adding support for some error logging to the
> Linux kernel. I am using UML 2.4.22 to do the prototyping. I wrote a
> sample application that causes a crash, and I run UML in debug mode.
> I set a breakpoint in do_coredump(), but the system never gets there.
> Investigating, I found that the signal handler has been set to a
> non-default value (!= SIG_DFL), which explains why I am not hitting
> the default handler and therefore not executing the core-dump logic.
> Does UML do something funky with overriding default signals? The
> application I wrote is three lines long and certainly doesn't
> configure signal handling, so I wasn't sure what was happening. Is
> there some doc out there someone can point me to that explains UML
> signal handling?
> Thanks a bunch!