

STDERR issues with compiled .so log outputs

IcedSt, created 2012-10-24, last updated 2013-04-04
  • IcedSt
    2012-10-24

    I'm using Pydoop 0.64. When I run the script with Hadoop down, the compiled .so code produces STDERR output that Python doesn't/can't capture or silence. The best way I've found to silence the STDERR output is to follow http://stackoverflow.com/questions/977840/redirecting-fortran-called-via-f2py-output-in-python/978264#978264 . I don't want to mess with the module code, so I've created an error catcher using "pydoopcode.py 2>&1 1>/dev/null | python error_catcher.py". Is this OK? Will there be a way to capture and/or turn off the .so STDERR log output in the future?
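
    For reference, the trick in that answer works at the file-descriptor level, below Python's sys.stderr, which is why it also reaches output written directly by the compiled .so. A rough sketch of a capturing variant (capture_fd_stderr is my own name for it, not anything from Pydoop or the linked answer):

    import os
    import tempfile
    from contextlib import contextmanager

    @contextmanager
    def capture_fd_stderr():
        # Point fd 2 (the real STDERR) at a temp file so that output
        # from compiled extension code can be read back from Python.
        saved_fd = os.dup(2)
        tmp = tempfile.TemporaryFile(mode="w+b")
        os.dup2(tmp.fileno(), 2)
        try:
            yield tmp
        finally:
            os.dup2(saved_fd, 2)  # restore the original STDERR
            os.close(saved_fd)

    with capture_fd_stderr() as tmp:
        os.write(2, b"noise from native code\n")  # simulates a direct fd-2 write
    tmp.seek(0)
    captured = tmp.read()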

  • Simone Leo
    2012-10-25

    Hi,

    can you please expand on this? Post the whole error message as well as the command you're trying to execute.

    Also, why are you running Pydoop with Hadoop down? Why does STDERR junk bother you so much?

    Simone

  • IcedSt
    2013-03-22

    Sorry for taking so long to respond.

    I'm running the Python stuff in the background via crontab, and I want to catch ALL error output and redirect it wherever I choose to place it.

    This includes errors from my own code as well as Hadoop and Pydoop errors.

    As a workaround, I created a Python script that reads the errors from STDIN and pushes them into my logging scripts.

    Example crontab entry: python hadoop_script.py 2>&1 1>/dev/null | python error_catcher.py --log=logname
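
    Roughly, error_catcher.py does something along these lines (a simplified sketch; the real script hands lines off to my own logging setup, and only the --log=logname option comes from the entry above):

    import argparse
    import logging
    import sys

    parser = argparse.ArgumentParser()
    parser.add_argument("--log", required=True, help="log file to append to")
    args = parser.parse_args()

    logging.basicConfig(
        filename=args.log,
        level=logging.ERROR,
        format="%(asctime)s %(message)s",
    )

    # Everything piped in on STDIN is an error line; log each one.
    for line in sys.stdin:
        line = line.rstrip("\n")
        if line:
            logging.error(line)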

  • Luca Pireddu
    2013-03-22

    Maybe even simple shell redirection would have worked? For instance,

    python hadoop_script.py 2>> logfile 1>/dev/null
    
    Last edit: Luca Pireddu 2013-03-22
  • IcedSt
    2013-03-22

    That's a good idea, though it's not preferable in my case... I want to make sure that nothing gets munged in the event that I have more than a few processes writing to the same log file, so I'll let the Python module handle the unmunging for me.

  • Simone Leo
    2013-04-04

    Hello,

    take a look at the silent_call function here:

    https://sourceforge.net/p/pydoop/code/ci/9a4818934d1fa891a25b39abd4dd3b770d85ef9a/tree/test/utils.py

    I'm using it to silence Java-side messages in the Pydoop test suite. For instance:

    Standard:

    >>> import pydoop.hdfs as hdfs
    >>> fs = hdfs.hdfs("default", 0)
    13/04/04 12:33:06 WARN util.NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
    

    Silent:

    >>> import pydoop.hdfs as hdfs
    >>> from utils import silent_call
    >>> fs = silent_call(hdfs.hdfs, "default", 0)
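
    In case the link goes stale: the gist of such a helper is the same fd-level trick discussed above, applied around a single call. A minimal sketch under that assumption (not the actual test-suite code):

    import os

    def silent_call(func, *args, **kwargs):
        # Point fds 1 and 2 at /dev/null around the call so that output
        # written by native (C/Java) code is swallowed as well.
        saved_out, saved_err = os.dup(1), os.dup(2)
        devnull = os.open(os.devnull, os.O_WRONLY)
        try:
            os.dup2(devnull, 1)
            os.dup2(devnull, 2)
            return func(*args, **kwargs)
        finally:
            os.dup2(saved_out, 1)
            os.dup2(saved_err, 2)
            for fd in (devnull, saved_out, saved_err):
                os.close(fd)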