
Pydoop build fails on CentOS 6.2

Started by Berg on 2013-12-04; last post 2014-02-07
  • Berg

    Berg - 2013-12-04

    I've been unable to build Pydoop on my CentOS 6.2 machine via any method (pip, easy_install, or download and build). The version I am trying to install is 0.10.0. There are many build errors related to include files, along the lines of:

    In file included from /usr/include/python2.6/Python.h:42,
    from /usr/include/boost/python/detail/wrap_python.hpp:142,
    from /usr/include/boost/python/detail/prefix.hpp:13,
    from /usr/include/boost/python/args.hpp:8,
    from /usr/include/boost/python.hpp:11,
    from src/hdfs_common.hpp:31,
    from src/hdfs_fs.hpp:24,
    from src/hdfs_fs.cpp:21:
    /usr/include/stdlib.h:503: error: ‘size_t’ was not declared in this scope
    /usr/include/stdlib.h:503: error: expected ‘,’ or ‘;’ before ‘throw’
    /usr/include/stdlib.h:508: error: ‘size_t’ has not been declared

    and on and on for pages...

    Has anyone else encountered this and conquered it?

    P.S. See blog.cloudera.com/blog/2013/01/a-guide-to-python-frameworks-for-hadoop/ for another user with similar build issues.
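
    One way to narrow this down (a guess based on the include chain in the error messages, not a known fix) is to compile a minimal file that pulls in boost/python.hpp outside of Pydoop's build. If the same size_t errors appear, the problem lies in the system's gcc/Boost/Python header combination rather than in Pydoop itself:

    ```shell
    # Hypothetical sanity check: see whether the Boost.Python headers alone
    # reproduce the stdlib.h 'size_t' errors, independent of Pydoop's setup.py.
    cat > /tmp/bp_check.cpp <<'EOF'
    #include <boost/python.hpp>
    int main() { return 0; }
    EOF
    g++ -I/usr/include/python2.6 -c /tmp/bp_check.cpp -o /tmp/bp_check.o \
      && echo "Boost.Python headers compile cleanly" \
      || echo "same failure reproduced outside Pydoop"
    ```

    If this fails with the same errors, a mismatch between the installed gcc, boost-devel, and python-devel packages is the likelier culprit than Pydoop's own sources.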

     
  • Mauro Del Rio

    Mauro Del Rio - 2013-12-05

    Hi, which version of Hadoop are you running?

     

    Last edit: Simone Leo 2013-12-05
  • Berg

    Berg - 2013-12-05

    Cloudera CDH 4.4

     
  • Mauro Del Rio

    Mauro Del Rio - 2013-12-06

    Sorry, but at the moment Pydoop only supports CDH 4.2 and 4.3.

     
  • Berg

    Berg - 2013-12-11

    Interesting. We get the same errors on our CDH 4.3 cluster.

     
  • Mauro Del Rio

    Mauro Del Rio - 2013-12-12

    That's strange. Did you install CDH from packages? Installation from a tarball is not supported.
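
    A quick way to check which kind of install you have (assuming an RPM-based CDH 4 setup; package names here are illustrative):

    ```shell
    # On a package-based CDH 4 install these queries list the installed Hadoop
    # RPMs; on a tarball install they return nothing.
    rpm -qa 2>/dev/null | grep -i '^hadoop' \
      || echo "no hadoop packages found (tarball install?)"
    hadoop version 2>/dev/null || echo "hadoop CLI not on PATH"
    ```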

     
  • AJ Rader

    AJ Rader - 2014-02-06

    Why are only older versions of CDH 4.x supported? Is there any way to get it to work with CDH 4.5? I have Hadoop 2.0.0-cdh4.5.0 and am trying to install Pydoop 0.11.1.
    The ensuing errors are:

    src/hadoop-2.0.0-cdh4.5.0.patched/pipes/impl/HadoopPipes.cc:39:26: error: openssl/hmac.h: No such file or directory
    src/hadoop-2.0.0-cdh4.5.0.patched/pipes/impl/HadoopPipes.cc:40:28: error: openssl/buffer.h: No such file or directory
    src/hadoop-2.0.0-cdh4.5.0.patched/pipes/impl/HadoopPipes.cc: In member function ‘std::string HadoopPipes::BinaryProtocol::createDigest(std::string&, std::string&)’:
    src/hadoop-2.0.0-cdh4.5.0.patched/pipes/impl/HadoopPipes.cc:423: error: ‘HMAC_CTX’ was not declared in this scope
    src/hadoop-2.0.0-cdh4.5.0.patched/pipes/impl/HadoopPipes.cc:423: error: expected ‘;’ before ‘ctx’
    src/hadoop-2.0.0-cdh4.5.0.patched/pipes/impl/HadoopPipes.cc:424: error: ‘EVP_MAX_MD_SIZE’ was not declared in this scope
    src/hadoop-2.0.0-cdh4.5.0.patched/pipes/impl/HadoopPipes.cc:425: error: ‘ctx’ was not declared in this scope
    src/hadoop-2.0.0-cdh4.5.0.patched/pipes/impl/HadoopPipes.cc:426: error: ‘EVP_sha1’ was not declared in this scope
    src/hadoop-2.0.0-cdh4.5.0.patched/pipes/impl/HadoopPipes.cc:426: error: ‘HMAC_Init’ was not declared in this scope
    src/hadoop-2.0.0-cdh4.5.0.patched/pipes/impl/HadoopPipes.cc:427: error: ‘HMAC_Update’ was not declared in this scope
    src/hadoop-2.0.0-cdh4.5.0.patched/pipes/impl/HadoopPipes.cc:429: error: ‘digest’ was not declared in this scope
    src/hadoop-2.0.0-cdh4.5.0.patched/pipes/impl/HadoopPipes.cc:429: error: ‘HMAC_Final’ was not declared in this scope
    src/hadoop-2.0.0-cdh4.5.0.patched/pipes/impl/HadoopPipes.cc:430: error: ‘HMAC_cleanup’ was not declared in this scope
    src/hadoop-2.0.0-cdh4.5.0.patched/pipes/impl/HadoopPipes.cc:433: error: ‘BIO’ was not declared in this scope
    src/hadoop-2.0.0-cdh4.5.0.patched/pipes/impl/HadoopPipes.cc:433: error: ‘bmem’ was not declared in this scope
    src/hadoop-2.0.0-cdh4.5.0.patched/pipes/impl/HadoopPipes.cc:433: error: ‘b64’ was not declared in this scope
    src/hadoop-2.0.0-cdh4.5.0.patched/pipes/impl/HadoopPipes.cc:434: error: ‘BUF_MEM’ was not declared in this scope
    src/hadoop-2.0.0-cdh4.5.0.patched/pipes/impl/HadoopPipes.cc:434: error: ‘bptr’ was not declared in this scope
    src/hadoop-2.0.0-cdh4.5.0.patched/pipes/impl/HadoopPipes.cc:436: error: ‘BIO_f_base64’ was not declared in this scope
    src/hadoop-2.0.0-cdh4.5.0.patched/pipes/impl/HadoopPipes.cc:436: error: ‘BIO_new’ was not declared in this scope
    src/hadoop-2.0.0-cdh4.5.0.patched/pipes/impl/HadoopPipes.cc:437: error: ‘BIO_s_mem’ was not declared in this scope
    src/hadoop-2.0.0-cdh4.5.0.patched/pipes/impl/HadoopPipes.cc:438: error: ‘BIO_push’ was not declared in this scope
    src/hadoop-2.0.0-cdh4.5.0.patched/pipes/impl/HadoopPipes.cc:439: error: ‘BIO_write’ was not declared in this scope
    src/hadoop-2.0.0-cdh4.5.0.patched/pipes/impl/HadoopPipes.cc:440: error: ‘BIO_flush’ was not declared in this scope
    src/hadoop-2.0.0-cdh4.5.0.patched/pipes/impl/HadoopPipes.cc:441: error: ‘BIO_get_mem_ptr’ was not declared in this scope
    src/hadoop-2.0.0-cdh4.5.0.patched/pipes/impl/HadoopPipes.cc:444: error: ‘digestBuffer’ was not declared in this scope
    src/hadoop-2.0.0-cdh4.5.0.patched/pipes/impl/HadoopPipes.cc:446: error: ‘BIO_free_all’ was not declared in this scope
    error: command 'gcc' failed with exit status 1
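
    The missing openssl/hmac.h and openssl/buffer.h headers at the top of that log suggest the OpenSSL development headers are absent; on CentOS they ship separately from the runtime library. Installing them before retrying may clear at least this batch of errors (an inference from the missing-header messages, not an official Pydoop instruction):

    ```shell
    # HadoopPipes.cc includes <openssl/hmac.h> and friends; on CentOS/RHEL
    # those headers come from the openssl-devel package, which is not
    # installed by default alongside the base openssl package.
    sudo yum install -y openssl-devel
    # then retry the build, e.g.:
    pip install pydoop==0.11.1
    ```

    This only addresses the missing-header errors; the version-support question above still stands.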

     
  • Mauro Del Rio

    Mauro Del Rio - 2014-02-07

    Supporting new versions of Hadoop/CDH can be a non-trivial task, since we have to patch code that may have changed in the meantime. Recently we have been working on supporting Hadoop 2.2.0, so CDH has been left aside for a while.
    Now we have time to work on CDH 4.4/4.5, so they will be supported as soon as possible.

     
