Download every 6th point in a dimension using ncks

Anil
2014-02-27
2014-03-05
  • Anil
    2014-02-27

    Hi NCO Users,

    I am trying my hand at downloading remote data from the NOMADS OPeNDAP server and storing the desired data in a local netCDF file. This works fine, and the download speed is quite good. The data are on a 1/12-degree lat/lon grid, and I want to download only every 6th record of lat and lon.

    For this I am using the command below:

    ncks -d time,1 -d lat,0,,6 -d lon,0,,6 -v u_velocity,v_velocity -p http://nomads.ncep.noaa.gov:9090/dods/rtofs/rtofs_global20140226/rtofs_glo_2ds_forecast_daily_prog testnoaa4.nc

    I can see the temp file testnoaa4.nc.temp being generated (only 2 KB), but no complete netCDF file even after waiting for 2 hours.

    I want to confirm: is there an error in the command I used, or is this a bug?

    Regards,
    Anil

     
    Last edit: Anil 2014-02-27
  • Charlie Zender
    2014-02-27

    Anil,
    The expected output is 15 GB:
    2*4*360*360*720*720/(36*1000000000)
    14.92992000000000000000
    Maybe your local system or the DAP server will not like this.
    Try building up from smaller requests until you understand the issues.
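    One way to follow this advice is to grow the hyperslab gradually. A dry-run sketch (the URL is the one from the original command; `echo` only prints each command, so remove it to actually issue the requests):

    ```shell
    # Print progressively larger strided requests against the same dataset.
    url=http://nomads.ncep.noaa.gov:9090/dods/rtofs/rtofs_global20140226/rtofs_glo_2ds_forecast_daily_prog
    for max in 10 100 1000; do
      echo ncks -O -d time,1 -d lat,0,$max,6 -d lon,0,$max,6 \
           -v u_velocity,v_velocity "$url" test_$max.nc
    done
    ```

    Once the small requests succeed quickly, the point where a larger one stalls tells you whether size or the server is the issue.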
    cz

     
  • Anil
    2014-03-04

    Charlie,

    If I use the command below to download time level 1 and the complete dimension extents for the variables u_velocity and v_velocity, NCO gives me a file of about 72 MB.

    ncks -d time,1 -v u_velocity,v_velocity -p http://nomads.ncep.noaa.gov:9090/dods/rtofs/rtofs_global20140226/rtofs_glo_2ds_forecast_daily_prog testnoaa4.nc

    As I already tried to explain, I want to download every 6th point of the lat and lon dimensions for both variables (to reduce the download size) using a stride. To my knowledge that is exactly what stride does, so the result should be much smaller than 72 MB. How would it become 15 GB?
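    For what it's worth, a back-of-the-envelope size check (a sketch; it assumes the grid sizes used elsewhere in this thread, lat = 2160 and lon = 4320, with 2 four-byte float variables and a single time level):

    ```shell
    # Bytes for one time level of u_velocity + v_velocity, full grid vs. stride 6.
    full=$((2 * 4 * 2160 * 4320))   # 2 vars x 4 bytes x lat x lon
    strided=$((full / 36))          # stride 6 in both lat and lon keeps 1/36 of the points
    echo "full:    $full bytes"     # 74649600, i.e. ~72 MB as reported
    echo "strided: $strided bytes"  # 2073600, i.e. ~2 MB
    ```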

    Will you please comment on this, and of course correct me if I am misunderstanding how stride works.

    Regards,
    Anil

     
    Last edit: Anil 2014-03-04
  • I can't get stride to work at all with this URL. It's a GrADS server -- is that important, I wonder?

    Without stride it works fine. This takes less than 10 seconds:

    rsignell@gam:~$ time ncks -O -D4 -d time,1 -d lev,0 -d lat,0,100 -d lon,0,100 -v u_velocity,v_velocity http://nomads.ncep.noaa.gov:9090/dods/rtofs/rtofs_global20140303/rtofs_glo_2ds_forecast_daily_prog foo.nc
    ncks: INFO nco_fl_mk_lcl() successfully accessed this file using the DAP protocol
    ncks: INFO nc__open() will request file buffer of default size
    ncks: INFO nc__open() opened file with buffer size = 0 bytes
    ncks: INFO nco_aed_prc() examining variable u_velocity
    ncks: INFO nco_aed_prc() examining variable v_velocity
    NCO version 4.2.5
    cvs_nm_sng nco-4_2_5
    cvs_mjr_vrs_sng 4
    cvs_mnr_vrs_sng 2
    cvs_pch_vrs_sng 5
    cvs_mjr_vrs 4
    cvs_mnr_vrs 2
    cvs_pch_vrs 5
    ncks: INFO nco_aed_prc() examining variable Global
    ncks: TIMER Metadata setup and file layout before main loop took 0.01 s
    ncks: INFO Moving foo.nc.pid9103.ncks.tmp to foo.nc...done
    ncks: TIMER Wallclock-elapsed time for command is 0.01 s

    But add a stride, and things just hang: no problems are reported, it just sits there waiting...

    rsignell@gam:~$ time ncks -O -D4 -d time,1 -d lev,0 -d lat,0,100,3 -d lon,0,100,3 -v u_velocity,v_velocity http://nomads.ncep.noaa.gov:9090/dods/rtofs/rtofs_global20140303/rtofs_glo_2ds_forecast_daily_prog foo.nc
    ncks: INFO nco_fl_mk_lcl() successfully accessed this file using the DAP protocol
    ncks: INFO nc__open() will request file buffer of default size
    ncks: INFO nc__open() opened file with buffer size = 0 bytes
    ncks: INFO nco_aed_prc() examining variable u_velocity
    ncks: INFO nco_aed_prc() examining variable v_velocity
    NCO version 4.2.5
    cvs_nm_sng nco-4_2_5
    cvs_mjr_vrs_sng 4
    cvs_mnr_vrs_sng 2
    cvs_pch_vrs_sng 5
    cvs_mjr_vrs 4
    cvs_mnr_vrs 2
    cvs_pch_vrs 5
    ncks: INFO nco_aed_prc() examining variable Global
    ncks: TIMER Metadata setup and file layout before main loop took 0.01 s
    [infinite waiting....]

     
  • Hmm, I tried accessing with stride using NetCDF-Java and had no problems. So I guess the GrADS OPeNDAP server is not the problem. Bizarre! Or are we overlooking something simple?

     
  • Anil
    2014-03-04

    Thanks Richard,
    Yes, without stride it works like a charm.

    I need stride to reduce the download size: I have to download a 9-day daily forecast, and downloading the two variables u_velocity and v_velocity over the complete dimensions is about 72 MB for 1 day, i.e., 72 x 9 = 648 MB/day, which is a waste of bandwidth. If I instead subset the 72 MB file already downloaded locally with the command

    ncks -d lat,0,2159,6 -d lon,0,4319,6 -o outfile.nc foo.nc

    the result is about 4 MB, which serves my purpose (but wasting roughly 19 GB of transfer a month on the full downloads is not something I can live with).
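    The ~4 MB figure is consistent with the hyperslab arithmetic (a sketch; `-d dim,min,max,stride` keeps indices min, min+stride, ... up to max):

    ```shell
    # Points kept by -d lat,0,2159,6 and -d lon,0,4319,6, and resulting data volume.
    lat=$(( (2159 - 0) / 6 + 1 ))    # 360 points
    lon=$(( (4319 - 0) / 6 + 1 ))    # 720 points
    bytes=$(( 2 * 4 * lat * lon ))   # 2 vars x 4-byte floats
    echo "$lat x $lon points, $bytes bytes"
    ```

    That is about 2 MB of variable data; the reported ~4 MB presumably also includes coordinate variables and file overhead.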

    So, what should I conclude -- is it a bug? If yes, is there a fix or an alternative solution?

    Regards,
    Anil

     
  • Charlie Zender
    2014-03-04

    I think this is a problem with an outdated NCEP server, not with NCO. I vaguely remember an issue with stride and DAP that was fixed some months/years ago. Here is an example of NCO working fine with stride arguments on another server:

    zender@roulee:~$ ncks -v Time -d Time,0,10,2 http://eosdap.hdfgroup.uiuc.edu:8080/opendap/data/NASAFILES/hdf5/BUV-Nimbus04_L3zm_v01-00-2012m0203t144121.h5
    Time: type NC_FLOAT, 1 dimension, 5 attributes, chunked? no, compressed? no, packed? no
    Time size (RAM) = 6*sizeof(NC_FLOAT) = 6*4 = 24 bytes
    Time dimension 0: Time, size = 6 NC_FLOAT (Coordinate is Time)
    Time attribute 0: units, size = 4 NC_CHAR, value = year
    Time attribute 1: long_name, size = 4 NC_CHAR, value = Time
    Time attribute 2: _FillValue, size = 1 NC_FLOAT, value = -9999
    Time attribute 3: origname, size = 4 NC_CHAR, value = Time
    Time attribute 4: fullnamepath, size = 17 NC_CHAR, value = /Data_Fields/Time

    Time[0]=1970.37 year
    Time[2]=1970.54 year
    Time[4]=1970.71 year
    Time[6]=1970.88 year
    Time[8]=1971.04 year
    Time[10]=1971.2 year

    zender@roulee:~$

    I don't know what else to say. DAP is supposed to make all this transparent to the NCO client---there's nothing we can do except point the finger...
    cz

     
    • Charlie,
      the Unidata ToolsUI application (based on NetCDF-Java) doesn't have any problem subsetting from this NCEP server, and I just tested netCDF4-python, which is based on Unidata's C library (just as NCO is), and it works fine:

      http://nbviewer.ipython.org/github/rsignell-usgs/notebook/blob/master/nomads_stride_test.ipynb

      So if this isn't an NCO bug, it would have to be a subtle server bug that only NCO seems to be tickling.

      -Rich

       
  • Charlie Zender
    2014-03-04

    google this "dap server stride bug" and follow the results...
    there have been multiple problems with DAP servers and stride through the years,
    some originally reported by NCO to Unidata (and fixed).
    the servers need to be updated to get the DAP fixes.
    these problems can affect most netCDF C-clients.
    not sure why this one does not affect netCDF4-Python.
    in any case, this is unlikely to be an NCO problem.

    cz

     
  • Charlie,
    Did you see that I also used a NetCDF C application, same as NCO, and it worked?
    What's the explanation for that? Different versions of the NetCDF C library?

    Thanks,
    Rich

     
  • My version of NCO that doesn't work is 4.2.5, but I don't know how to tell which version of the netCDF C library it was linked with.

    Charlie, if you have the latest and greatest NCO with the latest and greatest netCDF C library, can you try it out?

     
  • Charlie Zender
    2014-03-04

    use ncks --library to print the library version.
    the problem occurs with the current NCO snapshot on the current netCDF library.
    it's possible it's an NCO problem, yet the behavior is what one would expect (and what we have seen in the past) when it's a DAP server bug. i'd prefer that someone show there's a problem with a similar command with NCO accessing a known-to-be up-to-date server before looking into the root causes.
    best,
    cz

     
  • Charlie,

    Since NetCDF4-Python is working and NCO is not, I asked Jeff Whitaker if he was doing anything "special" to handle striding with DAP. His reply was: "netcdf4-python uses nc_get_vars for strided access, and nc_get_vara for everything else. Nothing special is done for DAP access (that's all internal to the C lib). I would think that NCO would do the same."

    I love NetCDF4-Python, but I think it's a shame to have to write a python program to extract strided data from NCEP when ncks should be able to do the job instead.

     
  • Charlie Zender
    2014-03-04

    Thank you, Rich. That's helpful. I think NCO uses nc_get_varm() for this. IIRC nc_get_varm() is more general than nc_get_vars(), and that may kill us on OPeNDAP (but not local) accesses. That is the type of difference that could explain this problem, so you've convinced me that there's a good chance NCO could be responsible. Will put it high on the list of things TODO. c

     
  • Charlie,
    Fantastic!
    Thanks for working through this with us.
    -Rich

     
  • Charlie Zender
    2014-03-04

    I've committed "the fix". Please try with the latest CVS snapshot if so inclined. Using nc_get_vars() instead of nc_get_varm() (with an empty mapping vector) solved the access time problem for strides over DAP for me.
    Will be in 4.4.3. Thank you very much Rich for asking and reporting Jeff's method. Never would have guessed it myself. Always assumed nc_get_varm() with an empty mapping vector would reduce to the same algorithm as nc_get_vars(). Glad to have that fixed. Could improve read/write times for many users! Best, cz

     
  • Anil
    2014-03-05

    Hi Charlie, Rich!!

    Good catch by Richard and nicely done by Charlie -- good job by both.
    Today I came to know that the issue has been fixed; this will help all NCO users.

    Charlie, I am happy "the fix" will be out with version 4.4.3, but I need it urgently to move ahead with my work. I tried to compile the code from CVS in Visual Studio 2010, but I got lots of errors and could not compile due to dependencies on netcdf.h and many more, and I cannot figure out the way forward for lack of expertise.

    Will you please do me a favor and provide the latest binaries for the Windows platform (both 32-bit and 64-bit, for a server machine)? Currently I have "nco-4.4.0.windows.mvs.exe" installed on my machine.

    It would be greatly appreciated.

    Regards,
    Anil

     
    Last edit: Anil 2014-03-05
  • I just saw on this page that you can build native Windows NCO binaries using Qt:
    http://nco.sourceforge.net/nco_qt_msvc.shtml
    Does it build with the open-source version of Qt (http://qt-project.org/doc/qt-4.8/install-win.html) or do you need the commercial version (http://qt.digia.com/)?

     
    Last edit: Richard P Signell 2014-03-05
  • Anil
    2014-03-05

    I would go for Open Source version.

     
  • Pedro Vicente
    2014-03-05

    Anil

    I uploaded Windows NCO binaries built with the current CVS.

    http://nco.sourceforge.net/#windows

    Pedro

     
  • Is there some dependency this needs? I downloaded the .exe and ran the installer to install into c:\programs\nco, but when I open a command prompt (cmd.exe), cd to c:\programs\nco, and run "ncks", I get "application error".

     
  • Anil
    2014-03-05

    Pedro

    After installing "nco-4.4.2.windows.mvs", I also got an error, but mine was about VCOMP100D.dll.

     
  • Pedro Vicente
    2014-03-05

    Anil

    Building with Visual Studio 2010 from source is a bit time consuming, because it requires building all the NCO dependency libraries as well.

    Here are the steps in case you want to do it.

    1) Obtain the sources of these libraries

    HDF5

    http://www.hdfgroup.org/

    Since HDF5 depends on ZLIB, you need that as well; it is available at the same location.

    netCDF

    http://www.unidata.ucar.edu/software/netcdf/

    ANTLR and GSL

    We have SVN repos for these, at

    svn co http://glace.ess.uci.edu/antlr-2.7.7
    svn co http://glace.ess.uci.edu/gsl-1.8

    Since you are on Windows, I highly recommend using TortoiseSVN, a graphical SVN client that integrates with Windows Explorer.

    To build HDF5 and netCDF, use the CMake build provided. Make it generate Visual Studio 2010 projects. Use static builds (no DLLs).
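    A configure line of roughly this shape, run from a build directory inside each library's source tree, should do it (a sketch; "Visual Studio 10" is CMake's generator name for Visual Studio 2010, and the exact options vary by library version):

    ```shell
    cmake .. -G "Visual Studio 10" -DBUILD_SHARED_LIBS=OFF
    ```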

    To build ANTLR and GSL, there are Visual Studio 2010 projects in the repos.

    2) Define these environment variables for the dependency libraries

    LIB_NETCDF
    LIB_DISPATCH
    LIB_NETCDF4
    LIB_HDF5
    LIB_HDF5_HL
    LIB_ZLIB
    LIB_SZIP
    LIB_GSL
    LIB_CURL
    LIB_ANTLR

    Each of them has to point to the path where the library was built in step 1)

    Example

    LIB_HDF5 defined as

    T:\hdf5\hdf5\proj\hdf5\Debug\hdf5d.lib
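    On Windows such variables can be set persistently from a command prompt with `setx` (a sketch; the netCDF path here is illustrative, only the HDF5 path follows the example above):

    ```shell
    setx LIB_HDF5 "T:\hdf5\hdf5\proj\hdf5\Debug\hdf5d.lib"
    setx LIB_NETCDF "T:\netcdf\build\liblib\Debug\netcdf.lib"
    ```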

    3) Define these environment variables for the dependency libraries' C header files

    HEADER_NETCDF
    HEADER_GSL
    HEADER_ANTLR

    Note that these are only 3 "direct" dependencies; NCO does not depend directly on HDF5, for example, since it makes no HDF5 API calls.

    Pedro

     
  • Pedro,
    But just to be clear, the NCO .exe you provided should just work, right? Or are there .dlls that we need to download?

     
  • Pedro Vicente
    2014-03-05

    Richard

    That error message is a first for me; what Windows version do you have? These binaries were built on Windows 7.

    Anil

    That error message is also a first for me.

    VCOMP100D.dll is the OpenMP runtime DLL (the trailing "D" indicates a debug build). OpenMP is not tested on Windows, but I enabled it in the build solution.
    You may try using the version of that DLL already on your system, if any.
    If that does not solve it, I'll build binaries without OpenMP.

    Pedro

     