Creating ".gim" file.

  • DOHWAN KIM

    DOHWAN KIM - 2016-02-16

    I'm studying the Geometry Images paper by Gu, and I'm handling 3D mesh files, especially STL files.
    Using your library I can get the image of a 3D mesh file, but not as an image file,
    and in particular I can't get a 2D .gim file. What I want is a 2D .gim file
    (or a file whose structure is an [n x n x 3] array with RGB data).
    I have some questions:
    1. How can I get a .gim file using this library?
    2. Is there a converter from any 3D mesh file (ply, stl, …) to a .gim file?
    I'm looking forward to your answer.
    Thank you for reading.
    Have a nice day.

     
  • LE PETITCORPS Yann

    I have just discovered the OpenGI library and have begun to analyse the gim.c file in the examples directory of the OpenGI source code.

    The geometry and normal images seem to be generated in the call

    int create_gim(unsigned int *gim, int res)
    

    where

    res is the resolution of the images, where width = height = res
    (the maximal resolution is 1024x1024, i.e. res <= 1024)

    gim[0] is the id of the geometry image (res x res x 4 float)

    gim[1] is the id of the normals image (res x res x 3 unsigned char)

    gim[2] is the id of the texture image (res x res x 3 unsigned char)
    

    The geometry image uses a float RGBA texture, so I think that the R, G, B and A values of each pixel map directly to the X, Y, Z and W coordinates of the vertices

    The normals image uses an RGB unsigned char format, so I think the RGB values given by this texture need to be "decompressed", giving negative values when < 128 and positive values when > 128
    (or the inverse; I haven't taken a close look at them yet)
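
    As a small illustration of that guess (my own sketch, assuming the usual byte-to-float mapping n = 2*c/255 - 1, which still has to be checked against the library source):

    /* Illustration only: decode one packed normal texel (RGB unsigned char)
     * back to a signed float vector, assuming n = 2*c/255 - 1. The geometry
     * image should need no such step, since its float RGBA texels would hold
     * the X, Y, Z, W values directly. */
    static void decode_normal(const unsigned char rgb[3], float n[3])
    {
        int i;
        for (i = 0; i < 3; ++i)
            n[i] = (float)rgb[i] / 255.0f * 2.0f - 1.0f;
    }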

    So, I think that we only have to save the content of the gim[0] texture to back up the geometry data and the content of the gim[1] texture to back up the normals data, with something like DevIL for example
    (the gim[2] texture handle seems to be used for the "true" texture data, i.e. the picture that we have to map onto the object)
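
    To make this concrete, here is a rough sketch of how those two textures could be read back and written to disk with OpenGL and DevIL; the save_gim helper, the file names and the raw float dump for the geometry image are my own choices, not code from gim.c, and it assumes the GL context created by the example is still current:

    #include <stdio.h>
    #include <stdlib.h>
    #include <GL/gl.h>
    #include <IL/il.h>

    /* Sketch: dump the geometry image (float RGBA) as a raw binary file and
     * the normals image (unsigned char RGB) as a PNG via DevIL. */
    static void save_gim(const unsigned int *gim, int res)
    {
        glPixelStorei(GL_PACK_ALIGNMENT, 1);

        /* geometry image: res x res x 4 floats, written as a raw dump
         * (not any standard .gim format) */
        float *geo = malloc((size_t)res * res * 4 * sizeof(float));
        glBindTexture(GL_TEXTURE_2D, gim[0]);
        glGetTexImage(GL_TEXTURE_2D, 0, GL_RGBA, GL_FLOAT, geo);
        FILE *f = fopen("geometry.gim", "wb");
        fwrite(geo, sizeof(float), (size_t)res * res * 4, f);
        fclose(f);
        free(geo);

        /* normals image: res x res x 3 unsigned chars, saved as an image file */
        unsigned char *nrm = malloc((size_t)res * res * 3);
        glBindTexture(GL_TEXTURE_2D, gim[1]);
        glGetTexImage(GL_TEXTURE_2D, 0, GL_RGB, GL_UNSIGNED_BYTE, nrm);
        ilInit();
        ILuint img;
        ilGenImages(1, &img);
        ilBindImage(img);
        ilTexImage(res, res, 1, 3, IL_RGB, IL_UNSIGNED_BYTE, nrm);
        ilEnable(IL_FILE_OVERWRITE);
        ilSaveImage("normals.png");
        ilDeleteImages(1, &img);
        free(nrm);
    }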

    I don't know whether texture coordinates can also be stored in another texture image like the geometry or normal images, but this doesn't seem technically impossible to handle: the X, Y, Z coordinates of vertex positions and normals are already handled by the library, so we would "only" have to handle the (U,V) texture coordinates using the same "interpolation" scheme as the position and normal coordinates that the OpenGI library already handles

    I will take a more detailed look at this library over the next few days to see how this could be handled "easily"

    I will also take a look at supporting other 3D file formats, such as Wavefront .OBJ or the Quake MD2/MD3 model formats to begin with, because the .ply2 format only seems to handle X, Y, Z coordinates and very basic shapes, using this very simple layout (a small reader sketch follows after the listing):

    numVertices
    numPolygons
    x coordinate of the first vertex
    y
    z
    x coordinate of the second vertex
    y
    z
    ...
    x coordinate of the last vertex (where last = numVertices)
    y
    z

    degree of the first polygon (= 3 for a triangle)
    v0
    v1
    v2
    degree of the second polygon
    v1
    v2
    v3
    ...
    degree of the last polygon
    v1
    v2
    v3

    => texture coordinates, color data and/or animations are not handled by this very basic 3D file format, which is used in the create_mesh(const char *filename) function that loads the 3D object to convert :(
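
    For what it's worth, a reader for the layout above can be as small as the following sketch (my own code, not part of OpenGI; it assumes the polygons are triangles, i.e. degree == 3, and does almost no error checking):

    #include <stdio.h>
    #include <stdlib.h>

    typedef struct { float x, y, z; } Vec3;

    /* Minimal .ply2 reader following the layout described above. */
    static int load_ply2(const char *filename, Vec3 **verts, unsigned int **tris,
                         int *num_verts, int *num_tris)
    {
        FILE *f = fopen(filename, "r");
        if (!f)
            return 0;
        if (fscanf(f, "%d %d", num_verts, num_tris) != 2) { fclose(f); return 0; }

        *verts = malloc((size_t)(*num_verts) * sizeof(Vec3));
        *tris  = malloc((size_t)(*num_tris) * 3 * sizeof(unsigned int));

        for (int i = 0; i < *num_verts; ++i)
            fscanf(f, "%f %f %f", &(*verts)[i].x, &(*verts)[i].y, &(*verts)[i].z);

        for (int i = 0; i < *num_tris; ++i) {
            int degree = 0;
            fscanf(f, "%d", &degree);            /* expected to be 3 */
            fscanf(f, "%u %u %u", &(*tris)[3*i], &(*tris)[3*i+1], &(*tris)[3*i+2]);
        }
        fclose(f);
        return 1;
    }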

    @+
    Yannoo

     

    Last edit: LE PETITCORPS Yann 2017-04-09
    • Christian Rau

      Christian Rau - 2017-04-13

      Hello,

      As explained in the OpenGI programming guide, you can store pretty much any
      arbitrary vertex attribute you like. So in order to store the texture
      coordinates as well, just add the corresponding code for that specific
      attribute. You can use the built-in facilities for normalizing the
      attributes (as done for the normals), or just store them as they are in a
      float image (as done for the positions). However, note that the actual
      texture coordinates in the example have been computed by the library's
      parameterization functions and are used to create the geometry images. They
      are thus trivial in the final geometry image space, as they represent the x
      and y coordinates in the generated images. That's why the example does not
      store them; this is one of the advantages of geometry images: the
      parameterization is implicit and trivial.
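
      In other words (a tiny illustration of this point, not code from the
      library), the implicit texture coordinate of a geometry-image pixel is
      just its normalized position in the image:

      /* Illustration: at resolution res, pixel (i, j) of the geometry image
       * parameterizes the surface at (u, v) = (i/(res-1), j/(res-1)),
       * ignoring the exact pixel-center convention. */
      static void implicit_uv(int i, int j, int res, float *u, float *v)
      {
          *u = (float)i / (float)(res - 1);
          *v = (float)j / (float)(res - 1);
      }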

      You might also want to take a look into the "OpenGI Programming Guide",
      which might help you get to know the library better than looking at the
      plain example code.

