How to build the libraries

Anonymous
2012-01-06
2013-03-12
  • Anonymous - 2012-01-06

    Hi, I'd like to use a set of libraries compiled for the large memory model with auto stack.  I see that this isn't one of the precompiled options.  I also noticed that the SDCC manual states "(note: support libraries will need to be recompiled with the same options. There is a predefined target in the library makefile)."  But I didn't find the makefile.  I started to put one together myself, but it's a little slow and painful, and it seems a waste if the work is already done somewhere.  I'm using SDCC 3.0.0 for mingw32.  Anyone have an idea where the makefiles are located?  I can't seem to find them anywhere in my download.

    Thanks,

    Chris

     
  • Maarten Brock - 2012-01-06

    Hi Chris,

    You are aware that there is a new release 3.1.0, right?

    The makefile is in the source distribution of SDCC. Or actually there is a makefile.in that is converted into the makefile when you run './configure' on a *nix system.

    In the source distribution you will also find make51lib.bat but I must admit that I have never tried it. You can also find it here:
    http://sdcc.svn.sourceforge.net/viewvc/sdcc/trunk/sdcc/device/lib/make51lib.bat?view=log
    However, it seems it no longer works since we changed the default library distribution to ar format.

    I'm afraid there is no easy way to do this on Windows.

    But why do you want to use stack-auto with the large memory model? It will place all local (auto) variables on stack and all globals in xdata. Usually using the small memory model with judicious use of the __xdata keyword for some (large) global variables gives much better results.

    Maarten

     
  • Anonymous - 2012-01-11

    Maarten, I'm trying to get some examples to compile on SDCC that were originally written for IAR.  The linker runs out of space to place the variables.  This is the SDCC error with the small model and default stack: "?ASlink-Error-Could not get 1028 consecutive bytes in internal RAM for area DSEG."  I'm trying not to change their source code much, so I don't want to move variables around if I don't have to.

    From looking at the IAR mapfile it seems that they were using large mem map and auto stack:
    RUNTIME MODEL


      __calling_convention     = xdata_reentrant
      __code_model             = banked
      __core                   = plain
      __data_model             = large

     
  • Maarten Brock - 2012-01-11

    To match this setting you would have to use --model-huge to get all banked functions. Is the example really that big that it needs bank switching?

     
  • Anonymous - 2012-01-12

    It is a rather large example, but as to whether it's that big: how large can things get before you need banking?

    Is --model-huge a supported option?  And is --stack-auto the best choice?

     
  • Maarten Brock - 2012-01-12

    The mcs51 can access 64k of code memory. If you need more, bank switching is possible, but it is questionable whether it wouldn't be better to switch to another architecture in that case. Both the IAR banked code model and the SDCC huge model use bank switching. All other SDCC models can go up to 64k.

    Whether --stack-auto is best depends on your needs. Many programs on embedded systems can do without reentrancy. And although it is not C-standard compliant, SDCC by default uses fixed addresses for local variables to work around the limited stack capabilities of the mcs51.

     
