I'm working with SDCC on a SiLabs C8051F120 and am currently running into the following message.
?ASlink-Error-Insufficient space in data memory. 8 bytes short.
If I comment out some of my code, it compiles just fine. Almost all of my variables are xdata variables, so internal RAM shouldn't be used very much at all for ordinary variables.
Also, it doesn't matter what kind of code I comment out. It can be printf calls, functions, or just plain blocks of code.
Also, I am currently using --model-large for both the compiler and the linker. That solved my problem for a while, until I started adding more code.
For those who are not familiar with this chip: it has 128 KB of flash memory for code, 8 KB of on-chip XRAM, and 256 bytes of internal RAM.
Are there other compiler options I can use to get rid of this message?
There might be. The problem is most likely too many spill locations (slocs); see the generated .map file. Whenever the compiler runs out of registers, it moves some values into slocs, which are placed in internal RAM (data memory) for speed. The linker tries to overlay them, but it is not very good at that. You may get better results with --stack-auto, which makes all functions reentrant and therefore puts all slocs on the stack (in idata). idata is larger than data, and using the stack guarantees optimal "overlaying", but it requires a register (r0 or r1) to access it (generating more slocs), produces larger code, and is slower.
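To make the sloc idea concrete, here is a hedged sketch (the function and its inputs are made up for illustration) of the kind of expression-heavy code that can exhaust the 8051's working registers and force SDCC to park intermediates in spill locations:

```c
/* Illustrative only: several intermediates are live at the same time,
 * more than fit in r0..r7, so an 8051 compiler like SDCC may move some
 * of them into spill locations (sloc0, sloc1, ...) in data memory. */
unsigned int mix(unsigned int a, unsigned int b, unsigned int c, unsigned int d)
{
    unsigned int t1 = a * b + c;
    unsigned int t2 = b * c + d;
    unsigned int t3 = c * d + a;
    unsigned int t4 = d * a + b;

    /* All four temporaries are still needed here, keeping them live. */
    return (t1 ^ t2) + (t3 ^ t4);
}
```

With --stack-auto (or a per-function reentrant declaration), such spills land on the stack instead of in the scarce data segment.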
Hope this helps,
Go to Project -> Tool Chain Integration and add
--model-medium to both the Compiler and Linker tab command line flags!
@Maarten: Is it OK to add the --stack-auto option only to some files (i.e. files containing functions where speed optimization is not the main goal)?
What do you think?
You cannot use --stack-auto on some files only. It influences the way functions outside the source file (including the libraries!) are called. You can however declare only some functions reentrant. Look in the .map file for functions that use much space in data memory. Those are good candidates.
You can also add storage classes (idata,pdata,xdata) to variables to move them outside the limited data memory.
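A minimal sketch of both suggestions, assuming a recent SDCC that spells its extension keywords with double underscores (older releases also accept the bare forms); the variable and function names are placeholders, and the stubs exist only so the snippet also compiles on a host compiler:

```c
/* Host-compiler stubs; under SDCC these keywords are real storage classes. */
#ifndef __SDCC
#define __data
#define __idata
#define __pdata
#define __xdata
#define __code const
#define __reentrant
#endif

__data  unsigned char tick;          /* direct-access internal RAM (128 bytes)   */
__idata unsigned char scratch[16];   /* indirect-access internal RAM (256 bytes) */
__pdata unsigned char page_buf[32];  /* first 256 bytes of external RAM          */
__xdata unsigned char rx_buf[512];   /* full external RAM (8 KB on a C8051F120)  */
__code  char banner[] = "boot";      /* constant placed in flash                 */

/* Only this function is declared reentrant, so its parameters, locals and
 * spill locations go on the stack while the rest of the project keeps the
 * default (faster, statically overlaid) calling convention. */
unsigned char sum16(__idata unsigned char *p, unsigned char n) __reentrant
{
    unsigned char i, s = 0;
    for (i = 0; i < n; i++)
        s += p[i];
    return s;
}
```

Moving large arrays from the default segment to __xdata is usually the quickest way to win back data memory on this chip.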
Yeah, I've tried --stack-auto and it does get the project to compile, but several of my functions quit working when I do that, especially functions dealing with pointers (string-parsing functions for UART stuff).
Because of that, I can't use the option until I find out what it is doing. Could this be because some of the includes need to be recompiled with the --stack-auto option?
Yes, definitely. If you want to use --stack-auto, you need to recompile your whole project with it, and on top of that you must recompile the libraries with the same option.
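As a rough sketch of what that means in practice (the file names are placeholders, and the exact procedure for rebuilding the bundled libraries depends on your SDCC installation and version):

```shell
# Every translation unit must see the same flags...
sdcc --model-large --stack-auto -c uart.c
sdcc --model-large --stack-auto -c parser.c
# ...including the final link step:
sdcc --model-large --stack-auto main.c uart.rel parser.rel
```

Mixing objects built with and without --stack-auto breaks the calling convention, which matches the pointer-related failures described above.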
I'm having this kind of problem on an mcs51-compatible EZ-USB device.
Compiling with sdcc version 2.3.0 I get no problems and everything runs fine, while compiling with version 2.4.0 (which reports itself as version 2.4.1...) ends up with a lack of memory...
Just read an email from Maarten saying to try --pack-iram - will try that in a minute.
You (Maarten) said there was a mistake in version 2.3.0 that used the same addresses for different variables - I even deactivated a 24-byte array, but the amount of missing data memory doesn't change. As I already mentioned, changing to the large model didn't change anything; using the --stack-auto option reduced the shortfall from 34 missing bytes to 15, but since it still wouldn't compile, that doesn't help much...
Thanx for any hints - for now I'm falling back to version 2.3.0 (even though there might be problems - as long as the firmware does what it's meant to do, I'm fine...)
--pack-iram is a great option - though I don't yet really understand what it does, it did the job...
-> works!