Hello All,
I hope I'm not missing something obvious here. Using the PIC16F88, I am able to set the clock speed to 1, 2, 4 and 8 MHz within the #chip directive in Great Cow Basic without any trouble. But so far, I've been unable to figure out how to get the lower speeds of 31.25 kHz through 500 kHz. Just what in the heck is the correct syntax for a clock frequency below 1 MHz?
Thanks!
It appears to me that the #chip directive can only handle 1, 2, 4 and 8 MHz for this chip. Unfortunately, specifying anything lower than 1 MHz doesn't generate an error message; it simply produces an incomplete program without warning.
That's my take…
I was hoping there would be a definitive answer to this, but as there isn't, here's my take.
The oscillator is set in the #CONFIG directive: OSC = LP (or XT, HS or INT).
So if OSC = LP, then the appropriate bits in the CONFIG register are set.
If OSC = INT, then the oscillator frequency from the #CHIP line is used to set the OSCCON register. At all other times the speed is merely used to calculate accurate times for WAIT. I doubt whether WAIT will work in LP mode (but I am subject to correction on this one).
If you want to be careful, you could set the CONFIG and OSCCON registers separately.
I hope y'all don't mind if I jump in here as well. I just started here and I am having problems understanding some of the basics. I wrote a program but I'm getting no response on the expected pins. I forget which troubleshooting file I was looking at, but it suggested a clock issue, and I would have to agree; the #chip and #config commands are throwing me for a curve. I am assuming it's a clock issue because my burning software is able to read and write to the chip with no apparent problems.
All I want is for an LED to flash on and off, just to show me that GCBASIC works, that my programmer is functioning, and that my chips aren't dead. Here is the code:
#chip 12F683, 1
DIR GPIO.5 out
start:
set GPIO.5 OFF
wait 1 sec
set GPIO.5 ON
wait 1 sec
goto start
This wasn't working, so I added #config OSC = (LP for 32 kHz, XT for medium gain, etc… INTOSCIO for more inputs). I feel like I'm getting close; I just don't understand specifically which config settings to use, or whether I am missing any other commands. I just started programming tonight, and the lack of in-depth and clear instructions at http://gcbasic.sourceforge.net/help/ is frustrating. The list of commands is nice, but in what order, and what is required?
My LED is flashing perfectly now. I used:
#chip 12F683, 4 (instead of 1, datasheet says 4 is default)
#config OSC = INTOSCIO (the last one I tried before, I changed the 4 above.)
(then my blinking loop)
My question now is: how do I get my chip to run at 32 kHz instead of 4 MHz? The datasheet says power consumption is 220 µA at 4 MHz/2.0 V, but only 11 µA at 32 kHz/2.0 V.
#config OSC = LP gives a compiler error.
From the Microchip datasheet:
The Oscillator module can be configured in one of eight clock modes.
1. EC – External clock with I/O on OSC2/CLKOUT.
2. LP – 32 kHz Low-Power Crystal mode.
3. XT – Medium Gain Crystal or Ceramic Resonator Oscillator mode.
4. HS – High Gain Crystal or Ceramic Resonator mode.
5. RC – External Resistor-Capacitor (RC) with FOSC/4 output on OSC2/CLKOUT.
6. RCIO – External Resistor-Capacitor (RC) with I/O on OSC2/CLKOUT.
7. INTOSC – Internal oscillator with FOSC/4 output on OSC2 and I/O on OSC1/CLKIN.
8. INTOSCIO – Internal oscillator with I/O on OSC1/CLKIN and OSC2/CLKOUT.
I had not realized that EC, LP, XT, and HS were external clock sources. No wonder my program was doing nothing when I specified LP for power savings. Severe lack of sleep will cause silly errors like that. No worries, I just placed an order for 10x 32.768 kHz crystal oscillators; I'll probably end up starting a new thread when they come in if I run into any problems, though.
So, to answer the OP's original question about specifying clock speeds lower than 1 MHz: it sounds like you have to look at the datasheet for your specific PIC and check the clock modes. For my 12F683, see the post above for the list of possible oscillator modes. If I want LP 32 kHz, then the code would be #config OSC = LP, and the chip will expect an external oscillator to be connected.
This doesn't address how to handle the delay-time calculation in the #chip 12F683, (freq) line though… For the time being, I suppose you could set it to 1, figure out the ratio between 1 MHz and the frequency you actually run your chip at, and scale any delay commands in the code itself?
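That ratio workaround is simple arithmetic. Here's a quick sketch in Python (the 1 MHz assumed clock and 32.768 kHz actual clock are just example values, and this assumes the compiler generates delays for whatever frequency the #chip line names):

```python
# If the compiler generates delay loops for an assumed clock (e.g. 1 MHz)
# but the chip actually runs slower, every delay stretches by the ratio
# assumed_hz / actual_hz.  To compensate, divide the delay you really want
# by that ratio before writing it into the source.

def compensated_delay_ms(desired_ms, assumed_hz, actual_hz):
    """Delay value to write in source so the real-world delay is desired_ms."""
    stretch = assumed_hz / actual_hz   # how much longer each loop really takes
    return desired_ms / stretch

# Compiler told "1 MHz" in #chip, chip really on a 32.768 kHz LP crystal:
print(compensated_delay_ms(1000, 1_000_000, 32_768))  # request ~32.8 ms -> ~1 s real
```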
I'm still brand new to programming PICs with GCB (<24 hours old), haha, so I could be completely wrong, but this is what I gather.
Hi everyone,
I'm the original poster, but now I have a username here. Thanks for the previous comments, but I think things have moved away from my first question. Here are some additional details that might clarify what I'm seeing.
My problem is this. I'm using the internal RC clock of the PIC16F88 and understand that various bits in the registers can change the speed of the clock. I have been able (using nothing more than GC Basic) to blink an LED on B.0 once a second using an internal RC clock running at 1, 2, 4 and 8 MHz. But, the PIC internal RC clock should also be capable of running at several other frequencies down to 31.25 kHz.
I realize that I can access these lower clock speeds by setting register bits directly in assembly language, but was hoping to do so in GC Basic, just for consistency's sake.
Here are some code snippets from my tests. In the first four, the GC Basic code is shown for a clock of 1, 2, 4 and 8 MHz, followed by the relevant assembler code that is generated. Everything seems to work okay, and the LED does flash once a second as expected.
In the last two snippets, I show what happens if I try to use a clock rate of either 0.5 or 1/2 (for half a MHz). The program neither works correctly, nor does the compiler give an error message.
I think it would be nice if GC Basic recognized the lower rates of 500, 250, 125 and 31.25 kHz in the #chip directive, just to stay consistent with the 1, 2, 4 and 8 MHz arguments which do work okay. Otherwise, how about an error message to indicate that the format just isn't acceptable?
What do you think? Could we get GC Basic to work properly with all eight values, and simultaneously have the WAIT command parameter properly set?
Thanks for listening,
Thomas Henry
This is a bit of a silly bug in GCBASIC, but shouldn't be too hard to fix. The #ifdef lines in the system.h file aren't working properly, but you should be able to make the compiler behave. Open up system.h, and find all the lines like this:
#IFDEF ChipMHz .5
Add a zero in front of the decimal point, like this:
#IFDEF ChipMHz 0.5
That should make it behave.
Note: It doesn't matter whether you have a 0 in front in the #chip line - "#chip 16F88, 0.5" and "#chip 16F88, .5" both get converted to the same thing inside the compiler, and the ChipMHz constant always gets a 0 in front. The bug is in the code that handles the #ifdefs, which doesn't know that 0.5 = .5 (yet).
This is not a very tidy solution (sorry!), but it should work for now and I will fix the comparison code soon!
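The note about 0.5 versus .5 boils down to comparing values as text instead of as numbers; a two-line Python illustration:

```python
# A textual #ifdef comparison sees ".5" and "0.5" as different strings,
# even though they are the same number.  Normalizing to a numeric type
# (or prefixing the zero, as suggested above) makes them match.

print(".5" == "0.5")                  # False: string comparison fails
print(float(".5") == float("0.5"))   # True: numeric comparison succeeds
```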
Hi again,
Thanks a lot for pointing me to the right place in "system.h." I tacked on the leading zeros and away it went!
While I was at it, I added a line to include the 31.25kHz clock speed as well. Now GC Basic can do all of the clock speeds available in the PIC16F88 sitting on my board. Presumably it will also handle other PICs equally well.
This is really cool! Thanks again,
Thomas Henry
Great result.
Would you post your source files and .h file? That would be very useful.
Thank you.
Hi again,
Well, okay, but really it's just a matter of tacking on a zero as mentioned above. Here's the code:
Hope that helps,
Thomas Henry
Thank you.
I did wonder how you specified 31.25 kHz, but now I know.
Why is there no OR statement in the #IFDEF ChipMHz 0.03125 condition? Are the AND and OR statements as per the datasheet?
I have just inspected the timing.
Simple code as follows:
#chip 16F88, 0.03125
#config Osc = INT, MCLRE_ON
DIR PORTB.1 out
start:
set PORTB.1 OFF
wait 1 sec
set PORTB.1 ON
wait 1 sec
goto start
Timing is correct for all clock speeds except 31.25 kHz, where the delay is 1.5 s. Any idea why?
Hi,
I'm brand new to PICs and GC Basic, so I'm still learning myself. But I can answer your first question: no OR command is needed for the 31.25 kHz clock setting because, according to the datasheet, all three IRCF bits need to be zero to select this speed. The AND command clears them and you're done. The other speeds require some of the IRCF bits to be set, hence the OR.
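That read-modify-write pattern can be shown with plain bit arithmetic. A sketch in Python (the IRCF bit values are from the 16F88 datasheet, where IRCF2:0 occupy OSCCON bits 6–4; the helper name and starting OSCCON value are made up for illustration):

```python
# On the PIC16F88 the internal-oscillator speed is chosen by the three
# IRCF bits, OSCCON<6:4>.  The generated code first ANDs OSCCON with a
# mask that clears those three bits, then ORs in the pattern for the
# desired speed.  For 31.25 kHz the pattern is 0b000, so the OR step
# would do nothing and can be omitted.

IRCF_MASK = 0b1000_1111          # clears bits 6..4, keeps everything else

IRCF = {                          # IRCF2:0 value for each speed (16F88)
    "31.25kHz": 0b000, "125kHz": 0b001, "250kHz": 0b010, "500kHz": 0b011,
    "1MHz": 0b100, "2MHz": 0b101, "4MHz": 0b110, "8MHz": 0b111,
}

def set_speed(osccon, speed):
    osccon &= IRCF_MASK           # AND: clear the IRCF bits
    pattern = IRCF[speed] << 4
    if pattern:                   # OR only needed when some bit is set
        osccon |= pattern
    return osccon

print(bin(set_speed(0b0110_0000, "31.25kHz")))  # IRCF cleared, no OR needed
```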
As for the timing in the WAIT command, I'm not too surprised. All that I had asked about was the clock speed, but understood that some of the commands that depend on this might also be affected. For example, I bet things like the USART might act funny too.
The relevant variables seem to be SysWaitTemp and DelayTemp, (two bytes each), but I'm unsure how the authors arrived at the correct values for these.
For the present, we should probably stay away from 31.25 kHz. But perhaps the authors will add this feature in officially at a later date. The other speeds seem to be completely supported presently.
Thomas Henry
The inaccurate delay is probably due to how GCBASIC generates the delay code. When you use the 1 second delay, GCBASIC calls the 1 ms delay 1000 times, but doesn't take into account the time spent repeatedly calling the ms delay. I'd guess that at a speed as low as 31.25 kHz, where a single instruction takes over 0.1 ms to run, the few instructions spent calling the 1 ms delay each time add up to the extra 0.5 seconds you're seeing (1.5 s total instead of 1 s).
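Putting rough numbers on that guess (the 4 overhead instructions per call is an assumption; the exact count depends on the generated code):

```python
# A mid-range PIC executes Fosc/4 instructions per second, so at
# Fosc = 31.25 kHz one instruction cycle lasts 128 microseconds.
FOSC = 31_250
instr_time = 4 / FOSC                  # seconds per instruction cycle

CALLS = 1000                           # "wait 1 sec" = 1000 calls of the ms delay
OVERHEAD_INSTRS = 4                    # call/return bookkeeping per call (assumed)

extra = CALLS * OVERHEAD_INSTRS * instr_time
print(f"{extra:.3f} s of unbudgeted overhead")   # ~0.5 s -> ~1.5 s total delay
```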
There are two different delay generation routines inside the compiler. One produces the 1 ms delay subroutine, and the other generates microsecond delays of the right length. Both use the clock speed to calculate how many instruction cycles on the PIC must be wasted to give the correct delay. The microsecond delay generator then produces a mix of "nop"s (1 cycle), "goto $+1"s (2 cycles, go to next instruction) and short delay loops (some multiple of 6 cycles) that will give the correct delay. The millisecond delay is a prewritten routine with two loops, one inside the other; each loop can be repeated up to 256 times. GCBASIC uses a brute-force approach here - it tries every possible repeat count for the inner and outer loop and uses whichever combination gives the most accurate delay. If you're curious, have a look at the PrepareBuiltIn function in preprocessor.bi to see exactly how this gets done.
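A toy version of that brute-force search, in Python. The cost model (3 cycles per inner iteration, 5 per outer) is an assumption for illustration, not GCBASIC's exact accounting; see PrepareBuiltIn in preprocessor.bi for the real thing:

```python
# Find inner/outer repeat counts for a two-level nested delay loop that
# best approximates a target cycle count, by trying every combination.

def best_loop_counts(target_cycles, cost_per_inner=3, cost_per_outer=5):
    """Brute-force search like GCBASIC's ms-delay tuner (cost model assumed)."""
    best = None
    for outer in range(1, 257):
        for inner in range(1, 257):
            cycles = outer * (inner * cost_per_inner + cost_per_outer)
            err = abs(cycles - target_cycles)
            if best is None or err < best[0]:
                best = (err, outer, inner, cycles)
    return best

# 1 ms at a 4 MHz clock = 1000 instruction cycles (Fosc/4 instruction rate)
err, outer, inner, cycles = best_loop_counts(1000)
print(outer, inner, cycles)
```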
A lot of the routines for communicating with external devices (LCD, serial, etc.) also change based on clock speed. Most use the GCBASIC Wait command, so as long as Wait is working, changing the clock speed has no impact. From memory it's just the hardware RS232 and a couple of other hardware communication routines (I2C and SPI, I think) that depend on clock speed. Their internal clock often comes from the system clock, divided by a number that GCBASIC calculates and sets as specified in the datasheet. If you are attempting serial communication, I'd recommend a clock speed much higher than 31.25 kHz anyway!
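To see why serial is hopeless at 31.25 kHz, consider the standard PIC16 USART baud-rate generator in low-speed asynchronous mode, where baud = Fosc / (64 × (SPBRG + 1)). A quick sketch (the helper is illustrative, not GCBASIC's actual code):

```python
# The hardware USART derives its baud rate from Fosc, so a very low
# system clock caps the achievable baud rate.  Low-speed async mode:
#     baud = Fosc / (64 * (SPBRG + 1)),  with SPBRG in 0..255

def spbrg_for(fosc, baud):
    """Return the nearest SPBRG value for a target baud, or None if unreachable."""
    divider = fosc / (64 * baud) - 1
    spbrg = round(divider)
    if spbrg < 0 or spbrg > 255:
        return None
    return spbrg

print(spbrg_for(4_000_000, 1200))   # SPBRG = 51 -> ~1202 baud, close enough
print(spbrg_for(31_250, 9600))      # None: max baud at 31.25 kHz is Fosc/64 = 488
```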
Thank you - it does explain things very clearly. I will have a hunt around PrepareBuiltIn.
Thank you (and everyone that I do not yet know) for a great tool.