Hello everyone. I have been silently following GCB for quite a while, and considering I haven't done much with microcontrollers in a while, I figured I would give it a shot. I have taken working code and ported it over to GCB (by hand, not copy and paste). Now, it does work, but I get erroneous results. This is what the program does, or what it should do: basically, take analog data once per n (be it minutes or seconds), transmit it via UART, and go to sleep. Eventually it will use wireless to do so, but for now I am using wired for testing.
The first thing I did was get Timer 1 to work: load TMR1H with 128, turn it on, and let it run. The Microchip datasheet says to do just that. So that works. Next up was analog. Well, I had an issue or two getting it to work. I am using an external voltage reference, so I had to set the bits manually (which wasn't a problem). Getting the UART to correctly spit out the raw data was a bit troublesome, but I figured out how to get it to work (it would spit out the ASCII code and not the actual binary value).
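(For reference, the Timer1 setup described here boils down to a couple of register writes. This is a sketch against the 18F13K22 register names, not tested code:)

```
TMR1H = 128     'preload high byte so the timer overflows after 32768 counts
TMR1L = 0
T1CON.0 = 1     'TMR1ON: turn Timer1 on and let it run
```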
Which leads up to today. I wrapped it all up, had it taking analog data once per second, and it works! I am using an MCP9700 temperature sensor and an LM4040 2.048V voltage reference with a PIC18F13K22. I am also using the built-in terminal in GCB. I have noticed that sometimes it doesn't pick up on ASCII codes 10 and 13.
So then I moved on to taking a reading once per minute. If it helps, I am using 2400 bps because it will be wireless eventually. Well, here is where things get weird. It worked last night, and I was getting the correct ADC code (decimal value 360), but when it transmitted ASCII codes 10 and 13, they would show up as 0's. I changed my code every which way, made it go back to once per second, and it seemed fine.
Until I changed the clock speeds. I figured, well, maybe it's not fast enough. I bumped the clock speed up to 8 MHz from 4, and then I started getting very wrong ADC results. I went back down to 4 MHz, and it's still not reading the ADC correctly, nor is it sending 10 and 13 correctly either. I have also noticed that there is a pause between each byte, a very long pause in fact. It will send the two ADC bytes, and then maybe 10 seconds later it will send a byte full of 0's, and then it will send another byte 10 seconds later. I don't know why this is.
I am at work right now, so I can't do much in terms of testing; however, I do have both versions of my code.
Notes: For this project the sensors and the Vref have external power. Right now these are hooked up via LEDs, but once in the field it will turn the power on and off eventually.
**To sum up: Since I changed clock speeds from 4 MHz to 8 MHz and then back to 4, the ADC does not read correctly. It also spits out UART characters when it's not supposed to. If I take a reading once per second, the ADC reads fine. But once I take a reading once per minute, it goes out the window.**
2017_11_17_A Code:
This code takes data once per second
So Would I still have to dim AnalogPort02 as word?
Yes, but to then access the high and low order bytes separately you need to assign the word variable to a byte variable (the assignment truncates away the high order byte), then read the high order byte with the _H postfix. I am not sure why there is no _L postfix to cut out the extra step, but the above works.
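A minimal sketch of that sequence (the variable names here are just examples):

```
Dim Reading As Word
Dim ReadingLow As Byte
Dim ReadingHigh As Byte

Reading = 360             'example 10-bit ADC value (0x0168)
ReadingLow = Reading      'word-to-byte assignment truncates to the low byte (104)
ReadingHigh = Reading_H   'the _H postfix reads the high order byte (1)
```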
If you just need the low byte of a larger variable, you can access that by writing [byte] before the variable name. For example, if you had a word or long variable called BigVar, this line would set the low byte only and leave the others untouched:
[byte]BigVar=24
(That's the closest I can offer to a _L postfix; it will do the thing you want, just with a bit more typing and a lot less breakage than renaming the low byte would cause.)
Last edit: Hugh Considine 2017-11-29
Update:
First, this is the code that I got working with 1 minute intervals:
Dim AnalogPort02 as Word
Dim AnalogPort02Low as Byte
Dim AnalogPort02High as Byte
------
code
-----
AnalogPort02 = ReadAD10(AN2)
Pause 1 'may not be needed
AnalogPort02Low = AnalogPort02     'truncating assignment keeps the low byte
AnalogPort02High = AnalogPort02_H  'high byte via the _H postfix
HSerSend AnalogPort02High  'These commands send out the raw bytes of data.
Do Until TRMT = 1
Loop
HSerSend AnalogPort02Low
Do Until TRMT = 1
Loop
HSerSend (13)
Do Until TRMT = 1
Loop
HSerSend (10)
Do Until TRMT = 1
Loop
It seems that waiting for all bits to transmit also fixes my erroneous "0"s. I want to make a suggestion: instead of polling the TXIF bit, we should poll the TRMT bit instead.
Suggestion: There should also be an easier way to get the high and low bytes without assigning another variable. Maybe something like "Word_H" and "Word_L".
Question: Is there an easier way to write that little sequence instead of writing it over and over again?
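One way to cut the repetition would be to wrap the send-and-wait pair in a Sub, along these lines (an untested sketch; SendByteWait is a made-up name):

```
Sub SendByteWait(In DataByte As Byte)
    HSerSend DataByte
    Do Until TRMT = 1    'wait until the transmit shift register is empty
    Loop
End Sub

'Then the main code becomes:
'SendByteWait AnalogPort02High
'SendByteWait AnalogPort02Low
'SendByteWait 13
'SendByteWait 10
```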
For those curious: these are the correct bytes I should be seeing (dec):
1 104 13 10
Last edit: Chris S 2017-11-30
If, for some chip-specific reason, TRMT works on this chip where TXIF (which works in the general case) does not, there is a simple workaround, and there is no need to create/update the existing libraries. Add to the code:
#define TXIF TRMT
This will map the bits.
But I do not think this is the issue. Have you looked at the errata for this chip?
Yes, the errata doesn't say much. But now I'm curious. Once I get home tonight I want to remove TX_BLOCKING and try my own loop that does this:
Do Until TXIF = 0
Loop
Yes, it will accomplish the same thing as the built-in function, but I am curious.
From the datasheet:
"The TXIF flag bit is not cleared immediately upon writing TXREG. TXIF becomes valid in the second instruction cycle following the write execution. Polling TXIF immediately following the TXREG write will return invalid result "
Sounds like I would need a few NOP cycles after sending each byte..
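If that reading of the datasheet is right, the per-byte loop would need a short delay before the first poll, something like this (a sketch, not verified on hardware):

```
HSerSend (13)
NOP              'TXIF is not valid until the second instruction
NOP              'cycle after the TXREG write, so wait it out
Do Until TXIF = 1
Loop
```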
To add to this
"To use interrupts when transmitting data, set the TXIE bit only when there is more data to send. Clear the TXIE interrupt enable bit upon writing the last character of the transmission to the TXREG."
So after I send my last char, I should probably also clear the TXIE interrupt, unless that is handled in the background.
I will be honest: I think you need to find the root cause of the issue. It looks to me that this is not the root cause, and you are treating the symptom.
Protocol analyser on serial. Why? Because I know that the terminal software throws away characters; try PuTTY. It is faster and handles the FIFO buffers better in my experience. The protocol analyser will prove or disprove that the microcontroller is sending the correct chars.
My thoughts.
Last edit: Anobium 2017-11-30
I don't own one of those. Do you recommend any particular one that is affordable? Would a Bus Pirate be acceptable for that? I do have an oscilloscope, but it might be hard to see things.
Do you have a PICkit 2 or PICkit 3? If so, you can use that as a simple logic analyzer. It would work perfectly at your speed of 2400 baud. I have used it many times to analyse serial signals, despite having a dedicated 16-channel logic analyzer, as the PICkit is easier to set up for simple tests.
Cheers
Chris
P.S. As for recommendations, if you want to buy one, get a Saleae Logic, https://www.saleae.com/, or a clone thereof.
All the bit-fiddling is completely unnecessary. The USART library works fine, as do ADC and Timer1.
First thing I would do is comment out all code except that necessary to send serial data to the terminal.
Something like this:
#chip 18F13K22, 16
#config LVP = ON
#config WDTEN = OFF
#option Explicit
#define USART_BAUD_RATE 2400
#Define USART_TX_BLOCKING
Dim Char as Byte
Do
    HSerPrint "Great Cow BASIC"
    HSerPrintCRLF
    HSerPrint "Serial Test"
    HSerPrintCRLF
    For Char = 48 to 90
        HSerSend Char
        HSerSend 32 ' space
    Next
    HSerPrintCRLF 2
    Wait 1 s
Loop
Does this print correctly on the screen? Yes/No
If no, then the most likely problem is poor grounding. All devices & power sources MUST share a common ground. Star grounding is a good practice when possible. If there are wires snaking around all over the board, then tidy them up and make all connections as short as possible. Check all ground connections... twice.
If that does not solve the serial problem, then remove EVERYTHING from the board except the PIC and the Serial Adapter. Then try again.
It is possible (but not likely) that you have a bad chip. Bad chips usually (but not always) draw more current than they should. If you connect power to the chip (with nothing connected) and it draws more than about 5 mA, then it is probably bad.
Try the code above, then give us feedback.
William
Last edit: William Roth 2017-11-30
This code does work, so I thank you for writing it. As for my code, I have a little more insight to offer. I noticed that moments after it is done sending the string of characters, the TX pin stays high. I couldn't get the logic analyzer on the PICkit2 to work, so these tests might be inconclusive at best, since I used my scope to look at things and judged it by the two LEDs I had. I will modify the above code to wait once per minute and see if I get any erroneous bits.
I don't want to admit this, but I couldn't get Timer 1 to work using the built-in commands. For my own personal use, though, I like to see what bits are on and off. It helps me understand what's going on better.
Here is complete code for a PIC16F1829 with ADC and with Timer1 used as a "Real Time Clock" by adding a 32.768 kHz crystal to the SOSC pins.
Your 18F13K22 Timer1 is slightly different. Nevertheless, this code may demonstrate a good way forward. It works.
#chip 16F1829, 16
#config LVP = ON
#config WDTEN = OFF
#option Explicit
#define USART_BAUD_RATE 2400
#Define USART_TX_BLOCKING

'Variables
Dim ADCVal as Word
Dim ADC_HighByte Alias ADCVal_H as Byte
Dim ADC_LowByte Alias ADCVal as Byte
Dim Secs as Byte
Dim Mins as Byte
Dim Hours as Byte
Dim OldSecs as Byte

#define LED PORTC.3
Dir LED Out

'// initialize/clear variables
ADCVal = 0
Hours = 0
Mins = 0
Secs = 0
OldSecs = Secs

'*** Connect 32.768KHz Crystal between Pins 2 & 3
'*** Place 7pf - 10pf Capacitor from each leg to gnd

'// Init Clock
SetTimer 1, 32768              '// Prescale for 1 sec
InitTimer1(ExtOsc, PS1_1)      'select Ext Crystal
Set T1CON.2 On                 '// do not sync to FOSC
StartTimer 1
On Interrupt Timer1Overflow Call SecondsCounter

'Config ADC to use 4.096V or 2.048V Fixed Voltage Reference
FVRInitialize(FVR_4x)
'FVRInitialize(FVR_2x)

' --- This is the MAIN LOOP ----
Do
    'Send serial Data once per second
    If Secs <> OldSecs then    '// Clock has incremented
        Show_Data
        ';Send_Raw_Data        'test with scope or Logic Analyzer
    End if
Loop
'''-------- End Main ---------------

Sub Show_Data
    '// Display some stuff
    'Displays relevant data on terminal for troubleshooting
    OldSecs = Secs             '// Reset OldSecs
    ADCVal = ReadAD10(AN6)
    'Show Whole Word
    HSerPrint "ADCVal = "
    HSerPrint ADCVal : HSerPrintCRLF
    'Show High Byte
    HSerPrint "ADC_Highbyte = "
    HSerPrint ADC_HighByte
    HSerPrintCRLF
    'Show Low Byte
    HSerPrint "ADC_Lowbyte = "
    HSerPrint ADC_LowByte
    HSerPrintCRLF
    HSerPrint "Seconds = "
    HSerPrint Secs : HSerPrintCRLF
    HSerPrint "Minutes = "
    HSerPrint Mins
    HSerPrint " "
End Sub

Sub Send_Raw_Data
    OldSecs = Secs
    ADCVal = ReadAD10(AN6)
    HSerSend ADC_HighByte
    HSerSend ADC_LowByte
    HSerSend 13                '// CR
    HSerSend 10                '// LF
End Sub

Sub SecondsCounter
    TMR1H = 0x80               '// Preload timer
    TMR1L = 1                  '// Tweak for accuracy
    Secs++                     '// Increment seconds
    PulseOut LED, 50 ms        '// Heartbeat shows timer working
    If Secs < 60 then goto ExitISR
    Secs = 0 : Mins++
    If Mins < 60 then goto ExitISR
    Mins = 0 : Hours++
    If Hours > 23 then Hours = 0
ExitISR:
End Sub
Last edit: William Roth 2017-12-01
Thank you for this. Since I am using an external voltage reference, is there a built-in command to set it up instead of setting the bits manually? I don't have a problem setting the bits manually.
There is no GCB command that I am aware of to set the PIC Vref to external.
To set Vref+ to external VREF+ on the 18F13K22:
ADCON1.2 = 1
(This assumes that the ext reference shares a common ground with the PIC.)
The external Vref+ is on Pin 18 (RA1).
*** The ext VREF pin(s) are also used for programming, so it is IMPERATIVE that the programmer be physically disconnected in order to get accurate/consistent ADC reads.
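In GCB source, that setup might look like this (a sketch based on the register description above; PVCFG0 is bit 2 of ADCON1 on this chip):

```
Dir PORTA.1 In   'RA1 / VREF+ pin as input
ADCON1.2 = 1     'PVCFG0 = 1: take the ADC positive reference from the VREF+ pin
```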
Will, so that code you posted does work, but only if you are sending data once every second. Modify it to run once per minute and I get junk data. I DO get the ASCII representation of what the ADC is reading (360), so the ADC does work, but for some reason the high and low bytes just come out as 0's. To reiterate: I do get the correct data in those registers if I am doing it once per second, but NOT once per minute.
Logically, yeah, I could work with the ASCII codes and put them together in some manner with a lookup table on the receiving end. Until then, though, I am going to buy a logic analyzer and use that. Maybe the terminal is freaking out for some reason. I could see if the hackerspace has one.
Yep, a logic analyzer might help see what is going on.
BTW, I have absolutely no problems with data corruption here, whether the data is sent every 10 msec, 1 min, 10 min, or even every hour. So my guess is that you have either a hardware problem or a connection problem. What USB/TTL device are you using?
"In rare situations, one or more extra zero bytes have been observed in a packet transmitted by the module operating in Asynchronous mode. The actual data is not lost or corrupted; only unwanted (extra) zero bytes are observed in the packet. This situation has only been observed when the contents of the transmit buffer, TXREGx, are transferred to the TSR during the transmission of a Stop bit. For this to occur, three things must happen in the same instruction cycle:
• TXREGx is written to;
• the baud rate counter overflows (at the end of the bit period); and
• a Stop bit is being transmitted (shifted out of TSR)."
Even though the errata doesn't say this, it is what I am observing. Using TRMT fixes this. However, I will be trying a different PIC to make sure it's just this series (13K22). I think I have a 16F lying around somewhere and another 18F. I don't mind using the TRMT bit, to be honest; it's not like I need to go fast. I just want to help GCB out, and others if they have the same issue :).
Q: Is there a difference between HSerSend and HSerPrint? Do I need to use parentheses around a byte or word? I was slightly confused by this.
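(As I understand the GCB serial commands: HSerSend transmits a single raw byte, while HSerPrint converts a value to its ASCII text representation, so the two put different bytes on the wire. A quick illustration of the difference:)

```
Dim Value As Byte
Value = 65
HSerSend Value    'sends one raw byte, 0x41 (a terminal shows "A")
HSerPrint Value   'sends the text "65" (two bytes: 0x36 0x35)
```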
2017_11_28 Code:
This takes data once per minute
Last edit: Chris S 2017-11-29
Hello, what version of the compiler please?
The one thing I forgot to list: v0.98.00
With just a quick look, I can see that you are reading the ADC into AnalogPort02 but are printing AnalogPort02High and AnalogPort02Low.
Have you declared those?
What I think you meant to use is:
I have not looked at the rest yet, but did notice that potential error, so thought I should mention it.
Cheers
Chris
Would this code then be possible?
In theory, as HSerSend takes a Byte argument, that should print the High Byte then the Low Byte.
Cheers
Chris
Pseudo code:
Granted, it won't be nice looking, but maybe it will work so I don't have to poll TRMT.
Forgot about the PICkit2. I'll try that when I get home. I'm going to keep this thread open and update as needed. Thanks!
Those cheap CP2102 USB-to-UART adapters from eBay. I have a few of them on hand, so I might try another.
I think I have found a solution to my issues per another post:
http://www.microchip.com/forums/m608851.aspx