
WAIT: Compiler Help/Challenge - Can you assist with the development of Great Cow BASIC?

Anobium
2021-10-06
2021-11-15
  • Anobium

    Anobium - 2021-10-06

    RESOLVED IN THE COMPILER - 31k delays using WAIT are now acceptable.

    I am looking for help to update a specific functional area of the compiler.

    Delays (time) at 31k clock frequency.

    Currently, the compiler calculates the delays very well for a very wide range of frequencies. However, for 31k and other low frequencies the delays are not correct.

    The task is to make the delay timings work as expected for these slower frequencies.

    Ping me if you have the time to help.

    Evan


    The tools and software are all available and I can help you create the development environment. All very simple.


    The challenge is to resolve this... the delay(s) are not calculated correctly. For example, a 10 ms wait is being calculated as 16.40 ms (see the code below). :-)

    #CHIP 16f18855,31k
    
    #DEFINE TESTPORT portb.5
    Dir TESTPORT OUT
    
    Do
        TESTPORT = !TESTPORT
        Wait 10 Ms
    Loop
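
    For context, a rough back-of-envelope (assuming the usual PIC arrangement of one instruction cycle per four clock cycles; the overhead figure is my own estimate, not taken from the compiler output):

        31 kHz / 4 = 7750 instruction cycles per second, so ~129 us per instruction
        10 ms of delay = only ~77 instruction cycles in total
        16.40 ms - 10 ms = ~6.4 ms of error = ~50 instructions of fixed call/loop overhead

    At MHz clock rates that same overhead is a few microseconds and vanishes into rounding; at 31k it can dominate the requested delay.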
    
     

    Last edit: Anobium 2021-11-15
  • stan cartwright

    stan cartwright - 2021-10-06

    Does this apply to the millis function?

    #CHIP 16f18855,31k
    
    #DEFINE TESTPORT portb.5
    Dir TESTPORT OUT
    
    Do
        TESTPORT = !TESTPORT
        Millistemp = millis
        Do Until millis - Millistemp = 10
        Loop
    Loop
    
     
    • Anobium

      Anobium - 2021-10-06

      @Stan. Let us assume for a moment the program would work if it has the include, etc.

      The millis() library, developed by Chris Roper, would currently issue an error, as the library only supports 1, 2, 4, 8, 16, 32 & 64 MHz as the clock frequency.

      So, the answer to the question is 'no', as the library would fail with an error message.

       
  • stan cartwright

    stan cartwright - 2021-10-06

    Well, this is beyond my capabilities, but I would have liked to help.
    I use #chip 18f25k22,64 or #chip mega328p,16 which afaik is their max speed.
    So could I use #chip 18f25k22,31k? Notice the k? Why the k?
    Do people do this often or is it just a niggle that it doesn't work in gcb?
    Dividing or multiplying the clock by 2 seems sufficient for most users. What other PIC BASIC compilers let you change the clock speed to maybe 31.6... if they had floats?

    I liked @Chris Roper's millis and his 'do every' idea that never happened :(
    Getting it to work on all gcb chips was too difficult.
    I got a 328 timer0 interrupt working that is like 'do every'.. handy.

    I wouldn't lose sleep over this clock issue, gcb is still brill.

    Edit: I looked up the pic and it's
    • Operating Speed:
    - DC – 32 MHz clock input
    - 125 ns minimum instruction cycle
    Doesn't look like a good spec chip. RAM etc.

     

    Last edit: stan cartwright 2021-10-06
    • Anobium

      Anobium - 2021-10-06

      Off piste Stan. Great question. Open another thread. I need to get the developers to read the initial message.

       
  • stan cartwright

    stan cartwright - 2021-10-06

    Do the interrupts or Hpwm change with clock frequency?

     
    • Anobium

      Anobium - 2021-10-06

      Off piste Stan. Great question. Open another thread. I need to get the developers to read the initial message.

       
  • stan cartwright

    stan cartwright - 2021-10-06

    I know how to use wait, but not how it's derived.
    When I first used gcb I asked where the source code for gcb was, and you said WHAT!
    I got FreeBASIC working and built gcb on rpi. Could I see the compiler in FreeBASIC form from those files?
    Now you're asking for help with:
    "The tools and software are all available and I can help you create the development environment. All very simple."
    What's that?
    Your idea of simple...

     
  • Anobium

    Anobium - 2021-10-06

    Ah.. nice. An on-piste question (may I remove the off-piste posts?)

    Read the page at the URL shown - if you want to compile the compiler. But, I only know how to use Windows, so the page has no info on Pi/Linux. See https://github.com/Anobium/Great-Cow-BASIC-Help/blob/master/source/DeveloperGuideForCompiler.adoc

     
    • stan cartwright

      stan cartwright - 2021-10-07

      That link is an interesting and surprising read. The name Great Cow Basic wouldn't offend anyone. How about Hindus?
      Anyway, I got gcb on rpi to work... but not flashing the hex yet, as apps are needed. It was easier than using the PC Linux gcb instructions... just copy and paste the info into the terminal, and it worked.
      I'll check the rpi to see if the fb source code is there and probably check the link for windows fb build.
      Thanks.

       
  • mkstevo

    mkstevo - 2021-10-06

    I'm not in a position to help a great deal sadly.

    Two {three now I've thought about it!} thoughts come to mind:
    One: At 31kHz, replace the 'Wait' routine currently used with a calculated 'Repeat' loop which gives the required delay(s)?

            Repeat 10    'This should be ~10 mS at 32kHz
                NOP
            End Repeat
    

    I used the above in my power saving routine for my battery clock at 32kHz and found it to be close to accurate for 500 mS or so. Closer than 16.40mS x 50 would be. (A rough cycle count for this approach is sketched below, after thought Three.)

    Two: Issue a warning that delays are inaccurate at very slow speeds and let the user tweak their code.

    Three: Simply scale the given time delay by 0.6 when compiling!
    10 x 6 = 60
    60 / 10 = 6 (6 mS delay requested instead of 10)
    16.4 mS / 10 = 1.64 mS delivered per requested mS at 32kHz
    1.64 x 6 (corrected mS delay) = 9.84 mS
    Closer than 16.40?
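
    As a rough sanity check on the Repeat idea in One (assuming one instruction cycle per four clock cycles, and an estimated ~3 instruction cycles of loop overhead per pass; the true cost depends on how the compiler implements Repeat):

        32 kHz / 4 = 8000 instruction cycles per second, ~125 uS each
        each pass = 1 NOP + ~3 cycles of loop overhead = ~4 cycles = ~500 uS
        Repeat 10 = ~5 mS, so Repeat 20 would be nearer a 10 mS target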

     
    • Anobium

      Anobium - 2021-10-06

      All good approaches, but the compiler should simply get the timing correct. The impact is not just Wait; it is every time-based delay.

      Needs to be sorted.

       

      Last edit: Anobium 2021-10-06
  • mkstevo

    mkstevo - 2021-10-06

    In PicAxe, many time-critical portions of code force a switch of the processor speed, which is then returned to the original clock frequency on exiting that section of code.

    The delays in PicAxe are all dependent on processor speed, increasing or decreasing when the processor is not running at the default.

    For me, a warning that timings are inaccurate at such a low frequency would be sufficient.

    As uS delays are not likely to work and 1 mS might be the lowest period possible, all the 'wait' statements for shorter delays would need to be either discarded or flagged anyway? How would the compiler calculate the delays if the clock is set to 4 MHz initially, then switched to 31 kHz in code? Presumably it would be left to the programmer to work out a solution?

     
    • Anobium

      Anobium - 2021-10-06

      @mkstevo - you do get a warning today if the delay is inaccurate. I have seen this once. :-)

      Re switching - this could be resolved in the planned change to the compiler. We could add some sort of compiler directive to inform the compiler of the frequency change - setting the frequency would still be the job of the coder, but the timing should be correct. :-)
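
      Purely as an illustration of what such a directive might look like (the #OPTION name below is hypothetical, not an existing GCBASIC directive, and the oscillator switch itself is chip-specific code the coder would supply):

          #CHIP 16f18855, 4

          Wait 10 ms                  'timed by the compiler for 4 MHz

          '... chip-specific code here to switch the oscillator to 31 kHz ...

          '#OPTION CLOCKCHANGE 31k    'hypothetical directive: from here on,
                                      'calculate all delays for a 31 kHz clock
          Wait 10 ms                  'would now be timed for 31 kHz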

       
  • David Stephenson

    A couple of things I've noticed with regard to WAIT.
    There is always some overhead - the delay you set comes out as the delay plus the time required to set up the delay loop (using small delays and/or slow clock rates makes this quite significant).
    There is a discontinuity when using WORD variables as 255-->256, because an extra loop is invoked.

     
    • Anobium

      Anobium - 2021-10-07

      All resolvable. Now we know.

      I have not examined the code. Awaiting someone to take this on.

       
      • Anobium

        Anobium - 2021-10-07

        @David. Can you provide an example of 'discontinuity when using WORD variables when 255-->256 as an extra loop'

        Need to understand the context in real code.

        Thanks.

         
  • David Stephenson

    I've compiled a program with a 1000 us delay and this is what I get.

    Delay_US
        incf SysWaitTempUS_H, F    ;pre-adjust the high byte for the final decfsz test
        movf SysWaitTempUS, F      ;test the low byte
        btfsc STATUS,Z
        goto DUS_END               ;low byte is zero - go straight to the high byte loop
    DUS_START
        nop
        nop
        decfsz SysWaitTempUS, F    ;inner (low byte) loop
        goto DUS_START
    DUS_END
        decfsz SysWaitTempUS_H, F  ;outer (high byte) loop - only reached when the low
        goto DUS_START             ;byte wraps, adding extra cycles on those passes
        return

    Looks a little different to the code I last tried to analyse (maybe it is good?)
    My contention was that the "high byte loop" is not executed every time. The program spends most of its time in the "low byte loop", so when it happens to go into the high byte loop, a longer delay is created.
    This is my post from 12 years ago! At the time I was collecting time series data which needed to have exactly the same delay increment of 2 us.
    David Stephenson August 12 2009 (GCBasic discussion)

    For all short delays there will be some "overhead". For microsecond delays this can be quite significant, adding 4 to 5 us to each delay (count the number of clock cycles in the routine).
    It would be nice if the overhead were constant; however, it is not. Every time the high byte loop is executed, it adds to the overhead of the delay. I have cobbled together the following ASM for microsecond delays that does not have this problem.

    Delay_US
        incf SysWaitTempUS_H, F
        incf SysWaitTempUS, F
    DUS_START
        nop
        nop
        decfsz SysWaitTempUS, F    ;low byte loop
        goto DUS_START
        nop                        ;padding so the high byte path costs the same
        nop                        ;per pass as the low byte path
        decf SysWaitTempUS, F      ;wrap the low byte back to 255 for the next block
        decfsz SysWaitTempUS_H, F  ;high byte loop
        goto DUS_START
        return

    Maybe somebody can do better!

     

    Last edit: David Stephenson 2021-10-07
    • Anobium

      Anobium - 2021-10-07

      The WAIT code has not changed. Where is the source program? We do not want to create a program that misrepresents the issue.

      We need the exact program. Keep it very short. The essential code only.

      Thank you

       
    • Anobium

      Anobium - 2021-10-16

      I just tried 1000 us using the 18855. At many frequencies the code looks good.

       
  • David Stephenson

    Yes, maybe I am being a little too pedantic.
    It is only important when you have to have exact increments in the wait:
    10us, 12us, 14us... etc. (I had that problem and used the work-around above.)
    It will only be important for us delays, and maybe it was only important then, as the max clock rate was 20 MHz (with a crystal).
    There was actually a serious problem with WAIT back in 2009 (and before) which Hugh sorted out.
    I can get my oscilloscope out and see what delays I get. Your example of 10 ms becoming 16 ms is a serious error.

     
    • Anobium

      Anobium - 2021-10-07

      It is a simple ask. I cannot ask someone to take on a task, debug all the code, and then expect them to write the fix. Hence, it has not been addressed.

      Let me explain... Is this PIC/AVR/LGT? What is the frequency? What is the explicit instruction?

      Please help by providing the code.

       
    • Anobium

      Anobium - 2021-10-16

      Are you saying that if I use wait 10 ms I get 16 ms? I do not on my test here using an oscilloscope.

      Can you please explain how I can see this error?

       
  • Chris Roper

    Chris Roper - 2021-10-07

    WAIT - was the biggest mistake in the history of computers.

    Not pointing fingers at GCBASIC, Arduino, Fortran, Cobol or PL1 - yes even the IBM360 had a wait statement.

    The mistake is that it should not have migrated to Microcontrollers.

    WAIT, in any language (I have not seen the GCBASIC source but would wager that it follows suit), works on the premise that a CPU has a fixed clock rate and that the rate is immutable.

    Computer speed used to be measured in MIPS (Millions of Instructions Per Second) or FLOPS (Floating Point Operations Per Second) or, when the CRAY supercomputers hit the market, MegaFLOPS, but the CPU itself ran at a fixed frequency in kHz and the only way to perform timing operations was to count clock cycles.

    Every instruction set of every ALU (Arithmetic Logic Unit - the bit that does the work) ever made has a NOP (No Operation) instruction. It is used to pad code to fit page boundaries and found use in compilers for reserving memory, but programmers always find an unintended use. It didn't take long for programmers, who were also conversant with the hardware back then, to realise that a succession of NOP instructions, combined with never-changing clock frequencies and therefore cycle times, could be used for short delays.

    It was a great idea when the Mainframe had to wait for a Punch Card Reader, Magnetic Tape Drive or a User Terminal to respond, and in a task switching / time sharing environment like a mainframe computer it was not noticeable.

    The first microprocessor, the Intel 4004, was created for the Busicom calculator, and the NOP instruction was included to help with display refresh and save on external timing circuits.

    Microcontrollers merged Microprocessors and the most common peripheral chips such as Serial and Parallel I/O, RAM, ROM, interrupt Controllers and Timer / Counters.

    The history lesson is getting a bit long-winded, so before I lose the audience, if he/she is still actually reading, I will get to the point...

    The Mainframe / Micro programmer mindset naturally got carried over to Microcontrollers, so along with the inclusion of the Counter / Timer, the NOP instruction was retained.

    Back in those days I was new to this exciting field of microprocessors, so, to the best of my knowledge, the Traffic Lights in Bulawayo in southern Zimbabwe are still running on Motorola 6800 microprocessors, counting NOPs in a loop, i.e. the WAIT function, to time the traffic flow.

    Fast forward to modern practice and we have variable speed devices capable of speeds that would have made a mainframe computer look slow back in the 80’s but can also switch to a power saving mode and run off of an LC (Inductor/Capacitor Tank Circuit) rather than an external crystal or internal precision timing circuit.

    At this point the concept of the WAIT function fails...

    Counting cycles is no longer a valid or accurate method of timing if the clock is not stable and the frequency is variable.

    The only way to resolve the issue of what I call “Throw away timing” is to stop relying on the NOP instruction in a LOOP, as we have for the past 60 odd years, and use the hardware timers to do all timing.
    But even then, the Timers are merely counters relying on a fixed frequency.
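
    To make the idea concrete, here is a minimal sketch of a timer-based delay in GCBASIC. It assumes the standard timer helpers (InitTimer0, StartTimer, SetTimer and the Timer0 variable) and that Timer0 counts instruction cycles; the prescale constant and tick maths are illustrative and would need checking against the chip:

        #CHIP 16f18855, 31k

        'At 31 kHz the instruction clock is 31000 / 4 = 7750 Hz,
        'so ~78 Timer0 ticks is roughly 10 ms (estimated, not measured)
        InitTimer0 Osc, PS0_1
        StartTimer 0

        Sub TimerWait10ms
            SetTimer 0, 0
            Do While Timer0 < 78    'count real hardware ticks instead of NOPs
            Loop
        End Sub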

    We may be able to find a hardware solution but is that practical in a portable programming language?
    And is it even desirable?

    A change of mindset to say that software timing is unreliable may be a better long term solution given the portable nature of Great Cow Basic.

     

    Last edit: Chris Roper 2021-10-07
    • stan cartwright

      stan cartwright - 2021-10-18

      Hi Chris... quite a rant. I think wait is useful if it does what it's supposed to do.
      i.e. nothing... but does hardware stuff still happen?
      I would have thought the #chip 328p,16 or #chip 18f25k22,64 would give the compiler the info to sort the wait code.. dunno.
      The simple interrupts like timer0 overflow seem to work to get an event at an exact time.
      Could they be used for wait... thinking they could be turned on for a wait, then turned off for the code to work without an interrupt to slow it down. Slow it down.. and we're talking about wait. Irony?
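
      A rough sketch of that idea, assuming GCBASIC's On Interrupt syntax; the prescale constant is assumed, the overflow rate depends on the clock and timer width, and WaitTicks / TickCount / CountTick are made-up names:

          Dim TickCount As Word

          InitTimer0 Osc, PS0_1                       'assumed prescale constant
          On Interrupt Timer0Overflow Call CountTick  'run CountTick on each overflow
          StartTimer 0

          Sub WaitTicks(In n As Word)                 'busy-wait for n overflows
              TickCount = 0
              Do While TickCount < n
              Loop
          End Sub

          Sub CountTick
              TickCount = TickCount + 1
          End Sub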

       

