accurate software time delay - microsecond range

Discussion of software-related topics only.
mmk-tsm
Posts: 33
Joined: Mon Jan 05, 2009 9:22 am

accurate software time delay - microsecond range

Post by mmk-tsm »

Hello,
Has anyone got a good, simple way to do an accurate time delay in software? I need it for a 1-wire interface. I had been using the following in DevC++:

void DelayOneuS( void )
{
    asm(" move.l #39,%d0");   // load the loop count (39 under Eclipse; was 17 under DevC++)
    asm("DECLOOP1:");
    asm(" subq.l #1,%d0");    // decrement the counter
    asm(" bne DECLOOP1");     // loop until it reaches zero
}

but when I ported over to Eclipse, the loop constant had to increase from 17 to 39; presumably the compiler compiled it differently. Is there any way to see a listing of the compiler output?

Or has someone a better, more robust way, perhaps using a timer? I can poll. The 1-wire interface is only used at start-up, and I can put the code in an OS critical section.
TiA,
Mike.
mmk-tsm
Posts: 33
Joined: Mon Jan 05, 2009 9:22 am

Re: accurate software time delay - microsecond range

Post by mmk-tsm »

I probably should have added that I am using a MOD5270.
yevgenit
Posts: 84
Joined: Fri Apr 25, 2008 12:47 am

Re: accurate software time delay - microsecond range

Post by yevgenit »

A software delay cannot be accurate, because the RTOS and the application tasks consume a non-deterministic amount of CPU time.
Use any of the 5270 hardware timers. See the related NetBurner application notes.
Yevgeni Tunik
Embedded/RealTime software engineer
https://www.linkedin.com/in/yevgenitunik/
Ridgeglider
Posts: 513
Joined: Sat Apr 26, 2008 7:14 am

Re: accurate software time delay - microsecond range

Post by Ridgeglider »

See the hardware timer code Larry posted:
http://forum.embeddedethernet.com/viewt ... ?f=7&t=397
Chris Ruff
Posts: 222
Joined: Thu Apr 24, 2008 4:09 pm
Location: topsail island, nc

Re: accurate software time delay - microsecond range

Post by Chris Ruff »

I have been one-wiring for years now on NB, at the C level:

// **||***********************************************************************
void TDelay(int val)
// **||***********************************************************************
{
    // volatile keeps the compiler from optimizing the busy-wait away
    volatile int tester;
    tester = val;
    while (tester--);
}

// **||***********************************************************************
void TestTiming(int iTim)
// **||***********************************************************************
{
    USER_ENTER_CRITICAL();   // own the processor: no preemption during the pulses
    WriteButtonBit(0);
    TDelay(200);
    WriteButtonBit(1);
    TDelay(20);
    USER_EXIT_CRITICAL();
}


The whole trick is to turn off interrupts so that you 'own' the processor. If you don't have interrupts turned off, the durations will change as you are preempted by the OS scheduler.


Chris
Real Programmers don't comment their code. If it was hard to write, it should be hard to understand
mmk-tsm
Posts: 33
Joined: Mon Jan 05, 2009 9:22 am

Re: accurate software time delay - microsecond range

Post by mmk-tsm »

Chris,
we have also been one-wiring, as you put it, for years, and we use the very same approach you outlined: a software time delay.

But on moving from DevC++ to Eclipse, the timing value had to change by more than a factor of two (17 -> 39).

I thought there might be an easy way to do the timing with a hardware timer, maybe polling a fast free-running timer.
Mike.
Ridgeglider
Posts: 513
Joined: Sat Apr 26, 2008 7:14 am

Re: accurate software time delay - microsecond range

Post by Ridgeglider »

It seems like the DMA timer interrupt would let you specify a duration and forget it, with the assurance that the delay is what it says it is. You would need to hog the processor with USER_ENTER_CRITICAL() and USER_EXIT_CRITICAL().

However, if you really need only a microsecond and care about the ISR service-time overhead (it sounds like you do), it should be easy to do a specified number of assembler nops:

asm(" nop");
I think these take three processor cycles each, where the clock is 73,728,000 Hz on the 5270. This is a bit simpler, and therefore easier to calculate, than the assembler code you suggested.
Be sure to include the USER_ENTER_CRITICAL(); and USER_EXIT_CRITICAL();
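
A minimal sketch of that idea, assuming the figures above (73,728,000 Hz clock and roughly three cycles per nop, so about 25 nops per microsecond); the per-nop cycle count is an estimate, so calibrate the pulse width against a scope before trusting it:

// Straight-line nops: no loop, so no loop overhead to model.
// Assumes ~3 cycles per nop at 73.728 MHz, i.e. 73.728 / 3 ~= 25 nops
// per microsecond. Verify the actual pulse width with a scope.
#define NOP5() asm volatile("nop\n\tnop\n\tnop\n\tnop\n\tnop")

static inline void DelayOneUsNops(void)
{
    NOP5(); NOP5(); NOP5(); NOP5(); NOP5();   /* ~25 nops ~= 1 us */
}

Call it between USER_ENTER_CRITICAL() and USER_EXIT_CRITICAL() so the scheduler cannot stretch the pulse.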

It makes no sense that Eclipse changed your timing from DevC++ unless you had the optimizer on.
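
One hedged guess as to why the timing moved: the original loop is written as four independent asm() statements that use %d0 without declaring it, so different toolchains (and optimizer settings) are free to arrange the surrounding code differently. A single GCC extended-asm block that declares its operands is less fragile; a sketch, with the constant still needing per-toolchain calibration:

// The same countdown loop as one extended-asm statement. Declaring the
// counter as a read-write data-register operand ("+d") tells the
// compiler a register is consumed, so it cannot rearrange the pieces
// the way separate asm() statements allow. The count (39 here) still
// has to be recalibrated against a scope when the toolchain changes.
static inline void DelayOneuS2(void)
{
    unsigned long cnt = 39;
    asm volatile(
        "0: subq.l #1,%0\n\t"
        "bne 0b"
        : "+d"(cnt)
        :
        : "cc");
}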
Ridgeglider
Posts: 513
Joined: Sat Apr 26, 2008 7:14 am

Re: accurate software time delay - microsecond range

Post by Ridgeglider »

Mike: I forgot to add that you said you were interested in reading a free-running timer. You could easily read the DMA timer count register; these are the DTCNn registers.

See chapter 22 in C:\Nburn\docs\FreescaleManuals\MCF5270_5271RM.pdf
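
For what it's worth, a minimal sketch of that approach. The 0x40000000 IPSBAR default and the DMA timer 3 offsets (DTMR3 at 0x4C0, DTCN3 at 0x4CC) are my reading of that chapter, so double-check them against the manual; StartFreeRunTimer() and DelayUs() are just names made up for the example:

#define IPSBAR  0x40000000u  /* default peripheral base on the 527x */
#define DTMR3   (*(volatile unsigned short *)(IPSBAR + 0x4C0))  /* mode  */
#define DTCN3   (*(volatile unsigned long  *)(IPSBAR + 0x4CC))  /* count */

/* Start DMA timer 3 free-running at the internal bus clock.
 * DTMR bits: RST=1 enables the timer, CLK=01 selects the bus clock,
 * FRR=0 lets the 32-bit counter run freely through rollover. */
void StartFreeRunTimer(void)
{
    DTMR3 = 0x0000;   /* RST=0: hold the timer in reset          */
    DTMR3 = 0x0003;   /* RST=1, CLK=01: count bus-clock ticks    */
}

/* Busy-wait 'us' microseconds by polling the free-running counter.
 * At 73,728,000 Hz there are 73.728 counts per microsecond; rounding
 * up to 74 errs slightly long, the safe direction for 1-wire timing. */
void DelayUs(unsigned long us)
{
    unsigned long start = DTCN3;
    unsigned long ticks = us * 74;
    while (DTCN3 - start < ticks)
        ;   /* unsigned subtraction handles counter wrap-around */
}

Call StartFreeRunTimer() once at power-on, then DelayUs() for each 1-wire time slot, inside the critical section if jitter matters.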
rnixon
Posts: 833
Joined: Thu Apr 24, 2008 3:59 pm

Re: accurate software time delay - microsecond range

Post by rnixon »

One possible reason your asm delay changed is the fast-buffer / fast-SRAM task switching. Since you are running in a multi-tasking environment, if the tasks switch faster, your code may run differently. Or, if at some later date someone modifies your code and moves task priorities around, the software delay may change as well, even if you never change to a newer compiler. The h/w timer is the only sure way to get an absolute delay.

The problem with adding critical sections is that you increase overall system latency, because nothing else can run. If that is OK, then maybe it's a solution, but remember there are network tasks, system timer tasks, etc.; you are disabling all of those for the duration of your critical section.

On the positive side, if the SRAM buffering is the reason, it's nice to know the system runs that much faster!
mmk-tsm
Posts: 33
Joined: Mon Jan 05, 2009 9:22 am

Re: accurate software time delay - microsecond range

Post by mmk-tsm »

Hello,
thanks for all the suggestions. The compiler obviously did compile my software loop differently.

Anyhow, polling a timer is the correct approach, and using a DMA timer sounds good. I only use 1-wire for reading an ID chip, and only at power-on, so I can use an OS critical section.

Has anyone a little code snippet for setting up one of the timers and then reading it? (The Microchip forums are great for that.)
I am a lazy bugger :D . Not really; it just saves a lot of time to start from something that is known to work.
TiA,
Mike.