Re: [myhdl-list] Sigma-Delta DAC
From: Bob C. <Fl...@gm...> - 2011-10-04 07:37:37
More questions about this code: http://www.myhdl.org/doku.php/projects:dsx1000

I'd like to optimize the DAC output performance so the analog audio output represents a 10-bit digital input sequence at 44.1 kHz as faithfully as possible. I have two usage scenarios: a) driving a line-out connection to an external amplifier, and b) driving a local analog comparator (with known hysteresis) as part of a sigma-delta ADC. In both cases, I'd want to optimize both the RC filter design (for the intended load and PWM drive current/voltage) and the digital DAC design (mainly to use the minimum PWM output bit rate). Ideally, the synthesized circuit would default to the current code (or raise an error) when the output filter characteristics are not specified.

With this goal in mind, my first detailed pass through the existing code raises two related questions:

1. The definition of dac_dsx1000() in dac_dsx1000_hd.py takes into account neither the characteristics of the load on the PWM output pin nor the drive capability of that pin. If these characteristics are known prior to synthesis, shouldn't the generated PWM bitstream be pre-compensated accordingly? That is, when the input value changes, shouldn't the output be driven hard until the estimated filter output (not the pin output) gets to within a single bit-time delta of the desired value? The formula for a first-order RC output filter is simple: for a step input, Vout(t) = Vin*(1 - exp(-t/RC)). My naive guess is that the correction could be implemented as a digital filter (the inverse of the output filter), perhaps by modifying the integrator stage (Adder()). The main benefit would be that the required PWM clock rate could be significantly reduced and/or the output fidelity improved. The question is: would the extra processing be worth doing? Why or why not? (A rough sketch of what I mean is in the P.S. below.)

2. The unit test in bench_static() in dac_dsx1000_ut.py sets "INTEGRATION_TIME = 512". This seems like a very long time; I doubt I'll see that many PWM clock periods in any realistic implementation scenario.

- Is there a better or more realistic way to determine this value? Perhaps as a function of input word width and output bitstream frequency? (See the second sketch in the P.S.)
- How should the value change if the optimization described in the prior question were implemented? That is, I'd like the test to tell me whether the PWM output will drive the filter output (not just the pin output) to within the desired error.
- Or am I missing some of the intent of the test?

-BobC
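
P.S. Here is the rough behavioral sketch I have in mind for question 1 (plain Python, not synthesizable MyHDL; the function name and parameters are mine, not from the project). It keeps a discrete-time model of the first-order RC filter inside the modulator loop and picks each PWM bit so the estimated filter output, rather than the pin, converges on the target:

    from math import exp

    def precompensated_pwm(targets, t_bit, rc, bits_per_sample):
        # Discretized RC filter: y[n] = y[n-1] + alpha*(x[n] - y[n-1]),
        # with alpha = 1 - exp(-T/RC) for PWM bit period T.
        alpha = 1.0 - exp(-t_bit / rc)
        y_est = 0.0                    # running estimate of the filter output
        for target in targets:         # desired levels scaled to [0.0, 1.0]
            for _ in range(bits_per_sample):
                bit = 1 if y_est < target else 0  # drive hard toward target
                y_est += alpha * (bit - y_est)    # update the filter model
                yield bit

    # e.g. one sample at 3/4 full scale, 10 MHz bit clock, RC = 10 us:
    bits = list(precompensated_pwm([0.75], 1e-7, 1e-5, 1024))

A hardware version would presumably approximate alpha with a power-of-two shift, which is why I suspect the change belongs in or near Adder().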
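
P.P.S. And the back-of-the-envelope arithmetic behind my INTEGRATION_TIME question (again, the function names are mine). For an N-bit first-order modulator, the bitstream average is only guaranteed exact over a full 2**N-bit window, so a width-derived value would look like this:

    def integration_time(width):
        # one full cycle of an N-bit accumulator, in PWM bit periods
        return 2 ** width

    def pwm_clock_needed(width, sample_rate):
        # PWM bit rate required if every input sample sees a full window
        return integration_time(width) * sample_rate

    print(integration_time(10))          # 1024 bit periods for 10 bits
    print(pwm_clock_needed(10, 44100))   # 45158400 Hz, i.e. ~45 MHz

A ~45 MHz PWM clock is exactly the kind of rate I'm hoping pre-compensation would let me avoid, which is why I'd like the test window to reflect the filter output error rather than a fixed 512.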