Thread: Re: [myhdl-list] Migen logic design toolbox - now with simulator (Page 2)
From: Bob C. <fl...@gm...> - 2012-03-19 12:17:49
On 03/19/2012 01:26 AM, Jan Decaluwe wrote:
> On 03/19/2012 01:19 AM, Bob Cunningham wrote:
>
>> Would you please define what you mean when you repeatedly use the
>> phrases "serious digital design" and "serious design" in your posts?
>>
>> Please be clear and concise. State what is and is not "serious
>> digital design" according to your definition of the phrase. Try to
>> make your point without using that phrase.
>>
>> You are throwing those phrases around as if they were self-obvious
>> and self-justifying. They most certainly are not! What do they mean
>> to you?
>>
>> What do you believe those phrases should mean to me?
>
> But of course. Clarity and conciseness: every poster should
> keep it in mind. BTW, asking the question once is enough.
>
> With the word "serious" I refer to design work with a
> complexity level that is such that
>
> 1) test bench modeling, high-level modeling and
> verification will make up the bulk of the work;
>
> 2) for the synthesizable logic it is desirable to push
> the abstraction level as high as possible for
> productivity reasons.

Can you provide some simple examples where such complexity is
required? Why are these simple examples impossible to effectively
design without explicitly handling such complexity?

At what point should a non-EE hobbyist newcomer to digital design
become concerned about such complexity? Do you think very many non-EE
FPGA hobbyists would need to reach this level of complexity?

Do you believe tools can and should help mitigate exposure to such
complexity?

-BobC
From: Jan D. <ja...@ja...> - 2012-03-19 12:51:57
On 03/19/2012 01:17 PM, Bob Cunningham wrote:
> On 03/19/2012 01:26 AM, Jan Decaluwe wrote:
>> 2) for the synthesizable logic it is desirable to push the
>> abstraction level as high as possible for productivity reasons.
>
> Can you provide some simple examples where such complexity is
> required?

Yes, look at the examples in the MyHDL by Example section.

http://www.myhdl.org/doku.php/cookbook

> Why are these simple examples impossible to effectively design
> without explicitly handling such complexity?

You tell me, e.g. after trying the cookbook examples with Migen.

> At what point should a non-EE hobbyist newcomer to digital design
> become concerned about such complexity?

Right from the start.

> Do you think very many non-EE FPGA hobbyists would need to reach this
> level of complexity?

Yes, all of them. FPGA's are huge. And it's not *that* difficult
either. Non-EE FPGA hobbyists may have the advantage that they don't
think in terms of gates and FFs all the time.

> Do you believe tools can and should help mitigate exposure to such
> complexity?

Yes, that is one goal of MyHDL.

--
Jan Decaluwe - Resources bvba - http://www.jandecaluwe.com
Python as a HDL: http://www.myhdl.org
VHDL development, the modern way: http://www.sigasi.com
World-class digital design: http://www.easics.com
From: Jan D. <ja...@ja...> - 2012-04-22 10:25:42
I still want to take the time to clarify my position on the many
issues raised in this post.

On 03/17/2012 09:20 PM, Bob Cunningham wrote:
> On 03/16/2012 02:03 PM, Jan Decaluwe wrote:
>> My conclusion is that Migen found an easy target in you, as a
>> newbie, to confuse you. It made you think it can be used for
>> serious design work.
>
> Sorry, Jan. If I have to be "confused" to play with my FPGA, then so
> be it. I'm very "serious" about being able to play with my FPGA!
>
> Your statement has an obvious implicit context: To me, you are
> shouting, "MyHDL is for Serious Designers Only! Newbies and
> Pragmatists should Go Away!"
>
> If that's what you are saying, then please be direct: Don't attack
> Migen for being what it was intentionally created to be, or for being
> something MyHDL is not intended to be. Are you upset about Migen
> existing, or that there is an audience MyHDL can't help as well as
> Migen may be able to?

I am disturbed by the suggestion that my critique of Migen is based
on anything else than a purely technical assessment.

Let me be clear. I don't like Mr. Bourdeauducq's attitude one bit.
But do you think that would be a reason for me to ignore any good
idea that he might come up with? Of course not. I am not a masochist.

It is quite simple. During my career, I have found that when you
think you have seen the worst in HDL-based design, it always gets
worse. To date, Migen is the worst that I have seen. But to
understand why I am saying this, you have to be prepared to follow my
technical arguments and to engage in technical discussions. I have
made a few starts, but I certainly was not done yet. However, I see
close to zero enthusiasm to continue such discussions.

I am therefore frustrated by the fact that I hear all kinds of
opinions and suggestions to "merge", but that whenever things get a
little technical, the "I am a beginner" umbrella opens.

Migen is not my problem. It will disappear in the milky mist of HDL
history, just like the many HDLs based on the same flawed paradigm. I
am addressing it simply because misleading posts about it appear on
this newsgroup. What I am really targeting instead is the
conventional wisdom in mainstream HDL design, which often has it all
wrong.

> If you'd rather beginners like myself go
> elsewhere, just say so.

MyHDL welcomes beginners. It is the first item on "Why MyHDL":

http://myhdl.org/doku.php/why#you_are_new_to_digital_hardware_design

In fact, I put most of my hopes on beginners, as they have not yet
been brainwashed by the conventional wisdom.

> Remember, Migen wasn't created to be useful to beginners: It was
> created by an experienced FPGA designer concerned about practicality
> and productivity for a specific Open Source hardware project.

I am not impressed by "arguments from experience". The conventional
wisdom that I am targeting was created by experienced designers,
mostly Verilog ones. Experience can lead to conservatism and can
become a hindrance to clear thinking.

> It simply happened that some things became a bit clearer to me after
> looking at Migen and the problems it was created to address.

Was it because of Migen, or simply because you were spending more
time on the problem?

> Whatever Migen leaves out may have been what was getting in my way!

I find that strange. I understand that you have a lot of experience
with embedded software. Therefore, you must know procedural software
techniques very well. That is exactly what Migen leaves out. What it
leaves is low-level concurrency at the statement level, which must be
new to you. And now you suggest that the obstacle is exactly that
which you are most familiar with. Beats me.

> I'm actually quite lazy: What is the *least* I need to know to make
> useful digital designs *now*?

No secrets here. The first sentence of "Why MyHDL" warns you:
"There's a lot to learn and it will be hard work". Therefore, if you
are intellectually lazy (not prepared to learn new things even when
they will save you lots of time and effort later on), MyHDL or
HDL-based design is not for you. MyHDL is for those who are lazy in
the good engineering sense, wanting to accomplish more with less
effort eventually.

> I'm a beginner: Though I'd love to someday be able to design
> circuits like Shakespeare wrote sonnets, I'd be more than happy today
> if I were able to work at the level of "Green Eggs and Ham", a true
> masterpiece written with an absolute minimum of linguistic
> complexity.

Come on, let's keep some perspective here. It's not *that* difficult
or complex either. And there is a cookbook that shows you the way.

>> When Migen claims that the "event-driven" paradigm is too general,
>> what it really dumps is procedural support in your HDL descriptions
>> - the interesting stuff.
>
> What's "interesting" to you can be a frustrating block for a newbie.
> I hope to one day also be very interested in those aspects of MyHDL,
> but it seems to have little to do with what I want to get done today,

I don't understand. Your specification seems very extensive and
ambitious. It would seem that you have a big need for raising the
abstraction level as high as possible, and for an easy path to
strong verification.

> which is to find the simplest path to get basic circuits out of my
> mind and in to my FPGA. Then use them as building-blocks in my hobby
> projects.

There is a broad consensus about the "building blocks" paradigm in
hardware design. That is really not the issue. The issue is what the
abstraction level of the building blocks should be.

> I am a MyHDL fan. Unfortunately, I simply remain unable to use MyHDL
> to accomplish my own immediate hobby needs. That does not indicate
> any flaw in MyHDL, merely the extent of my own limitations. Do not be
> surprised that I am interested in any tool that helps me circumvent
> those limitations!
>
> I actually *like* how Migen slices and dices the process of FPGA
> design: The parts that are left out are ones I doubt newbies like me
> would notice, much less care about, until confronted with unusually
> challenging designs. I suspect Sebastien would agree with much of
> your analysis, the proper response being: "So What?"

Suppose that I teach a class to newbies in embedded software design
based on assembler. Would any of the newbies, except for the rare
genius, miss the capabilities of, say, C? Does this prove that
teaching assembler was a good choice?

> It's not about theoretical power or completeness: It's about barriers
> to entry. It's not about what I can do in 5 years, but about what I
> can do in 5 weeks. Migen is primarily about pragmatism and
> productivity, making powerful circuits quickly and easily, and far
> less about expert capabilities, theoretical purity or even
> consistency.

Again, I find this strange. I understand that you have not been
successful with MyHDL. However, as I understand it, you have not been
successful with Migen either. So what is your defense based upon? Of
course, we are about 5 weeks further now :-)

More to the point. Barriers to entry - ok, but what is the task? I
told you that I believe the main problem in HDL-based design is
verification, and how MyHDL (unlike Migen) helps you by the fact that
you can use the same modeling paradigm for high-level models and test
benches as for synthesizable logic.

You seemed surprised, which I found surprising in turn. Is it so
different in software? Really, getting those gates into an FPGA is
the easy part. The difficult part is getting them to work properly.

You will have noticed that Mr. Bourdeauducq made an error in the
first simple "one-liner" circuit that I presented to him, as if he
wanted to prove my point. Of course, the reason is not incompetence,
but simply that he did not verify his design.

There is a pattern however. Mr. Bourdeauducq cried foul because I
didn't accept his "simple patches". What he ignored, and continued to
ignore despite my insistence, is that they broke MyHDL. Perhaps Mr.
Bourdeauducq considers verification a "detail".

Well, I don't. Verification is the problem. The reason why I think
the abstraction level of synthesizable logic should be as high as
possible is that it leaves more time for verification.

> I seek tools that will help me do what I want to get done today, and
> so far Migen seems like it will be most useful. Tomorrow will likely
> require additional tools, and I *absolutely* expect (and want) MyHDL
> to be the first of those tools. It is not an either-or proposition: I
> want Migen *and* MyHDL. I realize MyHDL will take more time to
> master, and I'm willing to commit that time. But I also want to
> create something sooner, rather than later. And I believe that 100%
> of what I learn using Migen will later prove useful with MyHDL. I
> believe using Migen will keep me motivated toward learning MyHDL.

Sounds good, but I think it is cheap talk. Most of Migen's technical
choices, starting with its basic paradigm, are almost the opposite of
MyHDL's. As a result, verification is not addressed, and it forces
you to think at an artificially low level for synthesizable logic.
What good can one learn from that?

> Right next to me I have a Spartan 3E-500 that contains nothing of my
> own design. That must change!

Perhaps you are too ambitious. In your shoes, I would start as
follows:

* isolate a simple function out of your spec
* try to concentrate on what it does, instead of how it should be
  implemented
* write that behavior in a (clocked) MyHDL process or processes
* also describe it in high-level Python, and use that in a unit test
  to verify
* experiment with synthesis to see whether it works and the result
  is acceptable
* iterate with the synthesizable description as necessary

--
Jan Decaluwe - Resources bvba - http://www.jandecaluwe.com
Python as a HDL: http://www.myhdl.org
VHDL development, the modern way: http://www.sigasi.com
World-class digital design: http://www.easics.com
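[Editor's note: the workflow suggested above can be sketched in plain
Python, so it runs without MyHDL installed. The accumulator and all
names below are invented for illustration, not taken from the thread;
with MyHDL, the clocked model would become an @always(clk.posedge)
process and the check would be driven by a Simulation object.]

```python
# Sketch: a high-level reference model checked against a cycle-based
# behavioral model, in the unit-test style recommended above.

def ref_accumulate(samples):
    """High-level model: the running sums of the input stream."""
    total, out = 0, []
    for s in samples:
        total += s
        out.append(total)
    return out

class ClockedAccumulator:
    """Cycle-based behavioral model: one sample per clock tick."""
    def __init__(self):
        self.acc = 0

    def tick(self, sample):
        """Model one rising clock edge."""
        self.acc += sample
        return self.acc

def test_accumulator():
    samples = [3, 1, 4, 1, 5]
    dut = ClockedAccumulator()
    trace = [dut.tick(s) for s in samples]   # cycle-accurate trace
    assert trace == ref_accumulate(samples)  # compare against model

test_accumulator()
```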
From: Christopher L. <loz...@fr...> - 2012-04-22 11:26:43
First, thank you for your excellent email two days ago, and even more
so for what you wrote this morning.

On 4/22/12 5:25 AM, Jan Decaluwe wrote:
> I believe the main problem in HDL-based design is verification, and
> how MyHDL (unlike Migen) helps you by the fact that you can use the
> same modelling paradigm for high-level models and test benches as
> for synthesizable logic.
>
> You seemed surprised, which I found surprising in turn. Is it so
> different in software? Really, getting those gates into an FPGA is
> the easy part. The difficult part is getting them to work properly.
>
> Well, I don't. Verification is the problem. The reason why I think
> the abstraction level of synthesizable logic should be as high as
> possible, is because that leaves more time for verification.

I think this is the central point from a marketing perspective. And
once you believe this, then the advantage of MyHDL is clear. Only
MyHDL gives you both structural and dynamic information in the same
computational model. And the latter is clearly needed for
verification.

On my wiki, I will be using the same Creative Commons license as on
your wiki. Once that is posted on my wiki, may I go ahead and quote
from your emails using that same license?

--
Regards
Christopher Lozinski
Check out my iPhone apps TextFaster and EmailFaster
http://textfaster.com
Expect a paradigm shift.
http://MyHDL.org
From: Jan D. <ja...@ja...> - 2012-04-23 13:01:48
On 04/22/2012 01:21 PM, Christopher Lozinski wrote:
> First thank you for your excellent email two days ago, and even more
> so for what you wrote this morning.
>
> On 4/22/12 5:25 AM, Jan Decaluwe wrote:
>> I believe the main problem in HDL-based design is verification, and
>> how MyHDL (unlike Migen) helps you by the fact that you can use the
>> same modelling paradigm for high-level models and test benches as
>> for synthesizable logic.
>>
>> You seemed surprised, which I found surprising in turn. Is it so
>> different in software? Really, getting those gates into an FPGA is
>> the easy part. The difficult part is getting them to work properly.
>>
>> Well, I don't. Verification is the problem. The reason why I think
>> the abstraction level of synthesizable logic should be as high as
>> possible, is because that leaves more time for verification.
>
> I think this is the central point from a marketing perspective. And
> once you believe this, then the advantage of MyHDL is clear. Only
> MyHDL gives you both structural and dynamic information in the same
> computational model. And the latter is clearly needed for
> verification.
>
> On my wiki, I will be using the same Creative Commons license as on
> your wiki. Once that is posted on my wiki, may I go ahead and quote
> from your emails using that same license?

Sure.

--
Jan Decaluwe - Resources bvba - http://www.jandecaluwe.com
Python as a HDL: http://www.myhdl.org
VHDL development, the modern way: http://www.sigasi.com
World-class digital design: http://www.easics.com
From: Christopher L. <loz...@fr...> - 2012-04-24 10:53:32
Here is a MEP for floating point numbers.

http://wiki.myhdlclass.com:8080/FloatingPoint

The basic idea is to create a new signal type, float, out of sign,
mantissa and exponent signals. In MyHDL, model the calculation using
Python floating point operators with the appropriate delay. When
exporting, call an existing Verilog or VHDL library.

The big change is that MyHDL would need to understand hierarchical
signals. Maybe it is not that hard. When dealing with a signal in a
sensitivity list, MyHDL would first check if it were hierarchical;
if it were, MyHDL would add all of the sub-signals to the sensitivity
list. And from there MyHDL could continue operating as before.

If you are interested, I invite you to read the details in the MEP.

http://wiki.myhdlclass.com:8080/FloatingPoint

--
Regards
Christopher Lozinski
Check out my iPhone apps TextFaster and EmailFaster
http://textfaster.com
Expect a paradigm shift.
http://MyHDL.org
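[Editor's note: the sign/mantissa/exponent decomposition the MEP
builds on can be illustrated in a few lines of standard-library
Python. This is only a sketch of the arithmetic; the MEP's actual
Signal plumbing is not modeled, and the function names are invented.]

```python
# Split a Python float into (sign, mantissa, exponent) fields and
# recombine them, using the stdlib math.frexp / math.ldexp pair.
import math

def float_fields(x):
    """Return (sign, m, e) with x == (-1)**sign * m * 2**e."""
    sign = 1 if math.copysign(1.0, x) < 0 else 0
    m, e = math.frexp(abs(x))    # 0.5 <= m < 1.0 for nonzero x
    return sign, m, e

def fields_to_float(sign, m, e):
    """Inverse of float_fields."""
    return (-1.0) ** sign * math.ldexp(m, e)

x = -6.25
assert fields_to_float(*float_fields(x)) == x   # lossless round trip
```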
From: Thomas H. <th...@ct...> - 2012-04-24 11:40:02
On 24.04.2012 12:53, Christopher Lozinski wrote:
> Here is a MEP for floating point numbers.
>
> http://wiki.myhdlclass.com:8080/FloatingPoint

I always get timeouts when I try to connect to this site.

Thomas
From: Jan D. <ja...@ja...> - 2012-04-25 21:07:17
On 04/24/2012 12:53 PM, Christopher Lozinski wrote:
> Here is a MEP for floating point numbers.
>
> http://wiki.myhdlclass.com:8080/FloatingPoint
>
> The basic idea is to create a new signal type, float, out of sign,
> mantissa and exponent signals. In MyHDL, model the calculation using
> Python floating point operators with the appropriate delay. When
> exporting, call an existing Verilog or VHDL library. The big change
> is that MyHDL would need to understand hierarchical signals. Maybe it
> is not that hard. When dealing with a signal in a sensitivity list,
> MyHDL would first check if it were hierarchical; if it were, MyHDL
> would add all of the sub-signals to the sensitivity list. And from
> there MyHDL could continue operating as before.
>
> If you are interested, I invite you to read the details in the MEP.
>
> http://wiki.myhdlclass.com:8080/FloatingPoint

I have a hard time understanding exactly what you expect. (Btw, this
post is buried deep in a thread where it doesn't belong.) As you
mentioned, all of this has been discussed before; I don't see new
elements.

The basic problem as I see it remains that the convertor (or
"exporter" as you call it) is a tool which is 'event-accurate'. It
cannot "export" a combinatorial operator such as '*' to a module that
has pipelining. The operator operates within a clock cycle, the
module needs several.

Note however that, in contrast to what you suggest, it is perfectly
possible to *simulate* "hierarchical" signals. Simulation is intended
to be completely general.

--
Jan Decaluwe - Resources bvba - http://www.jandecaluwe.com
Python as a HDL: http://www.myhdl.org
VHDL development, the modern way: http://www.sigasi.com
World-class digital design: http://www.easics.com
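[Editor's note: the timing mismatch described here can be made
concrete with a toy pure-Python model. This is illustrative only, not
MyHDL convertor code: a combinatorial '*' yields its product in the
same cycle, while a pipelined multiplier yields it several cycles
later.]

```python
# Toy cycle model of a 3-stage pipelined multiplier vs. a
# combinatorial '*' driven with the same inputs.
from collections import deque

class PipelinedMul:
    """Result appears LATENCY cycles after its operands enter."""
    LATENCY = 3

    def __init__(self):
        # pipeline registers, initially holding zeros
        self.stages = deque([0] * self.LATENCY)

    def tick(self, a, b):
        """One clock cycle: accept (a, b), emit the product that
        entered the pipeline LATENCY cycles earlier."""
        out = self.stages.popleft()
        self.stages.append(a * b)
        return out

inputs = [(1, 2), (3, 4), (5, 6), (7, 8)]
comb = [a * b for a, b in inputs]      # combinatorial '*': same cycle

mul = PipelinedMul()
# feed the inputs, then idle cycles to flush the pipeline
pipe = [mul.tick(a, b) for a, b in inputs] + \
       [mul.tick(0, 0) for _ in range(PipelinedMul.LATENCY)]

# the pipelined outputs are the same values, LATENCY cycles late
assert pipe[PipelinedMul.LATENCY:] == comb
```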
From: Christopher L. <loz...@fr...> - 2012-04-25 22:05:19
On 4/25/12 4:06 PM, Jan Decaluwe wrote:
> Note however that, in contrast to what you suggest, it is perfectly
> possible to *simulate* "hierarchical" signals. Simulation is
> intended to be completely general.

Great to know!

> The basic problem as I see it remains that the convertor (or
> "exporter" as you call it) is a tool which is 'event-accurate'. It
> cannot "export" a combinatorial operator such as '*' to a module
> that has pipelining. The operator operates within a clock cycle,
> the module needs several.

Understood. I was not planning on converting the innards of the
floating point module. It is not "in the convertible subset". I just
wanted something that I could simulate and that would also run in
Verilog. I will use a floating point multiplication library in
Verilog. Not quite sure how I am going to do the conversion. I expect
that I will write some "user defined Verilog or VHDL" code.

Anyhow, MyHDL does need the floating point modules. If I get them up
and simulating correctly, then I am sure other newbies will start
using them. Even without the conversion stuff. Anyone interested?

--
Regards
Christopher Lozinski
Check out my iPhone apps TextFaster and EmailFaster
http://textfaster.com
Expect a paradigm shift.
http://MyHDL.org
From: Jan D. <ja...@ja...> - 2012-04-26 07:27:10
On 04/26/2012 12:05 AM, Christopher Lozinski wrote:
> Anyhow, MyHDL does need the floating point modules. If I get them up
> and simulating correctly, then I am sure other newbies will start
> using them. Even without the conversion stuff. Anyone interested?

Wait a moment here, let's take a step back. You can perfectly well do
event-driven system simulations with MyHDL using Python floats. As I
keep saying, modeling is intended to be completely general. You can
perfectly well use floats or other Python types as the underlying
base type in a MyHDL signal. Today.

A meaningful way of working would be to get the system simulations to
work first, and then later on refine and use user-defined code etc.
to get it to convert, if necessary.

--
Jan Decaluwe - Resources bvba - http://www.jandecaluwe.com
Python as a HDL: http://www.myhdl.org
VHDL development, the modern way: http://www.sigasi.com
World-class digital design: http://www.easics.com
From: Christopher L. <loz...@fr...> - 2012-04-26 12:37:03
On 4/26/12 2:26 AM, Jan Decaluwe wrote:
> On 04/26/2012 12:05 AM, Christopher Lozinski wrote:
>> Anyhow, MyHDL does need the floating point modules. If I get them
>> up and simulating correctly, then I am sure other newbies will
>> start using them. Even without the conversion stuff. Anyone
>> interested?
>
> Wait a moment here, let's take a step back. You can perfectly well
> do event-driven system simulations with MyHDL using Python floats.
> As I keep saying, modeling is intended to be completely general. You
> can perfectly well use floats or other Python types as the
> underlying base type in a MyHDL signal. Today.

Thank you enormously. I did do a first draft of the code last night,
not yet interpreting, but it is a bit of a tangle. By just using
Python floats in the simulation, it will all get much easier. I can
simplify things hugely for the first release. Maybe that will also be
easier to export than a hierarchical representation.

Back to reading the source code. There is good documentation in the
spec directory of the source code.

--
Regards
Christopher Lozinski
Check out my iPhone apps TextFaster and EmailFaster
http://textfaster.com
Expect a paradigm shift.
http://MyHDL.org
From: Jan D. <ja...@ja...> - 2012-03-19 08:27:26
On 03/19/2012 01:19 AM, Bob Cunningham wrote:
> Would you please define what you mean when you repeatedly use the
> phrases "serious digital design" and "serious design" in your posts?
>
> Please be clear and concise. State what is and is not "serious
> digital design" according to your definition of the phrase. Try to
> make your point without using that phrase.
>
> You are throwing those phrases around as if they were self-obvious
> and self-justifying. They most certainly are not! What do they mean
> to you?
>
> What do you believe those phrases should mean to me?

But of course. Clarity and conciseness: every poster should keep it
in mind. BTW, asking the question once is enough.

With the word "serious" I refer to design work with a complexity
level that is such that

1) test bench modeling, high-level modeling and verification will
make up the bulk of the work;

2) for the synthesizable logic it is desirable to push the
abstraction level as high as possible for productivity reasons.

--
Jan Decaluwe - Resources bvba - http://www.jandecaluwe.com
Python as a HDL: http://www.myhdl.org
VHDL development, the modern way: http://www.sigasi.com
World-class digital design: http://www.easics.com
From: Jan D. <ja...@ja...> - 2012-03-19 12:30:37
On 03/19/2012 10:06 AM, Sébastien Bourdeauducq wrote:
> If you are talking about VHDL variables / blocking Verilog
> assignments that can be used to synthesize sequential code that
> "executes" in one cycle, then they are supported too (with the
> "variable" property of the Signal object).

That is what I call "horrible". These things are confusing enough,
even in VHDL, which makes a clear distinction between signals and
variables. MyHDL improves further on this by using a dedicated,
distinctive attribute assignment for Signal assignment.

Verilog is much worse, because it doesn't make the distinction
between signals and variables. But at least it uses two different
types of assignment. Unfortunately, this is not enough for most
Verilog designers to get the difference.

But Migen uses the same type of assignment for 2 things which
semantically could not be more different, and hides that difference
in an object constructor. No excuses here. This is horrible!

Apart from all this, and even more importantly, what is missing is
support for typical procedural modeling. Consider the following
combinatorial circuit in MyHDL:

def MsbDetector(pos, a):
    """Return the bit position of the msb."""

    @always_comb
    def logic():
        pos.next = 0
        for i in downrange(len(a)):
            if a[i] == 1:
                pos.next = i
                break

    return logic

I think it is hard to argue with the elegance and clarity of the
description. Moreover, it is parametrizable for any bit width by
default.

What would this look like in Migen? Some elaborate if-then-else
structure that explicitly lists all the cases. Moreover, the
structure size would depend on the actual bit width. So much for
elegance and parametrizability.

--
Jan Decaluwe - Resources bvba - http://www.jandecaluwe.com
Python as a HDL: http://www.myhdl.org
VHDL development, the modern way: http://www.sigasi.com
World-class digital design: http://www.easics.com
From: Jan C. <jan...@mu...> - 2012-03-19 13:09:24
On 19/03/12 12:30, Jan Decaluwe wrote:
...
> But Migen uses the same type of assignment for 2 things which
> semantically could not be more different, and hides that difference
> in an object constructor. No excuses here. This is horrible!
>
> Apart from all this, and even more importantly, what is missing is
> support for typical procedural modeling. Consider the following
> combinatorial circuit in MyHDL:
>
> def MsbDetector(pos, a):
>     """Return the bit position of the msb."""
>
>     @always_comb
>     def logic():
>         pos.next = 0
>         for i in downrange(len(a)):
>             if a[i] == 1:
>                 pos.next = i
>                 break
>
>     return logic

Thanks, these are the first two technical points in the discussion
which I have been able to understand.

Jan Coombs
From: Sébastien B. <seb...@mi...> - 2012-03-21 09:50:34
Hi,

On 03/19/2012 01:30 PM, Jan Decaluwe wrote:
>> If you are talking about VHDL variables / blocking Verilog
>> assignments that can be used to synthesize sequential code that
>> "executes" in one cycle, then they are supported too (with the
>> "variable" property of the Signal object).
>
> That is what I call "horrible". These things are confusing enough,
> even in VHDL, which makes a clear distinction between signals and
> variables. MyHDL improves further on this by using a dedicated,
> distinctive attribute assignment for Signal assignment.

Well, it's just a small detail, no need to make such a fuss about it.
What would you propose then? Replace the "variable" property simply
with the use of a different assignment? Or enforce that another
assignment method is used when the "variable" property is set?

> def MsbDetector(pos, a):
>     """Return the bit position of the msb."""
>
>     @always_comb
>     def logic():
>         pos.next = 0
>         for i in downrange(len(a)):
>             if a[i] == 1:
>                 pos.next = i
>                 break
>
>     return logic

(...)

> What would this look like in Migen?

comb += [If(a[i], pos.eq(i)) for i in downrange(a.bv.width)]

Notes:

1. the default 0 is implicit (reset value of a combinatorial signal)
2. you can build the pos signal with the right size using bits_for:
   pos = Signal(BV(bits_for(a.bv.width-1)))
3. we should add len() support for signals, thanks for the reminder :)
4. you can either assume the synthesizer will automagically build an
   optimized structure (it doesn't always), or use a bit more
   control, as in:
   http://www.ohwr.org/projects/tdc-core/repository/revisions/master/entry/core/tdc_lbc.vhd
   It's VHDL, but you could do the same with Migen too.

> So much for elegance

One line :)

> and parametrizability.

In this example, it's just as parametrizable as yours. In general,
Migen is more parametrizable than MyHDL. Let's have another simple
example: how would you parametrize the number of wait states
(removing them entirely when the parameter is 0) at different points
of an FSM?

> Well, no. Look at the test bench code versus Migen code and note
> that the modeling paradigm is entirely different. But in practice,
> today's high-level model that is part of the verification
> environment becomes tomorrow's synthesizable model and vice versa.

Following the same logic, you could say that all hardware and
software designs should be written using high-level languages such as
Python, Ruby or Lisp normally, and hope that one day some magical
synthesizer will make them fast and optimized on FPGA, ASIC and CPU.
Given that Lisp is still slow more than 50 years after its invention,
let me have my doubts about this position.

Or you can be pragmatic and do things like Migen.

> Very often, this even happens within the same project.

Can you give some examples?

Sébastien
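[Editor's note: for readers unfamiliar with Migen's bits_for helper
used in note 2 above, here is a rough pure-Python approximation of
what it computes for unsigned values. This is a sketch only, not
Migen's actual implementation, which also handles signed values.]

```python
# Approximation of bits_for(n): how many bits are needed to hold the
# unsigned value n (at least one bit, even for zero).

def bits_for(n):
    """Bit width sufficient to represent the unsigned value n."""
    return max(1, n.bit_length())

# A pos signal for an 8-bit vector must hold positions 0..7,
# so it needs bits_for(8 - 1) == 3 bits.
width = 8
assert bits_for(width - 1) == 3
```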
From: Christopher F. <chr...@gm...> - 2012-03-21 11:28:12
On 3/21/2012 4:43 AM, Sébastien Bourdeauducq wrote:
> Hi,
>
> On 03/19/2012 01:30 PM, Jan Decaluwe wrote:
>>> If you are talking about VHDL variables / blocking Verilog
>>> assignments that can be used to synthesize sequential code that
>>> "executes" in one cycle, then they are supported too (with the
>>> "variable" property of the Signal object).
>>
>> That is what I call "horrible". These things are confusing enough,
>> even in VHDL, which makes a clear distinction between signals and
>> variables. MyHDL improves further on this by using a dedicated,
>> distinctive attribute assignment for Signal assignment.
>
> Well, it's just a small detail, no need to make such a fuss about
> it. What would you propose then? Replace the "variable" property
> simply with the use of a different assignment? Or enforce that
> another assignment method is used when the "variable" property is
> set?

I am confused: why would you ask for guidance on a "small detail"?

>> def MsbDetector(pos, a):
>>     """Return the bit position of the msb."""
>>
>>     @always_comb
>>     def logic():
>>         pos.next = 0
>>         for i in downrange(len(a)):
>>             if a[i] == 1:
>>                 pos.next = i
>>                 break
>>
>>     return logic
>
> (...)
>
>> What would this look like in Migen?
>
> comb += [If(a[i], pos.eq(i)) for i in downrange(a.bv.width)]
>
<snip>
>> So much for elegance
>
> One line :)

Is not elegant.

>> and parametrizability.
>
> In this example, it's just as parametrizable as yours. In general,
> Migen is more parametrizable than MyHDL. Let's have another simple
> example: how would you parametrize the number of wait states
> (removing them entirely when the parameter is 0) at different
> points of an FSM?

Ah, your modus operandi. You give a hand-waving description without
the example code, putting the work on someone else to provide an
example.

>> Well, no. Look at the test bench code versus Migen code and note
>> that the modeling paradigm is entirely different. But in practice,
>> today's high-level model that is part of the verification
>> environment becomes tomorrow's synthesizable model and vice versa.
>
> Following the same logic, you could say that all hardware and
> software designs should be written using high-level languages such
> as Python, Ruby or Lisp normally, and hope that one day some
> magical synthesizer will make them fast and optimized on FPGA, ASIC
> and CPU. Given that Lisp is still slow more than 50 years after its
> invention, let me have my doubts about this position.
>
> Or you can be pragmatic and do things like Migen.

Practicality is your defense? You are taking a huge leap from
high-level model propagation to your "magical synthesizer"; I believe
you missed the point.

Regards,
Chris
From: Sébastien B. <seb...@mi...> - 2012-03-21 11:32:26
|
On 03/21/2012 12:27 PM, Christopher Felton wrote: > Ah, your modus operandi. You give a hand-waving description without > the example code, putting the work on someone else to provide an example. Oh, sorry. https://github.com/milkymist/milkymist-ng/blob/master/milkymist/asmicon/bankmachine.py#L196 |
From: Christopher F. <chr...@gm...> - 2012-03-21 11:36:57
|
On 3/21/2012 6:35 AM, Sébastien Bourdeauducq wrote: > On 03/21/2012 12:27 PM, Christopher Felton wrote: >> Ah, your modus operandi. You give a hand-waving description without >> the example code, putting the work on someone else to provide an example. > > Oh, sorry. > https://github.com/milkymist/milkymist-ng/blob/master/milkymist/asmicon/bankmachine.py#L196 > This is Python? I have never seen Python if-elif-else code written that way. Regards, Chris |
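For readers puzzled by the style in the linked file: expression-based HDLs typically build if/elif/else as nested data objects rather than Python statements, which is why such code reads unlike ordinary control flow. The toy below is NOT Migen's actual API — just a sketch of the general pattern of a statement-object DSL:

```python
class If:
    """Toy statement object: conditions and bodies stored as data,
    not executed as Python control flow (hypothetical, not Migen)."""

    def __init__(self, cond, *then):
        self.branches = [(cond, list(then))]
        self.otherwise = []

    def Elif(self, cond, *then):
        self.branches.append((cond, list(then)))
        return self  # chaining gives the unfamiliar If(...).Elif(...).Else(...) look

    def Else(self, *then):
        self.otherwise = list(then)
        return self


def evaluate(stmt, env):
    """Interpret the nested structure against a dict of condition values."""
    for cond, then in stmt.branches:
        if env[cond]:
            return then
    return stmt.otherwise


# The chained expression builds a tree; nothing "runs" until evaluated.
stmt = If("a", "x=1").Elif("b", "x=2").Else("x=3")
```

Because the whole construct is a Python expression, it can be generated programmatically (e.g. inside list comprehensions), which is the parametrizability being claimed upthread.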
From: Bob C. <fl...@gm...> - 2012-04-22 18:53:18
|
Thank you, Jan, for your careful and caring response. It is clear I haven't gone far enough down my own learning path to make well-founded criticisms of HDL tools or approaches. While I continue to closely monitor this list and developments in the general world of "alternative" HDLs, a recent job change has temporarily eliminated my free time to pursue learning HDLs: My FPGA development board is gathering dust on my workbench. Still, when I finally do resume my efforts, I hope to have a better path to pursue. Rather than attempt to respond to your points individually, please permit me instead to step back and restate my goals and describe how they have evolved since encountering MyHDL and Migen. As an embedded real-time engineer specializing primarily in software, but with proven skills in system design and high-level circuit design (mainly processor and peripheral chip selection and integration), my greatest goal is to minimize the amount of new software in the products I create. Statistically, nothing adds bugs to a system faster, and delays product shipping longer, than adding a single new line of code. The buggiest code tends to occur at the lowest levels: At the functional level, the error rate for assembler is a multiple of that for C, which in turn is a multiple of that for Python. It's not just about levels of abstraction, though that is certainly a vital factor. It also concerns the availability of proven code, and how easily that code can be integrated into the current project. Part of the issue is how easy or difficult it is to create portable and reusable code on any system; the rest concerns how to make use of that code. I love projects where most of my effort is spent writing "glue" code to integrate known-good libraries to create a new application. My work as an engineer then becomes finding that good code, and selecting from among the candidates the code that best meets my needs and will be easiest to integrate. 
That said, what I love most is writing code from scratch, implementing something important that has never existed before. But that's not what gets products out the door: Professionally, I must hold my own code authoring skills and desires at bay, and use the rest of my skills to assess what's best for the system and product as a whole. It's kind of schizophrenic: I ship better systems sooner when I keep my own coding skills as a last resort, and prioritize integrating the work of others. But I always yearn for projects where I get to do something truly new and innovative. I view the FPGA as an ideal canvas for an embedded engineer, where I can work more closely with the EEs while simultaneously using my system design and integration skills to further reduce the software needed to produce a product. The first FPGA skill I'd most like to have to be productive would be the ability to write glue between the FPGA and devices in the outside world, and between IP cores within the FPGA. I became very excited when I saw that Migen included the Wishbone interface as a pre-integrated macro: There are many cores over at OpenCores that use the Wishbone interface, but getting them to talk to each other required a level of skill that was way beyond my knowledge. Why would I need to know how to implement a Wishbone interface? What I need to know is how to design and test a system that uses components integrated with standard interfaces, and those skills are readily transferred from my embedded experience, where I have had to test and integrate many hardware systems and components. I like using logic analyzers and o'scopes far more than I like using software debuggers! I suppose this isn't really an HDL issue: It's much higher than that. I suppose it's more of a tool-level issue with language implications. 
When I finally do start to write my own complex circuits from scratch (something I greatly look forward to, and expect to have lots of fun doing), I'll want to use a language (or languages) appropriate to the task at hand, much as I use assembler, C and Python to create embedded software. However, identifying the most appropriate language can be difficult, and may even be a needless effort if the highest-level languages can "play nice" at the lowest level. Permit me to share a case in point: Several years ago I had to develop an instrument based on a complex sensor system that used inherently noisy sensors from which we needed to extract "soft" real-time performance. The statistics alone were quite daunting, and I decided to prototype the system in Python so I could use NumPy and related libraries to quickly develop the algorithms I needed. The prototype was a success, but it had one interesting characteristic: It met our performance spec when running on the desktop, and was only a factor of 2-3 slower than our minimum required performance spec when running on our intended target system. Surprising performance for a prototype in an interpreted language. I profiled the code and found that the Python numeric and statistical libraries were wicked-fast: They were more than capable of doing all the math needed well within our timing requirements. That meant the slow performance was due to the code I had written to interface to the sensors and prepare their data for use by the fast libraries, and passing data between those libraries. I first moved as much functionality as possible into the libraries, which yielded an immediate 25% speed improvement. Next, I stumbled across Psyco (these were pre-PyPy days), and used it to more than double the performance of the non-library code I had written. That left me tantalizingly close to meeting my timing requirements, but I was still missing them by 30%. 
I had never before considered using Python to implement a near-real-time embedded system. Not only that, I also had never shipped Linux in a delivered system: I had always been driven to minimize system power, which meant small processors, which in turn meant running no code that didn't directly support the end result, which in turn meant often running with no OS, or at most an RTOS. For this system I had a much more generous power envelope due to the high maximum weight my battery-powered instrument could have. If I went with a Linux platform, I'd need to double my power budget, which meant doubling the current rating and capacity of the batteries: Switching from NiMH to Li-ion permitted me to get the power I needed with only a modest increase in system weight. But how to obtain that last bit of performance I needed? I was already using the fastest CPU I could get that met our packaging requirements (no fan, no extensive or expensive passive heatsinks, which at the time meant a ULV Celeron). A final round of profiling indicated my interface code was spending lots of time reading and converting data: Each data element was causing individual library calls. Implementing larger buffers helped, but still didn't get me close to the time I needed: Servicing the hardware was the problem. The final speed boost came when the buffers were moved from software to hardware, so each interface access returned 1000x more data, a design change that also improved the timing between the individual sensors, which in turn simplified the statistics needed to process the data. Sorry for the length of that story, but that experience has affected all the work I've done since, and also affects my expectations when it comes to FPGA development. 
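The buffering change described in that story can be sketched abstractly: when every interface access carries a fixed overhead, returning a block of samples per access divides the access count by the block size. The simulated device below is purely illustrative (names and numbers are invented, not from the original system); it counts accesses rather than measuring time:

```python
class SimulatedDevice:
    """Hypothetical sensor interface where each read() models one
    costly software-hardware access, regardless of how much it returns."""

    def __init__(self, total_samples):
        self.samples = list(range(total_samples))
        self.accesses = 0

    def read(self, batch_size=1):
        self.accesses += 1
        out, self.samples = self.samples[:batch_size], self.samples[batch_size:]
        return out


def drain(dev, batch_size):
    """Pull all samples from the device in batches of batch_size."""
    data = []
    while True:
        chunk = dev.read(batch_size)
        if not chunk:
            break
        data.extend(chunk)
    return data


# Same data either way; the access count differs by roughly the batch size.
unbuffered = SimulatedDevice(4000)
drain(unbuffered, 1)       # one access per sample
buffered = SimulatedDevice(4000)
drain(buffered, 1000)      # hardware-side buffering: 1000 samples per access
```

Moving the batching into hardware is the same idea one level down: the fixed per-access cost is paid once per block instead of once per sample.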
I want my FPGA development to integrate nicely into the above process, to be a powerful tool that permits me not just to engineer a fully-specified circuit into a fully-specified target, but also to experiment with integrating different components, to move solutions or parts of the problem from one domain to another, to quickly make use of the work of others, and be able to create my own when needed. The key element that must be present to facilitate such flexibility is the ability to implement common aspects of interfaces across multiple domains, to switch interfaces while preserving functionality, to be able to work with gates, busses, processors, instructions, libraries, languages and whatever else is needed to permit a given slice of functionality to be implemented and accessed in its best and most effective domain. I want everything from simple FIFOs to complex processor interfaces to be available to me in multiple domains without having to create them from scratch each time. And THAT is what initially attracted me to MyHDL: If I can use Python to move software into hardware, and interface with it from the Python domain, why, I could become a system development god! And the advent of PyPy makes that approach even more tractable. I soon learned that such power is not the goal of MyHDL: The goal of MyHDL is to use Python to make an incrementally better flavor of Verilog or VHDL, a relatively small but significant evolutionary step compared to the transformative change I was dreaming of. I desire to climb up and skip between levels of abstraction and implementation domains that are almost orthogonal to what MyHDL pursues. MyHDL may prove to be a piece of the final process, but it does not, and perhaps cannot, encompass that process on its own. 
Hence my excitement with Migen: Even in its embryonic, incomplete, and flawed initial state, it clearly seeks to bridge abstractions in ways alien to MyHDL, to make interfaces an integral part of the package and process, rather than something to code to at a low level. In this respect, MyHDL feels more like C, and Migen aims to be Python with its powerful libraries being made available to higher-level code. Again, I'm only mapping from domains I know to a domain I hope to enter and in which I hope to become proficient and productive. I'm certain my mapping has flaws and gaps owing to my lack of general HDL knowledge. But I suspect this very lack may enable me to conceive of and pursue a path that may yield a greater whole. MyHDL and Migen both seem to me to be stepping stones in a stream I wish to see crossed by a 4-lane bridge. And arguments between and about them seem like arguments about C vs. Python: It's about the solution path and the power of the process, not about any particular tool! It's about how tools interact and combine, not about how or why they differ or overlap. And if Python/PyPy doesn't evolve more quickly, it may get devoured by Julia, an interpreted/JIT language which abstracts into areas Python presently supports badly (such as multiprocessing and coroutines), with speeds PyPy has yet to attain. It may be that the concepts embodied in both MyHDL and Migen could eventually see more effective and more flexible implementations in the Julia ecosystem. Python, too, is just one tool in a procession of tools over time. -BobC On 04/22/2012 03:25 AM, Jan Decaluwe wrote: > I still want to take the time to clarify my position > on the many issues raised in this post. > > On 03/17/2012 09:20 PM, Bob Cunningham wrote: >> On 03/16/2012 02:03 PM, Jan Decaluwe wrote: >>> My conclusion is that Migen found an easy target in you, as a >>> newbie, to confuse you. It made you think it can be used for >>> serious design work. >> Sorry, Jan. 
If I have to be "confused" to play with my FPGA, then so >> be it. I'm very "serious" about being able to play with my FPGA! >> >> Your statement has an obvious implicit context: To me, you are >> shouting, "MyHDL is for Serious Designers Only! Newbies and >> Pragmatists should Go Away!" >> >> If that's what you are saying, then please be direct: Don't attack >> Migen for being what it was intentionally created to be, or for being >> something MyHDL is not intended to be. Are you upset about Migen >> existing, or that there is an audience MyHDL can't help as well as >> Migen may be able to? > I am disturbed by the suggestion that my critique of Migen is > based on anything other than a purely technical assessment. > > Let me be clear. I don't like Mr. Bourdeauducq's attitude > one bit. But do you think that would be a reason for me to > ignore any good idea that he might come up with? Of course > not. I am not a masochist. > > It is quite simple. During my career, I have found that when > you think you have seen the worst in HDL-based design, it > always gets worse. To date, Migen is the worst that I have > seen. But to understand why I am saying this, you have to > be prepared to follow my technical arguments and to > engage in technical discussions. I have made a few starts, > but I certainly was not done yet. However, I see close to > zero enthusiasm to continue such discussions. > > I am therefore frustrated by the fact that I hear all kinds > of opinions and suggestions to "merge" but that whenever things > get a little technical then the "I am a beginner" umbrella opens. > > Migen is not my problem. It will disappear in the milky mist > of HDL history, just like the many HDLs based on the same > flawed paradigm. I am addressing it simply because misleading > posts about it appear on this newsgroup. > > What I am really targeting instead is the conventional wisdom in > mainstream HDL design, which often has it all wrong. 
> >> If you'd rather beginners like myself go >> elsewhere, just say so. > MyHDL welcomes beginners. It is the first item on "Why MyHDL": > > http://myhdl.org/doku.php/why#you_are_new_to_digital_hardware_design > > In fact, I put most of my hopes on beginners, as they have not > yet been brainwashed by the conventional wisdom. > >> Remember, Migen wasn't created to be useful to beginners: It was >> created by an experienced FPGA designer concerned about practicality >> and productivity for a specific Open Source hardware project. It > I am not impressed by "arguments from experience". The conventional > wisdom that I am targeting was created by experienced designers, > mostly Verilog ones. Experience can lead to conservatism and > can become a hindrance to clear thinking. > >> simply happened that some things became a bit clearer to me after >> looking at Migen and the problems it was created to address. > Was it because of Migen or simply because you were spending more > time on the problem? > >> Whatever Migen leaves out may have been what was getting in my way! > I find that strange. I understand that you have a lot of > experience with embedded software. Therefore, you must know > procedural software techniques very well. That is exactly what > Migen leaves out. What it leaves is low-level concurrency at the > statement level, which must be new to you. And now you suggest > that the obstacle is exactly what you are most familiar > with. Beats me. > >> I'm actually quite lazy: What is the *least* I need to know to make >> useful digital designs *now*? > No secrets here. The first sentence of "Why MyHDL" warns you: > "There's a lot to learn and it will be hard work". Therefore, if > you are intellectually lazy (not prepared to learn new things even > when they will save you lots of time and effort later on), MyHDL > or HDL-based design is not for you. 
> > MyHDL is for those who are lazy in the good engineering sense, > wanting to accomplish more with less effort eventually. > >> I'm a beginner: Though I'd love to someday be able to design >> circuits like Shakespeare wrote sonnets, I'd be more than happy today >> if I were able to work at the level of "Green Eggs and Ham", a true >> masterpiece written with an absolute minimum of linguistic >> complexity. > Come on, let's keep some perspective here. It's not *that* difficult > or complex either. And there is a cookbook that shows you the way. > >>> When Migen claims that the "event-driven" paradigm is too general, >>> what it really dumps is procedural support in your HDL descriptions >>> - the interesting stuff. >> What's "interesting" to you can be a frustrating block for a newbie. >> I hope to one day also be very interested in those aspects of MyHDL, >> but it seems to have little to do with what I want to get done today, > I don't understand. Your specification seems very extensive and > ambitious. It would seem that you have a big need for raising the > abstraction level as high as possible, and for an easy path to > strong verification. > >> which is to find the simplest path to get basic circuits out of my >> mind and into my FPGA. Then use them as building-blocks in my hobby >> projects. > There is a broad consensus about the "building blocks" paradigm > in hardware design. That is really not the issue. The issue is > what the abstraction level of the building blocks should be. > >> I am a MyHDL fan. Unfortunately, I simply remain unable to use MyHDL >> to accomplish my own immediate hobby needs. That does not indicate >> any flaw in MyHDL, merely the extent of my own limitations. Do not be >> surprised that I am interested in any tool that helps me circumvent >> those limitations! 
>> >> I actually *like* how Migen slices and dices the process of FPGA >> design: The parts that are left out are ones I doubt newbies like me >> would notice, much less care about, until confronted with unusually >> challenging designs. I suspect Sebastien would agree with much of >> your analysis, the proper response being: "So What?" > Suppose that I teach a class to newbies in embedded software design > based on assembler. Would any of the newbies, except for the > rare genius, miss the capabilities of, say, C? Does this prove that > teaching assembler was a good choice? > >> It's not about theoretical power or completeness: It's about barriers >> to entry. It's not about what I can do in 5 years, but about what I >> can do in 5 weeks. Migen is primarily about pragmatism and >> productivity, making powerful circuits quickly and easily, and far >> less about expert capabilities, theoretical purity or even >> consistency. > Again, I find this strange. I understand that you have not been > successful with MyHDL. However, as I understand it you have not > been successful with Migen either. So what is your defense based > upon? Of course, we are about 5 weeks further now :-) > > More to the point. > > Barriers to entry - ok, but what is the task? I told you that I > believe the main problem in HDL-based design is verification, and > how MyHDL (unlike Migen) helps you by the fact that you > can use the same modelling paradigm for high-level models and > test benches as for synthesizable logic. > > You seemed surprised, which I found surprising in turn. Is > it so different in software? Really, getting those gates into > an FPGA is the easy part. The difficult part is getting > them to work properly. > > You will have noticed that Mr. Bourdeauducq made an error in > the first simple "one-liner" circuit that I presented to him, > as if he wanted to prove my point. Of course, the reason is > not incompetence, but simply that he did not verify his > design. 
> > There is a pattern, however. Mr. Bourdeauducq cried foul because > I didn't accept his "simple patches". What he ignored, and > continued to ignore despite my insistence, is that they broke > MyHDL. Perhaps Mr. Bourdeauducq considers verification a "detail". > > Well, I don't. Verification is the problem. The reason why I > think the abstraction level of synthesizable logic should > be as high as possible, is because that leaves more time > for verification. > >> I seek tools that will help me do what I want to get done today, and >> so far Migen seems like it will be most useful. Tomorrow will likely >> require additional tools, and I *absolutely* expect (and want) MyHDL >> to be the first of those tools. It is not an either-or proposition: I >> want Migen *and* MyHDL. I realize MyHDL will take more time to >> master, and I'm willing to commit that time. But I also want to >> create something sooner, rather than later. And I believe that 100% >> of what I learn using Migen will later prove useful with MyHDL. I >> believe using Migen will keep me motivated toward learning MyHDL. > Sounds good, but I think it is cheap talk. > > Most of Migen's technical choices, starting with its basic paradigm, > are almost the opposite of MyHDL's. As a result, verification is not > addressed, and it forces you to think at an artificially low level > for synthesizable logic. What good can one learn from that? > >> Right next to me I have a Spartan 3E-500 that contains nothing of my >> own design. That must change! > Perhaps you are too ambitious. 
> > In your shoes, I would start as follows: > > * isolate a simple function out of your spec > * try to concentrate on what it does, instead of how it should be implemented > * write that behavior in a (clocked) MyHDL process or processes > * also describe it in high-level python, and use that in a unit-test to verify > * experiment with synthesis to see whether it works and the result is acceptable > * iterate with the synthesizable description as necessary > > |
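The workflow Jan outlines above can be illustrated at its "also describe it in high-level Python, and use that in a unit-test" step. The saturating counter below is a hypothetical example function, not something from the thread; the idea is that the eventual clocked MyHDL process would be checked cycle by cycle against a plain-Python reference model like this one:

```python
def saturating_counter(n_cycles, enable, max_count):
    """Reference model for a hypothetical saturating counter.

    Counts one per enabled clock cycle, saturating at max_count.
    `enable` is a function of the cycle index, standing in for the
    enable signal a testbench would drive. Returns the per-cycle
    trace of the count, ready to compare against a simulation.
    """
    count = 0
    trace = []
    for cycle in range(n_cycles):
        if enable(cycle) and count < max_count:
            count += 1
        trace.append(count)
    return trace


# Unit-test style check: enable on even cycles, saturate at 3.
trace = saturating_counter(8, lambda c: c % 2 == 0, 3)
```

Because the model is ordinary Python, the same stimulus can drive both the model and the synthesizable description, which is the verification reuse being argued for upthread.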
From: Jan D. <ja...@ja...> - 2012-04-23 07:55:02
|
Bob: Now I think we are getting somewhere. A couple of things: First, there is some confusion about what Migen exactly means. My fault partially. Apparently Migen packages both an HDL, FHDL, and some IP blocks. When I criticize Migen, I'm really addressing FHDL, because that's my interest in comparing with MyHDL. The problem is that it is not packaged separately. I don't think it's a good idea to package IP blocks together with an HDL, and I will not do that for MyHDL. It's a totally different, and much bigger and more diverse task. However, you do point to an important weakness of the MyHDL ecosystem: the lack of strong open source IP blocks. This is somewhat frustrating, because I think MyHDL is the ideal platform for such developments. There is no reason why it should be less powerful than Migen for this, to the contrary (because of the more powerful underlying HDL). I should add that I personally have no plans in this direction. You cannot do it all. The Migen developers apparently included exactly the IP that they need for their specific purposes. I hope to see MyHDL IP in a variety of domains, including ones that I know nothing about. The main conclusion from your post is that I should do a better job in upfront expectation management. The very next thing I'm going to do is to write a page "What MyHDL is not". Jan On 04/22/2012 08:53 PM, Bob Cunningham wrote: > Thank-you Jan for your careful and caring response. It is clear I > haven't gone far enough down my own learning path to make > well-founded criticisms of HDL tools or approaches. While I continue > to closely monitor this list and developments in the general world of > "alternative" HDLs, a recent job change has temporarily eliminated my > free time to pursue learning HDLs: My FPGA development board is > gathering dust on my workbench. Still, when I finally do resume my > efforts, I hope to have a better path to pursue. 
> > Rather than attempt to respond to your points individually, please > permit me instead to step back and restate my goals and describe how > they have evolved since encountering MyHDL and Migen. > > As an embedded real-time engineer specializing primarily in software, > but with proven skills in system design and high-level circuit design > (mainly processor and peripheral chip selection and integration), my > greatest goal is to minimize the amount of new software in the > products I create. Statistically, nothing adds bugs to a system > faster, and delays product shipping longer, than adding a single new > line of code. The buggiest code tends to occur at the lowest levels: > At the functional level, the error rate for assembler is a multiple > of that for C which in turn is a multiple of that for Python. > > It's not just about levels of abstraction, though that is certainly a > vital factor. It also concerns the availability of proven code, and > how easily that code can be integrated onto the current project. > Part of the issue is how easy or difficult it is to create portable > and reusable code on any system, the rest concerns how to make use of > that code. I love projects where most of my effort is spent writing > "glue" code to integrate known-good libraries to create a new > application. My work as an engineer then becomes finding that good > code, and selecting from among the candidates the code that best > meets my needs and will be easiest to integrate. > > That said, what I love most is writing code from scratch, > implementing something important that has never existed before. But > that's not what gets products out the door: Professionally, I must > hold my own code authoring skills and desires at bay, and use the > rest of my skills to assess what's best for the system and product as > a whole. 
It's kind of schizophrenic: I ship better systems sooner > when I keep my own coding skills as a last resort, and prioritize > integrating the work of others. But I always yearn for projects > where I get to do something truly new and innovative. > > I view the FPGA as an ideal canvas for an embedded engineer, where I > can work more closely with the EEs while simultaneously using using > my system design and integration skills to further reduce the > software needed to produce a product. > > The first FPGA skill I'd most like to have to be productive would be > the ability to write glue between the FPGA and devices in the outside > world, and between IP cores within the FPGA. I became very excited > when I saw that Migen included the Wishbone interface as a > pre-integrated macro: There are many cores over at OpenCores that use > the Wishbone interface, but getting them to talk to each other > required a level of skill that was way beyond my knowledge. > > Why would I need to know how to implement a Wishbone interface? What > I need to know is how to design and test a system that uses > components integrated with standard interfaces, and those skills are > readily transferred from my embedded experience, where I have had to > test and integrate many hardware systems and components. I like > using logic analyzers and o'scopes far more than I like using > software debuggers! > > I suppose this isn't really an HDL issue: It's much higher than > that. I suppose it's more of a tool-level issue with language > implications. > > When I finally do start to write my own complex circuits from scratch > (something I greatly look forward to, and expect to have lots of fun > doing), I'll want to use a language (or languages) that are > appropriate to the task at hand, much as I use assembler, C and > Python to create embedded software. 
> > However, identifying the most appropriate language can be difficult, > and may even be a needless effort if the highest-level languages can > "play nice" at the lowest level. Permit me to share a case in > point: > > Several years ago I had to develop an instrument based on a complex > sensor system that used inherently noisy sensors from which we needed > to extract "soft" real-time performance. The statistics alone were > quite daunting, and I decided to prototype the system in Python so I > could use Numpy and related libraries to quickly develop the > algorithms I needed. The prototype was a success, but it had one > interesting characteristic: It met our performance sped when running > on the desktop, and was only a factor of 2-3 slower than our minimum > required performance spec when running on our intended target system. > Surprising performance for a prototype in an interpreted language. > > I profiled the code and found that the Python numeric and statistical > libraries were wicked-fast: They were more than capable of doing all > the math needed well within our timing requirements. That meant the > slow performance was due to the code I had written to interface to > the sensors and prepare their data for use by the fast libraries, and > passing data between those libraries. I first moved as much > functionality as possible into the libraries, which yielded an > immediate 25% speed improvement. Next, I stumbled across Psyco > (these were pre-PyPy days), and used it to more than double the > performance of the non-library code I had written. > > That left me tantalizingly close to meeting my timing requirements, > but I was still missing them by 30%. I had never before considered > using Python to implement a near-real-time embedded system. 
Not only > that, I also had never shipped Linux in a delivered system: I had > always been driven to minimize system power, which meant small > processors, which in turn meant running no code that didn't directly > support the end result, which in turn meant often running with no OS, > or at most an RTOS. For this system I had a much more generous power > envelope due to the high maximum weight my battery-powered > instrument could have. If I went with a Linux platform, I'd need to > double my power budget, which meant doubling the current rating and > capacity of the batteries: Switching from NiMH to Li-ion permitted me > to get the power I needed with only a modest increase in system > weight. > > But how to obtain that last bit of performance I needed? I was > already using the fastest CPU I could get that met our packaging > requirements (no fan, no extensive or expensive passive heatsinks, > which at the time meant a ULV Celeron). A final round of profiling > indicated my interface code was spending lots of time reading and > converting data: Each data element was causing individual library > calls. Implementing larger buffers helped, but still didn't get me > close to the time I needed: Servicing the hardware was the problem. > The final speed boost came when the buffers were moved from software > to hardware, so each interface access returned 1000x more data, a > design change that also improved the timing between the individual > sensors, which in turn simplified the statistics needed to process > the data. > > Sorry for the length of that story, but that experience has affected > all the work I've done since, and also affects my expectations when > it comes to FPGA development. 
I want my FPGA development to > integrate nicely into the above process, to be a powerful tool that > permits me not just to engineer a fully-specified circuit into a > fully-specified target, but also to experiment with integrating > different components, to move solutions or parts of the problem from > one domain to another, to quickly make use of the work of others, and > be able to create my own when needed. > > The key element that must be present to facilitate such flexibility > is the ability to implement common aspects of interfaces across > multiple domains, to switch interfaces while preserving > functionality, to be able to work with gates, busses, processors, > instructions, libraries, languages and whatever else is needed to > permit a given slice of functionality to be implemented and accessed > in its best and most effective domain. I want everything from simple > FIFOs to complex processor interfaces to be available to me in > multiple domains without having to create them from scratch each > time. > > And THAT is what initially attracted me to MyHDL: If I can use > Python to move software into hardware, and interface with it from the > Python domain, why, I could become a system development god! And the > advent of PyPy makes that approach even more tractable. > > I soon learned that such power is not the goal of MyHDL: The goal of > MyHDL is to use Python to make a incrementally better flavor Verilog > or VHDL, a relatively small but significant evolutionary step > compared to the transformative change I was dreaming of. > > I desire to climb up and skip between levels of abstraction and > implementation domains that are almost orthogonal to what MyHDL > pursues. MyHDL may prove to be a piece of the final process, but it > does not, and perhaps cannot, encompass that process on its own. 
> > Hence my excitement with Migen: Even in its embryonic, incomplete, > and flawed initial state, it clearly seeks to bridge abstractions in > ways alien to MyHDL, to make interfaces an integral part of the > package and process, rather than something to code to at a low level. > In this respect, MyHDL feels more like C, and Migen aims to be Python > with its powerful libraries being made available to higher-level > code. > > Again, I'm only mapping from domains I know to a domain I hope to > enter and in which I hope to become proficient and productive. I'm > certain my mapping has flaws and gaps owing to my lack of general HDL > knowledge. But I suspect this very lack may enable me to conceive of > and pursue a path that may yield a greater whole. > > MyHDL and Migen both seem to me to be stepping stones in a stream I > wish to see crossed by a 4-lane bridge. And arguments between and > about them seem like arguments about C vs. Python: It's about the > solution path and the power of the process, not about any particular > tool! It's about how tools interact and combine, not about how or > why they differ or overlap. > > > And if Python/PyPy doesn't evolve more quickly, it may get devoured > by Julia, an interpreted/JIT language which abstracts into areas > Python presently supports badly (such as multiprocessing and > coroutines), with speeds PyPy has yet to attain. It may be that the > concepts embodied in both MyHDL and Migen could eventually see more > effective and more flexible implementations in the Julia ecosystem. > Python, too, is just one tool in a procession of tools over time. > > > -BobC > > > On 04/22/2012 03:25 AM, Jan Decaluwe wrote: >> I still want to take the time to clarify my position on the many >> issues raised in this post. >> >> On 03/17/2012 09:20 PM, Bob Cunningham wrote: >>> On 03/16/2012 02:03 PM, Jan Decaluwe wrote: >>>> My conclusion is that Migen found an easy target in you, as a >>>> newbie, to confuse you. 
It made you think it can be used for >>>> serious design work. >>> Sorry, Jan. If I have to be "confused" to play with my FPGA, >>> then so be it. I'm very "serious" about being able to play with >>> my FPGA! >>> >>> Your statement has an obvious implicit context: To me, you are >>> shouting, "MyHDL is for Serious Designers Only! Newbies and >>> Pragmatists should Go Away!" >>> >>> If that's what you are saying, then please be direct: Don't >>> attack Migen for being what it was intentionally created to be, >>> or for being something MyHDL is not intended to be. Are you >>> upset about Migen existing, or that there is an audience MyHDL >>> can't help as well as Migen may be able to? >> I am disturbed by the suggestion that my critique of Migen is based >> on anything other than a purely technical assessment. >> >> Let me be clear. I don't like Mr. Bourdeauducq's attitude one >> bit. But do you think that would be a reason for me to ignore any >> good idea that he might come up with? Of course not. I am not a >> masochist. >> >> It is quite simple. During my career, I have found that when you >> think you have seen the worst in HDL-based design, it always gets >> worse. To date, Migen is the worst that I have seen. But to >> understand why I am saying this, you have to be prepared to follow >> my technical arguments and to engage in technical discussions. I >> have made a few starts, but I certainly was not done yet. However, >> I see close to zero enthusiasm to continue such discussions. >> >> I am therefore frustrated by the fact that I hear all kinds of >> opinions and suggestions to "merge" but that whenever things get a >> little technical then the "I am a beginner" umbrella opens. >> >> Migen is not my problem. It will disappear in the milky mist of HDL >> history, just like the many HDLs based on the same flawed paradigm. >> I am addressing it simply because misleading posts about it appear >> on this newsgroup. 
>> >> What I am really targeting instead is the conventional wisdom in >> mainstream HDL design, which often has it all wrong. >> >>> If you'd rather beginners like myself go elsewhere, just say so. >> MyHDL welcomes beginners. It is the first item on "Why MyHDL": >> >> http://myhdl.org/doku.php/why#you_are_new_to_digital_hardware_design >> >> >> In fact, I put most of my hopes on beginners, as they have not >> yet been brainwashed by the conventional wisdom. >> >>> Remember, Migen wasn't created to be useful to beginners: It was >>> created by an experienced FPGA designer concerned about >>> practicality and productivity for a specific Open Source hardware >>> project. It >> I am not impressed by "arguments from experience". The >> conventional wisdom that I am targeting was created by experienced >> designers, mostly Verilog ones. Experience can lead to conservatism >> and can become a hindrance to clear thinking. >> >>> simply happened that some things became a bit clearer to me >>> after looking at Migen and the problems it was created to >>> address. >> Was it because of Migen or simply because you were spending more >> time on the problem? >> >>> Whatever Migen leaves out may have been what was getting in my >>> way! >> I find that strange. I understand that you have a lot of experience >> with embedded software. Therefore, you must know procedural >> software techniques very well. That is exactly what Migen leaves >> out. What it leaves you with is low-level concurrency at the statement >> level, which must be new to you. And now you suggest that the >> obstacle is exactly what you are most familiar with. Beats >> me. >> >>> I'm actually quite lazy: What is the *least* I need to know to >>> make useful digital designs *now*? >> No secrets here. The first sentence of "Why MyHDL" warns you: >> "There's a lot to learn and it will be hard work". 
Therefore, if >> you are intellectually lazy (not prepared to learn new things even >> when they will save you lots of time and effort later on), MyHDL or >> HDL-based design is not for you. >> >> MyHDL is for those who are lazy in the good engineering sense, >> wanting to accomplish more with less effort eventually. >> >>> I'm a beginner: Though I'd love to someday be able to design >>> circuits like Shakespeare wrote sonnets, I'd be more than happy >>> today if I were able to work at the level of "Green Eggs and >>> Ham", a true masterpiece written with an absolute minimum of >>> linguistic complexity. >> Come on, let's keep some perspective here. It's not *that* >> difficult or complex either. And there is a cookbook that shows you >> the way. >> >>>> When Migen claims that the "event-driven" paradigm is too >>>> general, what it really dumps is procedural support in your HDL >>>> descriptions - the interesting stuff. >>> What's "interesting" to you can be a frustrating block for a >>> newbie. I hope to one day also be very interested in those >>> aspects of MyHDL, but it seems to have little to do with what I >>> want to get done today, >> I don't understand. Your specification seems very extensive and >> ambitious. It would seem that you have a big need for raising the >> abstraction level as high as possible, and for an easy path to >> strong verification. >> >>> which is to find the simplest path to get basic circuits out of >>> my mind and into my FPGA. Then use them as building-blocks in >>> my hobby projects. >> There is a broad consensus about the "building blocks" paradigm in >> hardware design. That is really not the issue. The issue is what >> the abstraction level of the building blocks should be. >> >>> I am a MyHDL fan. Unfortunately, I simply remain unable to use >>> MyHDL to accomplish my own immediate hobby needs. That does not >>> indicate any flaw in MyHDL, merely the extent of my own limitations. 
>>> Do not be surprised that I am interested in any tool that helps >>> me circumvent those limitations! >>> >>> I actually *like* how Migen slices and dices the process of FPGA >>> design: The parts that are left out are ones I doubt newbies like >>> me would notice, much less care about, until confronted with >>> unusually challenging designs. I suspect Sebastien would agree >>> with much of your analysis, the proper response being: "So >>> What?" >> Suppose that I teach a class to newbies in embedded software >> design based on assembler. Would any of the newbies, except for >> the rare genius, miss the capabilities of, say, C? Does this prove >> that teaching assembler was a good choice? >> >>> It's not about theoretical power or completeness: It's about >>> barriers to entry. It's not about what I can do in 5 years, but >>> about what I can do in 5 weeks. Migen is primarily about >>> pragmatism and productivity, making powerful circuits quickly and >>> easily, and far less about expert capabilities, theoretical >>> purity or even consistency. >> Again, I find this strange. I understand that you have not been >> successful with MyHDL. However, as I understand it you have not >> been successful with Migen either. So what is your defense based >> upon? Of course, we are about 5 weeks further now :-) >> >> More to the point. >> >> Barriers to entry - ok, but what is the task? I told you that I >> believe the main problem in HDL-based design is verification, and >> how MyHDL (unlike Migen) helps you by the fact that you can use the >> same modelling paradigm for high-level models and test benches as >> for synthesizable logic. >> >> You seemed surprised, which I found surprising in turn. Is it so >> different in software? Really, getting those gates into an FPGA is >> the easy part. The difficult part is getting them to work >> properly. >> >> You will have noticed that Mr. 
Bourdeauducq made an error in the >> first simple "one-liner" circuit that I presented to him, as if he >> wanted to prove my point. Of course, the reason is not >> incompetence, but simply that he did not verify his design. >> >> There is a pattern however. Mr. Bourdeauducq cried foul because I >> didn't accept his "simple patches". What he ignored, and continued >> to ignore despite my insistence, is that they broke MyHDL. Perhaps >> Mr. Bourdeauducq considers verification a "detail". >> >> Well, I don't. Verification is the problem. The reason why I think >> the abstraction level of synthesizable logic should be as high as >> possible is because that leaves more time for verification. >> >>> I seek tools that will help me do what I want to get done today, >>> and so far Migen seems like it will be most useful. Tomorrow >>> will likely require additional tools, and I *absolutely* expect >>> (and want) MyHDL to be the first of those tools. It is not an >>> either-or proposition: I want Migen *and* MyHDL. I realize MyHDL >>> will take more time to master, and I'm willing to commit that >>> time. But I also want to create something sooner, rather than >>> later. And I believe that 100% of what I learn using Migen will >>> later prove useful with MyHDL. I believe using Migen will keep >>> me motivated toward learning MyHDL. >> Sounds good, but I think it is cheap talk. >> >> Most of Migen's technical choices, starting with its basic >> paradigm, are almost the opposite of MyHDL's. As a result, >> verification is not addressed, and it forces you to think at an >> artificially low level for synthesizable logic. What good can one >> learn from that? >> >>> Right next to me I have a Spartan 3E-500 that contains nothing of >>> my own design. That must change! >> Perhaps you are too ambitious. 
>> >> In your shoes, I would start as follows:
>>
>> * isolate a simple function out of your spec
>> * try to concentrate on what it does, instead of how it should be implemented
>> * write that behavior in a (clocked) MyHDL process or processes
>> * also describe it in high-level python, and use that in a unit-test to verify
>> * experiment with synthesis to see whether it works and the result is acceptable
>> * iterate with the synthesizable description as necessary
>>
> > ------------------------------------------------------------------------------ > > For Developers, A Lot Can Happen In A Second. > Boundary is the first to Know...and Tell You. Monitor Your > Applications in Ultra-Fine Resolution. Try it FREE! > http://p.sf.net/sfu/Boundary-d2dvs2 -- Jan Decaluwe - Resources bvba - http://www.jandecaluwe.com Python as a HDL: http://www.myhdl.org VHDL development, the modern way: http://www.sigasi.com World-class digital design: http://www.easics.com |
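[Editorial sketch] Jan's suggested flow can be illustrated in plain, tool-agnostic Python: describe the behavior twice, once at a high level and once in the clocked, register-by-register style that would later become a MyHDL process, and let a unit test check that the two agree. The saturating counter and all names below are invented for illustration; in real MyHDL the clocked version would live in a clocked process and then go through conversion and synthesis.

```python
import random

WIDTH = 4
MAX_COUNT = 2 ** WIDTH - 1

def sat_count_model(n_enabled):
    """High-level model: counter value after n enabled clock ticks."""
    return min(n_enabled, MAX_COUNT)

def sat_count_rtl(enables):
    """Clocked-style description: one register, updated once per cycle."""
    count = 0                   # the register
    for en in enables:          # each iteration stands for one clock edge
        if en and count < MAX_COUNT:
            count += 1
    return count

# Unit test: drive both descriptions with random stimulus and compare.
random.seed(0)
for _ in range(100):
    enables = [random.random() < 0.7 for _ in range(random.randrange(40))]
    assert sat_count_rtl(enables) == sat_count_model(sum(enables))
print("model and clocked description agree")
```

Only once the two descriptions agree would the clocked one be handed to synthesis, which is the iteration step the list above ends with.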
From: Tom D. <td...@di...> - 2012-04-24 13:58:38
|
Hi,

A couple of quick comments: We have had the discussion about being able to group signals. That would be a good thing to have; I would use it for complex numbers all the time. It only really matters at the top-level I/O, as conversion and simulation will choke. I pass around an object of a complex class to make internal connections and that works great, then break them out at the top level.

My other 2 cents is, for something as complex as a floating point operator, say multiply, I don't think you would have much luck overriding the operator as in:

c = a * b

I think you really want a module instantiation, since you want to be able to pass parameters along with the signals. Such as number of pipelines, type of rounding and so on. That would lead to a much more worthwhile library component.

Tom

On 04/24/2012 05:53 AM, Christopher Lozinski wrote: > Here is a MEP for floating point numbers. > > http://wiki.myhdlclass.com:8080/FloatingPoint > > The basic idea is to create a new signal type, float, out of sign, > mantissa and exponent signals. In MyHDL model the calculation using > python floating point operators with the appropriate delay. When > exporting it call an existing Verilog or VHDL library. The big change > is that MyHDL would need to understand hierarchical signals. Maybe it > is not that hard. When dealing with a signal in a sensitivity list, > MyHDL would first check if it were hierarchical; if it were, MyHDL > would add all of the sub signals to the sensitivity list. And from > there MyHDL could continue operating as before. > > If you are interested, I invite you to read the details in the MEP. > > http://wiki.myhdlclass.com:8080/FloatingPoint > > -- > Regards > Christopher Lozinski > > Check out my iPhone apps TextFaster and EmailFaster > http://textfaster.com > > Expect a paradigm shift. 
> http://MyHDL.org > > > ------------------------------------------------------------------------------ > Live Security Virtual Conference > Exclusive live event will cover all the ways today's security and > threat landscape has changed and how IT managers can respond. Discussions > will include endpoint security, mobile security and the latest in malware > threats. http://www.accelacomm.com/jaw/sfrnl04242012/114/50122263/ > > > _______________________________________________ > myhdl-list mailing list > myh...@li... > https://lists.sourceforge.net/lists/listinfo/myhdl-list |
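[Editorial sketch] Tom's trick of passing one object around for internal connections and breaking it out only at the top level might look like the following plain-Python sketch. The Sig stand-in, the ComplexSig class, and the flatten helper are hypothetical names for illustration, not MyHDL's actual API.

```python
class Sig:
    """Toy stand-in for an HDL signal holding a current value."""
    def __init__(self, val=0):
        self.val = val

class ComplexSig:
    """Group a real/imaginary signal pair, so internal wiring can pass
    one object around instead of two separate signals."""
    def __init__(self, re=0, im=0):
        self.re = Sig(re)
        self.im = Sig(im)

def flatten(name, csig):
    """Break the grouped signal out into individual top-level ports,
    as Tom suggests doing only at the top-level I/O."""
    return {name + "_re": csig.re, name + "_im": csig.im}

# Internal connection: a single object is wired between blocks.
a = ComplexSig(3, -2)

# Top-level break-out: one scalar port per component.
ports = flatten("a", a)
assert ports["a_re"].val == 3 and ports["a_im"].val == -2
```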
From: Christopher L. <loz...@fr...> - 2012-04-24 15:57:04
|
On 4/24/12 8:40 AM, Tom Dillon wrote: > My other 2 cents is, for something as complex as a floating point > operator, say multiply I don't think you would have much luck > overriding the operator as in: > > c = a * b > > I think you really want a module instantiation, since you want to be > able to pass parameters along with the signals. Such as number of > pipelines, type of rounding and so on. That would lead to a much more > worth while library component. Agreed. Sorry if my posting was not clear. I have edited the wiki, it now says. Implementing Floating Point Multiply Conceptually, given a binary sign, an intbv exponent, and a fixed point mantissa, it is not that hard to implement a floating point multiply. http://www.cs.umd.edu/class/sum2003/cmsc311/Notes/BinMath/multFloat.html But, that does look like a lot of work. And it is complex to figure out the pipelining. So my plan is to create MyHDL modules for floating point multiplication, division and addition/subtraction. Each such module will accept inputs of two floating point signals. Internally it will convert the signals into a python floating point number, and then in MyHDL simulation it will just use python floating point multiplication. * output = input1 * input2 Then the output signals will be set using the results from the python computation. Of course this will not work on export, so on export there will be a call to a Verilog library to do the hard work. And then pipelining also has to be represented. And delays have to be added. Is that better? That page now also has the creative commons license. -- Regards Christopher Lozinski Check out my iPhone apps TextFaster and EmailFaster http://textfaster.com Expect a paradigm shift. http://MyHDL.org |
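[Editorial sketch] The simulation-side behavior described above can be sketched with Python's own float arithmetic as the model: turn the input fields into Python floats, multiply, and decompose the product back into fields. The frexp-style field encoding and the 16-bit mantissa width are assumptions made for this example only; the MEP would pin down the real format, the pipelining, and the delays.

```python
import math

MANT_BITS = 16  # assumed mantissa width, for illustration only

def to_float(sign, exponent, mantissa):
    """(sign, exponent, mantissa) fields -> Python float."""
    frac = mantissa / (1 << MANT_BITS)            # fraction in [0, 1]
    return (-1.0 if sign else 1.0) * math.ldexp(frac, exponent)

def from_float(x):
    """Python float -> (sign, exponent, mantissa) in the same encoding."""
    sign = 1 if x < 0 else 0
    frac, exponent = math.frexp(abs(x))           # frac in [0.5, 1) or 0.0
    mantissa = int(round(frac * (1 << MANT_BITS)))
    return sign, exponent, mantissa

def fp_mult(a_fields, b_fields):
    """Simulation model of the multiply module: convert the input
    fields to floats, use Python's multiply, decompose the result."""
    return from_float(to_float(*a_fields) * to_float(*b_fields))

a = from_float(1.5)
b = from_float(-2.0)
assert abs(to_float(*fp_mult(a, b)) - (-3.0)) < 1e-3
```

On export, as the post says, a module like fp_mult would not be converted from this Python at all; it would be replaced by a call into an existing Verilog or VHDL library.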
From: Tom D. <td...@di...> - 2012-04-24 16:33:47
|
I guess I misunderstood the first time around. I just scanned your post. I thought you were talking about an IP module that could be used to make logic. I think that would be very useful, and could attract new users to MyHDL. I don't think it needs to be built into MyHDL, but could just be a package used with it.

When I read it over more, I think you are really talking about being able to model floating point in MyHDL, with no plans to convert it to logic. I think that is quite simple; really you are just talking about making conversions to/from MyHDL types to/from Python floating point numbers? I don't think any changes are needed in MyHDL to support this effort.

I think having IP available built with MyHDL would be a good way to attract new interest in MyHDL. Also a good way to showcase the power of MyHDL. That is really what I use it for, making parametric modules that are easy to reuse and easy to test.

Keep in mind that floating point modules are available from all FPGA vendors for free. So the effort to make them is really only worth it if you are targeting an ASIC. If you are targeting an ASIC, there is a good chance the extra effort will be used to make fixed point work, to save valuable area and power.

On 04/24/2012 10:56 AM, Christopher Lozinski wrote: > On 4/24/12 8:40 AM, Tom Dillon wrote: >> My other 2 cents is, for something as complex as a floating point >> operator, say multiply I don't think you would have much luck >> overriding the operator as in: >> >> c = a * b >> >> I think you really want a module instantiation, since you want to be >> able to pass parameters along with the signals. Such as number of >> pipelines, type of rounding and so on. That would lead to a much more >> worth while library component. > Agreed. Sorry if my posting was not clear. I have edited the wiki, > it now says. 
> > > Implementing Floating Point Multiply > > Conceptually, given a binary sign, an intbv exponent, and a fixed > point mantissa, it is not that hard to implement a floating point > multiply. > > http://www.cs.umd.edu/class/sum2003/cmsc311/Notes/BinMath/multFloat.html > > But, that does look like a lot of work. And it is complex to figure > out the pipelining. > > So my plan is to create MyHDL modules for floating point > multiplication, division and addition/subtraction. Each such module > will accept inputs of two floating point signals. Internally it will > convert the signals into a python floating point number, and then in > MyHDL simulation it will just use python floating point multiplication. > > * output = input1 * input2 > > Then the output signals will be set using the results from the python > computation. > > Of course this will not work on export, so on export there will be a > call to a Verilog library to do the hard work. > > And then pipelining also has to be represented. And delays have to be > added. > > Is that better? > > That page now also has the creative commons license. > > -- > Regards > Christopher Lozinski > > Check out my iPhone apps TextFaster and EmailFaster > http://textfaster.com > > Expect a paradigm shift. > http://MyHDL.org |
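[Editorial sketch] Tom's closing contrast with fixed point can be illustrated just as briefly: scale values into plain integers with a fixed number of fractional bits, so a multiply becomes a single integer multiply plus one shift, which is far cheaper in area and power than a floating-point unit. The Q8 format chosen here is an arbitrary assumption for the example.

```python
FRAC_BITS = 8  # Q8: 8 fractional bits, an arbitrary choice for the example

def to_fixed(x):
    """Quantize a real value to a fixed-point integer."""
    return int(round(x * (1 << FRAC_BITS)))

def fixed_mult(a, b):
    """Fixed-point multiply: integer product rescaled by one shift.
    (Python's >> floors toward negative infinity, a crude rounding.)"""
    return (a * b) >> FRAC_BITS

def to_real(f):
    """Back to a Python float, for checking the model."""
    return f / (1 << FRAC_BITS)

a = to_fixed(1.5)     # 384
b = to_fixed(-2.25)   # -576
assert to_real(fixed_mult(a, b)) == -3.375
```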
From: Christopher F. <chr...@gm...> - 2012-03-13 03:15:04
|
On 3/12/12 8:07 PM, Bob Cunningham wrote: > On 03/12/2012 02:53 PM, Jan Decaluwe wrote: >> On 03/12/2012 10:23 PM, Thomas Heller wrote: >>> On 12.03.2012 16:10, David Greenberg wrote: >>>> I like hearing about progress in the Python HDL front, whoever is >>>> doing it. I don't think it makes sense to fracture these already tiny >>>> communities. >>> +1 from me too. >>> >>> Thomas >>> >> What do you both mean? Obviously Migen "fractures" >> the community. Also, obviously they think they are making >> progress. Do you want to hear about it or not? > > From my perspective as an HDL newbie (but a 30+ year engineering veteran otherwise), major feature development in MyHDL appears to have stalled, and Migen provides important capabilities MyHDL either lacks or can implement only with cumbersome and fragile kluges or work-arounds. (At least, that's what they look like to me, with my newbie eyes and brain.) > There are some reasons for this. One, not many people are willing or able to contribute. Two, things happen; people have babies (no sleep I get) and other events happen in people's lives that can stall a project. MyHDL has no intent of stopping (sorry if I am speaking for others). I believe MyHDL would be more than happy to have people contribute. Sebastien's case is well documented in this mailing list. I don't think it was an unfair handling. It is very easy for outsiders to come and ask for new features (there is a similar thread on scipy-dev). Using a project without giving back, while constantly asking for the latest buzzword feature, can be taxing on the developers. I am guilty of this as well; I have used MyHDL extensively and I haven't given back a whole lot. I would like to discuss these "fragile kluges" or workarounds some more. As best I recall (I haven't reread the mailing-list) there are enhancements people would like but no "fragile" kludge. > MyHDL is more powerful overall, but Migen has attacked some specific problems it was designed to solve. 
Problems, IIRC, that were prompted by perceived weaknesses in MyHDL. As I understand it, Migen started as a rejected patch to MyHDL that was forced to find a life of its own. > The rejected patch has been discussed. What was rejected wasn't the idea but the contempt for the process that had been established. It isn't about a rejected patch; it is about philosophical differences. Constructive conversations about the process could be useful but no more about the rejected *broken* patch. I think the process is fair, efficient, and required to make a successful project. > I believe Sébastien continues to post here not to diminish MyHDL or its use in any way, but instead to show at least one way to more easily do things that are difficult in MyHDL. If you look at his many non-Migen posts, you will see Sébastien has proven himself to be a very capable contributor to this list. > Sorry but I totally disagree. Again, look back to the old posts. He is not interested in the technical discussions as he tries to claim. As you state, you have 30+ years as an Engineer. Technical conversations can be long and in depth and don't end with one party exclaiming "you are just ...". > My personal hope is that MyHDL and Migen will evolve toward one another, and eventually find ways to inter-operate, share features, or even merge capabilities. I fully expect MyHDL may one day make Migen obsolete, but that day does not seem to be coming any time soon. > This is where I get confused. Why is the burden on the MyHDL developers (Jan) to add the features and have the quality that people are used to with this project? What is it about the development process that people object to, such that they would rather see a separate project created? > Until the parties involved agree to work together toward shared goals, it may be best for Migen to start its own list. When Sébastien gets his list going, he should be permitted to post sign-up instructions here. 
> > However, I hope a separate list won't be necessary: When Sébastien has referred to his own Migen posts in other forums (such as MilkyMist), he has referred people to *this* *list*. If anything, he has actively worked to increase MyHDL awareness, not to diminish it, or compete with it, or fracture it. From this side it just doesn't seem like that. Some links to these conversations ... seeing is believing? I don't see how responding to a question on the MyHDL mailing-list with "you should checkout migen" before any other responses is not deliberate: fracturing, diminishing, and competing. > > To clear the air, perhaps we could have a Migen-MyHDL shoot-out, where Sébastien posts some of what Migen does best, and the MyHDL pros can show us how best to do it in MyHDL. The posts should be in the form of modules that can be integrated into other projects (MyHDL or Verilog or VHDL) for testing and comparison. I know it would prove to be extremely educational for me, and may serve us all to better understand the underlying issues. > Who has the time? And what does it prove? I simply have the opinion that it is rude to advertise and self-promote on a project you branched from. This is where this sucks: if Migen decided to branch for whatever reason, I don't care. It is their right to try something different, do it their way. Nothing wrong with branching and starting a new project. I wish them the best, go conquer the world. But I don't think the projects have similar goals. And instead of making things personal and dragging this out, it is best Migen goes on its own and stops exploiting this group. It might not be fair for me to make such a statement, Jan has built this project/group and others have contributed as well, but that is my opinion. 
> > > -BobC > > > ------------------------------------------------------------------------------ > Keep Your Developer Skills Current with LearnDevNow! > The most comprehensive online learning library for Microsoft developers > is just $99.99! Visual Studio, SharePoint, SQL - plus HTML5, CSS3, MVC3, > Metro Style Apps, more. Free future releases when you subscribe now! > http://p.sf.net/sfu/learndevnow-d2d |
From: Bob C. <fl...@gm...> - 2012-03-13 12:32:34
|
On 03/12/2012 08:14 PM, Christopher Felton wrote: > On 3/12/12 8:07 PM, Bob Cunningham wrote: >> Sébastien/Migen has to be doing something right: It got Jan back on this list! Yay!!! > You don't realize it but you are being a complete *ASS* with that statement. I am very aware that Jan took time away from the list under difficult personal circumstances, and my thoughts and best wishes have been with him every time I read this list. I'm saying I've missed Jan's presence here. Aside from occasional comments like the above, Chris has done a great job carrying the torch and keeping this list highly responsive. But nothing can replace the presence of MyHDL's creator and guiding light. I also find that Jan has a great way of providing context for his technical answers, which means a lot to me as a beginner, where 'why' often matters more to me than the details of 'what' and 'how'. I'm amazed Chris finds that my simple joy at seeing Jan's posts makes me an "ass". Back at you, buddy. -BobC |