user_id=97566
This is an interesting idea, but somewhat tricky. The
`timescale directive is defined in Verilog to have scope
that spans file boundaries, so it may not be clear where
such a default timescale would apply. If I can come up with
a reasonable set of rules, I think it's worth doing.
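To make the scoping issue concrete, here is a sketch (the file and module names are made up for illustration): a file that lacks its own `timescale simply inherits whatever directive was last seen in the files compiled before it, so it is not obvious whether a command-line default should step in here.
// a.v -- compiled first
`timescale 1ns / 1ps
module a;
endmodule
// b.v -- compiled second, with no `timescale of its own; under the
// standard rules it inherits 1ns / 1ps from a.v
module b;
  initial #5 $display("this #5 uses the inherited timescale");
endmodule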
user_id=97566
I've moved this bug report into "Feature Requests", where it really belongs.
user_id=595282
Intuitively (for me), a command-line switch for a timescale
would have the highest scope.
In other words,
iverilog --timescale=1ns/10ps *.v
would imply that all the files ending in .v would have a
timescale of 1ns/10ps.
I would use this type of timescale override to get rid of
the problems encountered when I have a project with hundreds
of Verilog modules from different projects, where designers felt
the need to include a `timescale directive (with a different
timescale, of course) in each file.
I don't care about the `timescale directives in any one
file; I just want to use the compile-wide timescale
specified in --timescale.
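As a sketch of that situation (file and module names are invented), each imported file pins a different timescale, and the proposed switch would simply override all of them:
// ip_core1.v -- from one project
`timescale 1ns / 10ps
module core1;
endmodule
// ip_core2.v -- from another project
`timescale 100ps / 1ps
module core2;
endmodule
// with the proposed override,
//   iverilog --timescale=1ns/10ps ip_core1.v ip_core2.v
// would compile both cores with 1ns/10ps, ignoring the directives above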
user_id=1651735
> would imply that all the files ending in .v would have
> a timescale of 1ns/10ps.
This is most certainly not what we want! If I design a model using a particular timescale and you change the timescale, the model is no longer valid. To me the initial proposal to provide a default is the best solution. The question is: do we provide a default at the beginning of every file and let any following `timescale override it, or do we only provide a default timescale for files that do not have a `timescale directive at all? To me the former seems right, but I need to think on this a bit more.
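A sketch of where the two readings differ (the file and module names are invented); the interesting case is a file whose first `timescale appears only partway through:
// partial_ts.v -- the first module appears before any `timescale in this file
module early;
  // "default at the start of every file": this module gets the default
  // "default only for files with no `timescale at all": this module
  // inherits whatever was in effect at the end of the previous file
endmodule
`timescale 1us / 1ns
module late;
  // 1us / 1ns under either reading
endmodule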
My proposal for this is that we add a command line/command file flag that can be used to set the default simulation units and precision. The rest of the time constructs will work exactly as expected/specified. This flag will only be used to set the default value, or the value after a `resetall. My recent patch that added `resetall provided hooks for this. I have the basic flow worked out. All that is needed now is some time, plus waiting for the command line/file parameter passing code to be finished, since any work we would do to add this now would likely conflict with that code.
It should not be too long before we have something implemented.
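A sketch of the intended behaviour (the flag itself is not decided yet, so this is only illustrative):
`timescale 10ns / 1ns
module explicit_ts;
  // 10ns / 1ns from the directive above; the new flag would not affect this
endmodule
`resetall
module after_reset;
  // `resetall clears the directive, so under this proposal the module
  // would pick up the default units/precision given on the command line/file
endmodule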
I have submitted a patch to the patch tracker that adds +timescale support to the command file. It can be used to set the default time units and precision for a simulation. We need to decide on an appropriate command line flag before it can be added there. This is documented in the manual page and should work with either the 0.9 or the development branch. Steve will decide whether this gets applied to V0.9 or not.
Substitution is run on this string so the value can come from an environment variable.
I will not close this report until we have the command line support added.
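For example, a command file passed with -f might look roughly like this; the exact +timescale spelling here is an assumption based on the +key+value+ form of the other command file entries, so check the manual page from the patch for the real syntax:
# cmdfile.txt, used as: iverilog -f cmdfile.txt
+timescale+1ns/1ps
top.v
tb.v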
I think we can skip command line support for this. Having it in the command file is sufficient, I think.
Actually I disagree. The reason I am looking at adding getopt_long() to Icarus is to support this (e.g. --timescale). If I weren't spending every free moment working on the vlog95 converter, this would be finished by now. I have separated the required files from glibc, but I have not taken the time to integrate them into the driver program or add the new long options. I believe we need to map out a full complement of long options, but ones like --timescale, --define, etc. are obviously the correct choice.
I'm going to reopen this, but if you disagree with my thoughts on this we can discuss it more.