The Delta Cycle logic is actually quite similar to functional reactive programming. It separates how a value changes from when a process responds to that change.
VHDL had this figured out as early as 1987. I spent many years writing Verilog test benches and chasing numerous race conditions; those types of bugs simply don't exist in VHDL.
The Verilog rules—using non-blocking assignments for sequential logic and blocking assignments for combinational logic—fail as soon as the scenario becomes slightly complex. Verilog is suitable when you already have the circuit in your head and just need to write it down quickly. In contrast, VHDL forces you to think about concurrent processes in the correct way. While the former is faster to write, the latter is the correct approach.
Even though SystemVerilog added some patches, the underlying execution model still has inherent race conditions.
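As a minimal sketch of the kind of race I mean (made-up module name, zero-delay simulation semantics assumed):

    // Hypothetical example: two flops written with blocking assignments.
    // Whether b captures the old or the new value of a depends on which
    // always block the simulator happens to schedule first at the edge.
    module race_demo (input clk, input d, output reg b);
      reg a;
      always @(posedge clk) a = d;   // blocking: a updates immediately
      always @(posedge clk) b = a;   // race: may read old or new a
    endmodule

    // With non-blocking assignments (a <= d; b <= a;) both right-hand
    // sides sample pre-edge values, so the outcome is deterministic.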
I used to be a huge VHDL proponent: talking about the delta cycle stuff, giving VHDL classes at work to new college grads, and so on. And then I moved to the West Coast and was forced to start using Verilog.
And in the 21 years since, I’ve never once run into an actual simulation determinism issue.
It’s not bad to have a strict simulation model, but if some very basic coding style rules are followed (and everybody follows them), it’s just not a problem.
I don’t agree at all with the statement that Verilog fails when things become too complex. The world’s most complex chips are built with it. If Verilog really couldn’t be used to design chips reliably, that would not be the case.
Anyway, not really relevant, but this all reminds me of the famous Verilog vs VHDL contest of 1997: https://danluu.com/verilog-vs-vhdl/
On a practical level, you're right: most of my team's work is done in Verilog.
That being said, I still have a preference for the VHDL simulation model. A design that builds correctness directly into the language structure is inherently more elegant than one that relies on coding conventions to constrain behavior.
My memory is definitely rusty on this, but you can easily construct cases where the VHDL delta cycle model creates problems where it doesn’t for Verilog.
I remember inserting clock signal assignments in VHDL to get a balanced delta cycle clock tree. In Verilog, that all simply gets flattened.
I can describe the VHDL delta cycle model pretty well, and I can’t for Verilog, yet the Verilog model has given me fewer issues in practice.
As for elegance: I can’t stand the verboseness of VHDL anymore. :-)
The question for me is: where do I capture and describe the physical reality the model stands for? A simulation model can be very elegant, but does it represent how physical things really behave? Can we even expect to do that at RTL, or further down the design flow? As the name suggests, we are talking about transferring data between registers. At the RTL level, that is what I can expect to describe.
At the end of the day, what I write will become an electrical circuit, in an FPGA or an ASIC (or both). Doing the exact modelling with wire delays, capacitance, crosstalk, and cell behavior too early makes it impossible to simulate fast enough to iterate. So we need a more idealized world, while keeping in mind that (1) it is an idealized world and (2) sooner or later the model’s rubber will meet the road.
To me, Verilog and SystemVerilog allow me to do this efficiently. Warts and all.
Oh, and also: where in my toolchain is my VHDL model translated/transformed into Verilog? How good is that translation? How much does the dual licensing cost?
Then there are things like mixed-language simulation, formal verification between a Verilog netlist and RTL in Verilog, and mapping to cell libraries in Verilog. What about integration of IP cores written in SystemVerilog with your model?
Are the tools for VHDL as well tested as those for Verilog? How big is the VHDL team at the tool vendor, library vendor, IP vendor, and fab vendor compared to the Verilog/SV team? Can I expect the same support as a VHDL user as a Verilog user gets? How much money does a vendor earn from VHDL customers compared to Verilog/SV customers? How easy is it to find employees with VHDL experience?
VHDL may be a very nice language for simulation. But the engineering and business side is messy, and dev time and money can't be ignored. Getting things done as fast and as cheaply as possible while still meeting a lot of functional and business requirements is what we as engineers are responsible for. Does VHDL make that easier or not?
> where in my toolchain is my VHDL model translated/transformed into Verilog?
It's not? Why would it?
As much as I like Verilog, VHDL is a first class RTL language just like Verilog. I've done plenty of chips that contain both VHDL and Verilog. They both translate directly to gate level.
These days, most EDA tools use Verific parser and elaborator front-ends. The specific tool magic happens after that and that API is language agnostic.
> How easy is it to find employees with VHDL experience?
On the East Coast and in Europe: much easier than finding employees with Verilog experience. (At least that was the case 20 years ago, I have no clue how it is today.)
One thing that has changed a lot is that SystemVerilog is now the general language of choice for verification, which helps give (System)Verilog an edge for RTL design too.
Involved in FPGA and ASIC projects since 1997. Predominantly in Europe, nowadays more Asia and some in the US. Since ~2010 I have only seen VHDL in small chips targeting only FPGAs, and in government-heavy projects like defence and space. Nowadays these are also by and large SV. The ratio is something like one VHDL project for every 20 Verilog/SV projects. They teach VHDL at universities, and then people get to experience SV as soon as they enter the market.
Typical issues are still as given before. Many small IP vendors, especially in communications and networking, use, understand, and support only SV. I agree that SV for verification is a big driver.
the geographic constraint is probably the real answer to "which is better" for most people. you learn what your team uses, what your local jobs demand. theoretical elegance matters less than "can i get hired next month"
Over the years I have run Altera, Lattice, and Xilinx tools... and almost all reasonably complex projects were done in Verilog. If I recall correctly, Xilinx fully integrated its Synopsys export workflow a few years back, but I'm not sure where that went after the mergers.
The Amaranth HDL python project does look fun =3
https://github.com/amaranth-lang/amaranth
This actually sounds a bit like a C/C++ argument. Roughly: Yes, you can easily write incorrect code but when some basic coding conventions are followed, UAF/double free/buffer overflows/... are just not a problem. After all, some of the world's most complex software is built with C / C++. If you couldn't write software reliably with C / C++, that could never be the case.
I.e. just because teams manage to do something with a tool does not mean the tool didn't impede (or vice versa, enable) the result. It just says that it's possible. A qualitative comparison with other tools cannot be established on that basis.
I'm a long-time Verilog user (30+ years, a dozen or so tapeouts) and have even written a couple of compilers, so I'm intimate with the gory details of event scheduling.
In the early days, some people depended on how the original Verilog interpreter ordered events. It was a silly thing: models would only run on one simulator, which caused lots of angst.
'<=' assignment fixed a lot of these problems: using it correctly means you can model synchronous logic without caring about event ordering (at the cost of an extra copy and an extra event, which can mostly be optimised away by a compiler).
In combination, 'always @(*)' with '=', plus 'assign', give you reliable combinatorial logic.
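A minimal sketch of those two idioms together (made-up module, nothing fancy):

    // Non-blocking for the flop, blocking inside always @(*) for the
    // combinational decode. Process ordering within a timestep no
    // longer matters.
    module count_and_decode (
      input  wire       clk,
      input  wire       rst_n,
      output reg  [1:0] count,
      output reg  [3:0] onehot
    );
      // Sequential: '<=' samples everything as of the clock edge.
      always @(posedge clk or negedge rst_n)
        if (!rst_n) count <= 2'd0;
        else        count <= count + 2'd1;

      // Combinational: '=' and a full case, so no latch is inferred.
      always @(*)
        case (count)
          2'd0:    onehot = 4'b0001;
          2'd1:    onehot = 4'b0010;
          2'd2:    onehot = 4'b0100;
          default: onehot = 4'b1000;
        endcase
    endmodule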
In real-world logic a lot of event ordering is non-deterministic: one signal can appear before or after another depending on temperature. All in all, it's best not to make the design depend on ordering if you possibly can. Do it right and you don't care about event ordering: let your combinatorial circuits waggle around as their inputs change and catch the result in flops synchronously.
IMHO Verilog's main problems are that it: a) mixes flops and wires in a confusing way, and b) if you step outside the synthesisable subset, lets you do things that do depend on event ordering and can get you into trouble (but you need that sometimes to build test benches).
Naively, as a West Coast Verilog person, VHDL delta cycles seem like a nice idea, but not what actual circuits are doing by default. The beauty and the terror of Verilog is the complete, unconstrained parallel nature of its default: it all evaluates at t=0 until you add clocks and state via registers. VHDL seems to make it too easy to create latches and other abominations. (I am probably wrong, at least partially.)
AFAIK, creating latches is just as easy in Verilog as in VHDL. They use the same model to determine when to create one.
But with a solid design flow (which should include linting tools like Spyglass for both VHDL and Verilog), it’s not a major concern.
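A minimal sketch of how the latch shows up (made-up modules; the same rule applies to a VHDL process):

    // No else branch: q must hold its value when en is low, so
    // synthesis infers a level-sensitive latch.
    module latch_demo (input en, input d, output reg q);
      always @(*)
        if (en) q = d;
    endmodule

    // Latch-free version: q is assigned on every path, so this is
    // pure combinational logic.
    module no_latch_demo (input en, input d, output reg q);
      always @(*)
        if (en) q = d;
        else    q = 1'b0;
    endmodule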
Verilog gives you enough rope. Once the design gets past toy size, you spend time chasing sim vs synthesis mismatches because the language leaves ordering loose in places where humans read intent into source order.
VHDL's delta cycles are weird, and there are edge cases there too, but the extra ceremony works more like a childproof cap than a crown jewel.
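A minimal sketch of the kind of mismatch meant above (made-up module): a stale sensitivity list, legal in Verilog and silently ignored by synthesis.

    // Simulation only re-evaluates y when a changes, so y goes stale
    // when only b toggles. Synthesis ignores the sensitivity list and
    // builds a plain AND gate, so gate-level behaviour diverges from
    // the RTL simulation.
    module mismatch_demo (input a, input b, output reg y);
      always @(a)        // should be always @(*) or always @(a or b)
        y = a & b;
    endmodule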
Reminds me a lot of "Logical Execution Time" and the work of Edward Lee ("The Problem With Threads") for a software equivalent. Determinism needs separation of computation from communication.
The real question is: why do we even need this? Why don't VHDL and Verilog just simulate what hardware does? Real hardware doesn't have any delta cycles or determinism issues due to scheduling. Same thing with sensitivity lists (yes, we have */all now, so that's basically solved), but why design it so that it's easy to shoot yourself in the foot?
What do you mean by simulate? Do you want the language to be aware of the temperature of the silicon? Because I can build you circuits whose behaviour changes with the temperature of the silicon. Essentially, all these languages are not timing-aware. So you design your circuit with combinatorial logic and a clock, and then hope (pray) that your compiler makes it meet timing.
The fundamental problem is that we're trying to create a simulation model of real hardware that is (a) realistic enough to tell us something reasonable about how to expect the hardware to behave and (b) computationally efficient enough to tell us about (a) in a reasonable period of time.
> Why don't VHDL and Verilog just simulate what hardware does?
Real hardware has hold violations. If you get your delta cycles wrong, that's exactly what you get in VHDL...
They're both modeling languages. They can model high-level RTL or gate level, and those models can behave very differently if you're not careful. "Just simulating what the hardware does" is itself an ambiguous statement: sometimes you want one model, sometimes the other.
Draw yourself an SR latch and try simulating it. Or a circuit known as a "pulse generator".
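A minimal sketch of why that breaks a zero-delay simulation (made-up module):

    // Cross-coupled NOR gates. Release s and r simultaneously from the
    // forbidden s=r=1 state (where q=qn=0) and q/qn toggle forever
    // within the same timestep: the event loop never converges. Real
    // silicon settles one way or the other based on analog mismatch
    // that this model doesn't capture.
    module sr_latch (input s, input r, output q, output qn);
      assign q  = ~(r | qn);
      assign qn = ~(s | q);
    endmodule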
Please stop bickering about Verilog vs VHDL: if you use NBAs, the scheduler works exactly the same in modern simulators. There is no crown jewel in VHDL anymore. Also, the type system is annoying. It's just in your way, not helping at all.
Sounds like a reachability problem in Petri nets to me?