I agree, it is an interesting idea. I suspect that just having "Haskell" got the link a lot of upvotes :-) As stated on the "Why CλaSH" page, the advantage is obvious for combinational circuits. But it doesn't seem to help much for synchronous logic (which arguably represents the majority of hardware designs). You'll still be writing everything as "how to update register X in state S".
Helpful explanation, thanks! Looking at this I have a very specific question: from a practical, getting-work-done point of view, is this better than what we have, or just different from what we have? If the answer is "different" I don't mind, but it will hinder my adoption ;-) I guess this is why I want to see a side-by-side comparison of real hardware constructs, the things I use regularly. It would aid greatly in understanding the (potential) benefit of expressing things these ways.
Hard to say, it depends on what you consider better. It certainly is more concise than the equivalent Verilog (just like Haskell is more concise than pretty much any language I know). This seems especially true when you want to describe repetitive structures (such as their FIR filter).
CλaSH also has a much better type system than Verilog (again, thanks to Haskell), but if you wanted a good type system when describing hardware, you might as well just switch to VHDL ^^
My concern is with the description of state machines. You need to specify whether you want a Mealy or a Moore machine, something that is usually implicit. And you're still describing the transfer function between states; CλaSH does not seem to let you describe your program in a structured way (such as "loop until x becomes true", "wait for 3 cycles", "read z", "while z > 0 decrement z", etc.)
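For what it's worth, the "transfer function between states" style boils down to something like the following plain-Haskell sketch of Mealy-machine semantics. This models signals as plain lists and is only an illustration of the idea, not the actual clash-prelude `mealy` (which works over `Signal`); the `acc` accumulator is a made-up example.

```haskell
-- Mealy machine semantics over lists: the transfer function takes the
-- current state and an input, and yields the next state plus an output.
mealy :: (s -> i -> (s, o)) -> s -> [i] -> [o]
mealy f s0 (x:xs) = let (s1, y) = f s0 x in y : mealy f s1 xs
mealy _ _  []     = []

-- Illustrative transfer function: a running-sum accumulator whose
-- output is the updated state.
acc :: Int -> Int -> (Int, Int)
acc s x = (s + x, s + x)
```

So `mealy acc 0 [1,2,3]` yields `[1,3,6]`: everything is expressed as "given state S and input I, what is the next state" — exactly the register-update style being criticized here, with no built-in notion of "wait 3 cycles" or similar control flow.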
Well, when I used VHDL back in college, I noticed that it had a really hard time with math. For example, it could do lookup tables all day (basically switch statements), but if you tried to encode that logic as the kind of math we're used to in a C-based language, where A = B (insert operator here) C, it fell down hard: the circuit would be so unstable that it would only run a few cycles before spinning off into some exceptional state that was nothing like what we expected.
I think that's because humans have a hard time considering the ramifications of things like boundary conditions and edge cases with respect to types. We can visualize one register being added to another, but we can't intuitively extrapolate what happens when one is signed and one is unsigned, or their widths differ, or one is floating point, etc. VHDL doesn't handle all of these edge cases very well (for one thing, they are hard!); it just does exactly what it's told. That often flies in the face of intuition, once we've analyzed the circuit and seen how much we underestimated the complexity of what we were asking. In other words, elegant math doesn't always translate to simple circuits, and vice versa. So it really needs a meta-language that can grapple with these subtle nuances and compile to VHDL without a lot of friction.
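This is one place where a Haskell-family type system arguably helps: you can't mix widths or signedness silently. As a small plain-Haskell illustration (using `Data.Int`/`Data.Word` fixed-width types rather than CλaSH's `Signed`/`Unsigned`, but the principle is the same; `addMixed` is a made-up name):

```haskell
import Data.Int  (Int8, Int16)
import Data.Word (Word8)

-- Adding a signed 8-bit value to an unsigned 8-bit value: the compiler
-- rejects `a + b` outright because the types differ, so the designer
-- must state the intended widening explicitly. Converting both operands
-- to Int16 first makes the result exact for all inputs.
addMixed :: Int8 -> Word8 -> Int16
addMixed a b = fromIntegral a + fromIntegral b
```

Here `addMixed (-1) 255` is `254`, whereas a naive same-width reinterpretation (e.g. `fromIntegral (255 :: Word8) :: Int8`, which is `-1`) is exactly the kind of intuition-defying behavior described above. The type checker forces that decision to be made visibly rather than implicitly.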
Probably what's going to happen is we'll see DSP logic (and limited subsets of it like GPU shaders/OpenCL/CUDA) and VHDL/Verilog merge into a functional concurrent language that can cover all of it. It won't be as explicit as Rust, because it will infer what the user is after while allowing defaults to be overridden. It won't have opaque syntax either, like most functional languages today. I'm thinking it will probably look more like MATLAB/Octave but have access to some of the more concise notation of Mathematica. So think Excel, except with cells arranged arbitrarily in some N-dimensional space rather than 2D, where we can specify formulas on groups of cells rather than individually, in whatever language we desire, compiled to Lisp and either run on distributed commodity hardware or translated to a hardware description language. CλaSH probably isn't it, but its approach and open source license are certainly a start.
Realized I didn't answer the question: different, yes, but probably not different enough to be compelling for mainstream use at this point. Without having ever used it, I have concerns that circuits will still fall down or take up gratuitous chip area, because handling the edge cases is one of the more complex problems to solve, and I'm not convinced that functional programming alone is enough.
you write: counter = s where s = register 0 (s + 1)
per http://hackage.haskell.org/package/clash-prelude-0.7.5/docs/...
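A rough plain-Haskell model of what that one-liner means, treating signals as infinite lists (this `register` is a stand-in for the clash-prelude function of the same name, which delays a `Signal` by one cycle with a reset value; it is not the real implementation):

```haskell
-- register r s delays the stream s by one cycle, emitting the
-- reset value r in the first cycle.
register :: a -> [a] -> [a]
register r s = r : s

-- The counter feeds its own delayed output, plus one, back into the
-- register: 0, 1, 2, 3, ... (lazy evaluation ties the feedback knot).
counter :: [Int]
counter = s where s = register 0 (map (+ 1) s)
```

So `take 5 counter` is `[0,1,2,3,4]`: the recursive `where` binding is the feedback loop, and `register` is the flip-flop. It's concise, but note it's still "how to update register X", just written as a stream equation.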