<lekernel>
this works with python 2, but not with 3 ...
<wpwrak>
python likes its "self" :)
<wpwrak>
do you actually use python for more than as a glue and for infrastructure here ?
<wpwrak>
by shoehorning all this into python, you get a very cluttered syntax. tons of parentheses, etc., which aren't really necessary
<lekernel>
it's still not as bad as VHDL :-)
<wpwrak>
would it be possible to do worse ? ;-)
<lekernel>
the main source of clutter imo is the "self."
<lekernel>
parentheses are still ok, especially with indentation
<lekernel>
we should have "self" only for signals that are accessed outside the object... internal signals should be on the get_fragment local scope
<wpwrak>
i see mainly four problem areas: the self, having to avoid reserved words, the parentheses, and generally having to form a list
<lekernel>
having to form a list is fine, because you can generate it algorithmically too
<lekernel>
which can be quite powerful
<wpwrak>
maybe a dedicated language could solve this in a considerably cleaner way. since the procedural part of the language would only execute at build time, code generation could be kept simple (you'd just "execute" the ast)
<lekernel>
there's already the ast, only we manipulate it directly with the python syntax atm :p
<wpwrak>
yeah. python is where the problem is :)
<wpwrak>
the idea of putting a higher-level language over one that lacks expressivity is sound. just fitting this into python makes it complicated
r33p [r33p!~rep@bon31-2-89-85-157-97.dsl.sta.abo.bbox.fr] has joined #milkymist
<wpwrak>
and why do you prefix all the signal names with underscores ? again keyword avoidance ?
<lekernel>
not all, only internal signals (not meant to be accessed from other objects)
<lekernel>
as I said, those should be moved to the get_fragment local namespace (making the underscore unnecessary, so it can be removed)
<lekernel>
I can simply say "tx_busy = Signal()"
<lekernel>
but the problem is that the Signal class then doesn't know the variable name, and generates an anonymous signal in the final Verilog
<lekernel>
it works, but can make the generated code difficult to understand
<lekernel>
tx_busy = Signal(name="tx_busy") also works, but is redundant
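(One conceivable way to avoid that redundancy, sketched purely as an illustration and not necessarily how migen ended up solving it: have the Signal constructor inspect the caller's source line. The Signal class and the regex below are made up for the example, and the trick is CPython-specific and fragile.)

```python
# Illustrative sketch only: derive a default signal name from the assignment
# statement that created it, by inspecting the caller's source line.
import inspect
import re

class Signal:
    def __init__(self, name=None):
        if name is None:
            caller = inspect.stack()[1]
            line = caller.code_context[0] if caller.code_context else ""
            m = re.match(r"\s*(?:self\.)?([A-Za-z_]\w*)\s*=", line)
            name = m.group(1) if m else "anonymous"
        self.name = name

tx_busy = Signal()
print(tx_busy.name)  # -> "tx_busy" (when run from a file, not the REPL)
```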
<wpwrak>
can't you overload the assignment ? ;-)
<lekernel>
a macro would be convenient here
<kristianpaul>
hmm, i thought code difficult to understand wasn't something to care about
<wpwrak>
kristianpaul: you mean "it's okay" or "it should be avoided" ?
<kristianpaul>
not okay. no never :)
<kristianpaul>
but anyway if the thing builds a system.v in seconds..
<kristianpaul>
s/care/don't care/
<kristianpaul>
but yeah hacking conbus is more messy..
<wpwrak>
(overload the assignment) i'd still get rid of python, though. what i usually do is make a few code examples and see what the "ideal" language would look like. then i implement it with lex and yacc and put cpp in front of it all. that way, i get includes, conditional compilation, macros, and also comments for free
<lekernel>
no but here, we also need python to generate complex structures procedurally
<wpwrak>
as long as you don't need much more than if, loops, maybe recursion, and basic variables, you can just do that on your domain-specific language. it's not very hard to make an interpreter.
<lekernel>
do what?
<wpwrak>
make procedural constructs like if (...) ... else ..., while (...) ..., etc.
<lekernel>
no but half the point of FHDL is that you can create constructs from anywhere in Python
<lekernel>
and that's still a simple one, I want to build way more complex stuff later
<wpwrak>
yeah, but while it helps you get rid of semantical complexity, python adds substantial syntactical complexity.
<wpwrak>
and a for or if construct is entirely feasible in an interpreted customized language
<lekernel>
the round-robin arbiter example generates nested ifs
<lekernel>
it's not a simple for/if construct
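(The recursive nested-if generation mentioned here is easy to express in plain Python; the following is only a toy illustration with made-up node names, not the actual FHDL round-robin arbiter code.)

```python
# Toy sketch of generating a nested priority structure with recursion;
# "If" here is a stand-in AST node, not the real FHDL construct.
class If:
    def __init__(self, condition, then, otherwise=None):
        self.condition = condition
        self.then = then
        self.otherwise = otherwise

    def __repr__(self):
        s = "if (%s) { %s }" % (self.condition, self.then)
        if self.otherwise is not None:
            s += " else { %s }" % self.otherwise
        return s

def priority_grant(requests):
    """Build nested ifs granting the first active request."""
    if not requests:
        return "grant = none"
    head, *tail = requests
    return If("req_%s" % head, "grant = %s" % head, priority_grant(tail))

print(priority_grant([0, 1, 2]))
```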
<wpwrak>
you could probably even get rid of those "accumulators" (cases, comb, etc.)
<lekernel>
I see the point for modules that are manually coded, like the UART
<lekernel>
but for generating procedural structures, they aren't bad
<wpwrak>
the RR is a bit tricky indeed. there you have both languages working. if what they do would be completely orthogonal, then you would have no choice. but i don't think it is. lemme ponder this for a bit ...
kristianpaul [kristianpaul!~kristianp@cl-498.udi-01.br.sixxs.net] has joined #milkymist
kristianpaul [kristianpaul!~kristianp@unaffiliated/kristianpaul] has joined #milkymist
<wpwrak>
i guess recursion (in the "outer" language) should do the trick
<wpwrak>
well, you can generalize that
<wpwrak>
a "function" could be similar to a macro, only that some of the parameter evaluation happens as build time
<wpwrak>
the return value would be a bit of verilog or modified verilog. you'd probably also want functions that can do math and such, which you could solve with the explicit return of a value. a function that tries to do both would be an error. (like in many other languages)
<wpwrak>
then you'd just use that function inside the code, acting like a fancy macro
<wpwrak>
you could extend this with a mechanism that allows you to put the code generated into a variable. e.g., foo = { <code> } then you wouldn't even need recursion.
<wpwrak>
there <code> could contain loops etc.
<wpwrak>
you probably don't need full separation of name spaces. a common name space also makes the code easier to read, as long as the semantics aren't easily confused
<wpwrak>
thinking of it, you can probably use the same namespace for both languages. control constructs would just change their meaning depending on how you use them.
<wpwrak>
this may seem hackish but it's no different from what compilers for languages as down-to-earth as C do
<wpwrak>
e.g., if (<const>) -> resolve at build time
<wpwrak>
if (<expression-of-only-custom-language-vars>) -> also resolve at build time
<wpwrak>
if (<expression-of-only-lower-level-language-vars>) -> emit code
<wpwrak>
if (<mixed-expression>) -> emit code, replacing custom language vars with their respective value
<wpwrak>
in the end, you'd just have a self-optimizing super-verilog :)
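(A minimal sketch of that dispatch rule, in Python only for illustration; the representation of conditions is invented: a condition that is a plain build-time value gets folded immediately, while a condition over hardware signals, represented here as a string, emits code.)

```python
# Illustrative sketch of "resolve at build time if possible, otherwise emit
# code": build-time conditions are plain Python values, hardware conditions
# are strings naming signals. All names are made up.
def lower_if(condition, then_code, else_code):
    if isinstance(condition, str):
        # Hardware expression: emit code.
        return "if (%s) begin %s end else begin %s end" % (
            condition, then_code, else_code)
    # Build-time expression: fold now, keep only the surviving branch.
    return then_code if condition else else_code

data_width = 32                                              # build-time variable
print(lower_if(data_width > 16, "wide();", "narrow();"))     # folded at build time
print(lower_if("strobe && !busy", "wide();", "narrow();"))   # emitted as code
```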
Gurty [Gurty!~princess@ALyon-256-1-111-225.w90-9.abo.wanadoo.fr] has joined #milkymist
devn_ [devn_!~devn@rot13.pbqr.org] has joined #milkymist
mumptai [mumptai!~calle@brmn-4dbccd41.pool.mediaWays.net] has joined #milkymist
azonenberg [azonenberg!~azonenber@cpe-67-246-33-188.nycap.res.rr.com] has joined #milkymist
r33p [r33p!~rep@bon31-2-89-85-157-97.dsl.sta.abo.bbox.fr] has joined #milkymist
elldekaa [elldekaa!~hyviquel@abo-168-129-68.bdx.modulonet.fr] has joined #milkymist
<lekernel>
wpwrak: hmm, but in the end I'll need more complex stuff than the RR
<lekernel>
I'll probably end up building FHDL constructs with about every Python feature - and I do not want to reinvent the wheel...
<lekernel>
how about a dumb preprocessor to smooth out the ugly syntax bits...?
elldekaa [elldekaa!~hyviquel@abo-168-129-68.bdx.modulonet.fr] has joined #milkymist
<roh>
lekernel: i sometimes wonder if it wouldn't be easier to have some processing- or arduino-esque language for patches
<roh>
easier to teach/understand for people
<roh>
like.. have one 'init/setup' and one 'event demux main function' as a stub and let people hack from there
lekernel [lekernel!~lekernel@g231250137.adsl.alicedsl.de] has joined #milkymist
<wpwrak>
lekernel: every problem in IT can be solved by adding yet another layer of abstraction ;-)
<wpwrak>
lekernel: regarding every feature of python - once the language is turing-complete, you have no limits :)
<wpwrak>
roh: sections is what i'm after :)
<lekernel>
yes, so let's program this stuff in assembly, it's turing-complete too :)
<wpwrak>
roh: in fact, you have them right now. just the syntax is a bit unusual
<lekernel>
or even better, writing the CPU instructions manually in hex
<wpwrak>
ah, the good old days ;-)
<wpwrak>
don't forget that you still have all the power of C at your disposal to implement the primitives of that language. if you need really complex stuff that's at least somewhat general, then you can just add a primitive. or make a kind of library.
<wpwrak>
also, you'll never find a language that can do everything and still look sane
<wpwrak>
so if you need some really crazy stuff, you can always write a script that generates your "extra-high-level language"
<lekernel>
so far, the only problem I see with using the python interpreter for FHDL is slightly cluttered syntax
<lekernel>
it's minor, and there are ways to improve it, like a preprocessor
<lekernel>
for the rest, it seems pretty powerful
<wpwrak>
oh, python can do a lot. it may get slow, but that's probably not an issue in this case
<wpwrak>
i'd be more concerned about having a deep stack of translators
<lekernel>
the stack is about as deep in the traditional parser -> AST vs. preprocessor -> python parser -> AST case
<wpwrak>
particularly in the first ten years of that thing maturing, you'll constantly have to check things at all layers
<lekernel>
the 2nd case is more powerful, and the preprocessor can easily be made optional
<lekernel>
(should it cause issues in some cases)
<wpwrak>
hmm, how do you like gcc's parse trees ? :-)
<wpwrak>
and how often do you use them ?
<lekernel>
never
<wpwrak>
the generated python would become just that - an intermediate language understood by a vanishingly small minority of people
<wpwrak>
so the preprocessor is no longer optional
<lekernel>
removing cluttering parentheses and redundancy in signal declarations doesn't look like a complex transform, does it?
<wpwrak>
you also need to disentangle namespaces
<lekernel>
nope. everything happens in the python namespace.
<wpwrak>
and you may end up with a different scope structure
<wpwrak>
so anyone using the language needs to know not only the rules of that language itself, but also python to avoid python name collisions, plus verilog to avoid verilog name collisions ?
<lekernel>
verilog name collisions are already taken care of in the backend
<lekernel>
it automatically renames conflicting signals
<wpwrak>
furthermore, if you don't have a full parser on top, you get cryptic error messages from the layers below
<lekernel>
the language itself is python with some optional syntactic sugar based on trivial macro expansions
<lekernel>
that's all
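(A toy example of how small such a macro expansion could be, assuming the only sugar wanted is the signal-name redundancy; regex-based and deliberately naive, a real preprocessor would have to be more careful.)

```python
# Illustrative preprocessor sketch: remove the name redundancy in signal
# declarations with a one-line textual rewrite before Python sees the file.
import re

SIGNAL_DECL = re.compile(r"^(\s*)(\w+)\s*=\s*Signal\(\)", re.MULTILINE)

def preprocess(source):
    return SIGNAL_DECL.sub(r'\1\2 = Signal(name="\2")', source)

print(preprocess("tx_busy = Signal()\nrx_ready = Signal()\n"))
# tx_busy = Signal(name="tx_busy")
# rx_ready = Signal(name="rx_ready")
```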
<wpwrak>
then you probably still miss a lot of potential for syntactical simplicity
<lekernel>
nothing's for free. and it's already a step forward from VHDL in this respect...
<wpwrak>
you could try this: pick a simple but not trivial piece of verilog, then write down how you wish you could write it instead of verilog, i.e., staying close to the original structure, but getting rid of clutter and with access to constructs that reduce redundancy. then see if a language pattern emerges
<lekernel>
as I said: half the point of FHDL is to be able to use all that python provides to generate complex structures
<lekernel>
you'd miss that with a brand new language, and/or reinvent lots of wheels
<wpwrak>
what sort of python features do you expect to need for these complex structures ?
<lekernel>
classes, dynamic typing, maybe libraries so you can pre-compute things like filter coefficients for DSP easily
wolfspraul [wolfspraul!~wolfsprau@p5B0AA2EA.dip.t-dialin.net] has joined #milkymist
<wpwrak>
dynamic typing is easy. classes are harder (but rarely necessary). the occasional numbercrunching can be done outside, like where you use scilab to generate a table. (forgot what it was)
<wpwrak>
in fact, when i look at how many people use classes (i.e., each problem is solved by adding more of them, until you get these thickets that can only be traversed with an IDE), i'd say not having them is a good thing ;-)
<lekernel>
for the dataflow system I'm planning, I'll pretty much need classes - or at least, records
<lekernel>
just look at how much mess there is in the TMU to pass current pixel information from one pipeline stage to the next
<wpwrak>
records are basically a syntactical variant of generalized arrays
<wpwrak>
so if you have generalized (= index can be of any type) arrays, you're good
<wpwrak>
you could probably rip most from my umlsim language ;-)) (which actually happens to have classes)
<lekernel>
now I want a generic component that buffers (FIFO-style, and with configurable size) whatever records are fed to it
<wpwrak>
you get that with dynamic typing. and does the size even have to be limited ?
<lekernel>
yes, because that component has to be translated to Verilog and synthesized
<lekernel>
you don't have unlimited on-chip storage
<lekernel>
and by the way, we may want that FIFO to be able to automatically grow a DMA port to move the storage off-chip
<lekernel>
of course, that DMA port will then be semi-automatically connected to the appropriate bus in the SoC
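(A hedged sketch of what such a component might look like on the Python side; all the class and method names below are invented for illustration and are not the migen API.)

```python
# Illustrative sketch only (invented names): a generic FIFO parameterized by
# a record layout and a bounded depth, which describes itself for synthesis
# instead of storing data at build time.
class RecordLayout:
    def __init__(self, **fields):          # field name -> bit width
        self.fields = fields

    def width(self):
        return sum(self.fields.values())

class SyncFIFO:
    def __init__(self, layout, depth):
        if depth <= 0:
            raise ValueError("on-chip storage is finite; depth must be > 0")
        self.layout = layout
        self.depth = depth

    def get_fragment(self):
        # Stand-in for emitting FHDL/Verilog: just report what would be built.
        return "fifo: %d entries x %d bits" % (self.depth, self.layout.width())

pixel = RecordLayout(r=8, g=8, b=8, valid=1)
print(SyncFIFO(pixel, depth=64).get_fragment())   # fifo: 64 entries x 25 bits
```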
<wpwrak>
but would there also be records that get buffered in migen ? or would you just have a structure that says there is a fifo but don't populate it with data in migen ?
<lekernel>
migen doesn't run anything... it's just a verilog translator
<wpwrak>
so what you're looking for is something that says "fifo-structure", plus a type that defines what goes inside
<wpwrak>
and the size limit for verilog
<lekernel>
"hardware" thingies like TLM and System-C, which run everything in a simulator but can't synthesize a thing, are merely only good for writing PhDs
<wpwrak>
;-)
<lekernel>
and yes
<lekernel>
that's exactly what the FIFO component should do
<wpwrak>
well, you said yourself that you want to do number-crunching and that you need "about every Python feature" :)
<lekernel>
and it should provide a function that manipulates the FHDL to implement the FIFO functionality
<wpwrak>
manipulates .. in what way ?
<lekernel>
generate a "fragment" that can be combined with the rest of the system
<wpwrak>
a module ? or part of a module ? or an ensemble of modules ?
<lekernel>
yes, that's close to what Verilog calls a "module"
<wpwrak>
the "combining" would then mean that it's placed in the same hierarchy. i.e., not that it would go and do substitutions in the rest of the verilog ? e.g., to place accessors
<lekernel>
atm FHDL generates a completely flattened hierarchy, but maybe I'll implement some heuristics in the back-end to make the generated code more readable by splitting it into Verilog modules
<wpwrak>
ah, i see
<lekernel>
but I want to make the Verilog hierarchy completely transparent from the FHDL user's point of view
<wpwrak>
at what point do xilinx' tools choke anyway ? ;-))
<lekernel>
I don't think they do... I've done more horrible things to them
<wpwrak>
"FATAL ERROR: maximum module size 64 kB) exceeded" ;-)
<wpwrak>
*grin*
<lekernel>
like synthesizing the full LM32 for a different FPGA, translating the netlist back into Verilog, doing some hand tweaking of the file to make it synthesizable, and replacing the original LM32 with that netlist written in Verilog and instantiating LUTs/flip-flops/etc. directly
<lekernel>
that was to track down a difficult synthesizer bug ...
<lekernel>
iirc the Verilog file was several MB
<wpwrak>
okay, so you'd do something like: my_fifo = new fifo(size, type); ... my_fifo->push(foo); ... bar = my_fifo->pop(); ? or, equivalently, my_fifo = fifo_new(size, type); ... fifo_push(my_fifo, foo); ... bar = fifo_pop(my_fifo); ...
<wpwrak>
good. so no file size limit :)
<lekernel>
no, the FIFO will be part of a data flow graph
<wpwrak>
"foo" and "bar" would be "fragments" ? or some sort of endpoints ? or both ?
<lekernel>
no, endpoints
<lekernel>
but that'll be part of the dataflow system which I have not started coding, so don't look for this yet :)
<wpwrak>
okay, so you'd declare the endpoints, then do something(foo) in a "fragment" when you want to access the data ?
<lekernel>
a class declares endpoints
<lekernel>
the same class produces a fragment that implements its functionality
<lekernel>
the dataflow system connects endpoints together
<lekernel>
the dataflow system can generate another fragment that implements the endpoint interconnect
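(A minimal sketch of that structure with invented names, since the dataflow system had not been written at this point: components declare named endpoints, and a separate graph object records connections and could later emit the interconnect fragment.)

```python
# Minimal sketch of the planned dataflow structure (invented names; the real
# system did not exist at the time of this discussion).
class Endpoint:
    def __init__(self, component, name):
        self.component = component
        self.name = name

class Component:
    def __init__(self, name, sources=(), sinks=()):
        self.name = name
        self.sources = {n: Endpoint(self, n) for n in sources}
        self.sinks = {n: Endpoint(self, n) for n in sinks}

    def get_fragment(self):
        return "// logic of %s" % self.name     # stand-in for real FHDL

class DataflowGraph:
    def __init__(self):
        self.edges = []

    def connect(self, source, sink):
        self.edges.append((source, sink))

    def interconnect_fragment(self):
        return "\n".join("// %s.%s -> %s.%s" %
                         (s.component.name, s.name, d.component.name, d.name)
                         for s, d in self.edges)

uart = Component("uart_rx", sources=["byte_out"])
fifo = Component("fifo", sinks=["din"], sources=["dout"])
g = DataflowGraph()
g.connect(uart.sources["byte_out"], fifo.sinks["din"])
print(g.interconnect_fragment())
```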
<wpwrak>
btw, i'm not sure "fragment" is a good word. it means a part of something that was whole before it got broken. those code snippets work just in the opposite direction: they are fused to build a whole
<stekern>
lekernel: it sounds like you want systemverilog, but dys in python :P
<lekernel>
non-synthesizable "hardware" is only for PhD-ware lurking in the darkness in some academic institution and spending 90% of their time writing grant applications
<stekern>
hehe
<wpwrak>
(dataflow) okay, but how will the endpoint be accessed ? e.g., if i have a uart-rx and i just got a nice new byte in "shifter". how do i send it into the fifo at endpoint "foo" ?
<lekernel>
there will be a formal exchange protocol, similar to what is currently implemented between TMU pipeline stages
<wpwrak>
naw, phds don't spend on average quite so much time writing applications ;)
<lekernel>
basically, strobe_out, ack_in for outputs
<lekernel>
(sources)
<wpwrak>
that's open-coded or would there be some wrapper ?
<lekernel>
and strobe_in, ack_out for inputs/sinks
<lekernel>
that would depend on the "scheduling model" of that component
<lekernel>
there will be predefined scheduling models for common behaviours, such as a fixed-latency pipeline, or a sequential unit that produces its result after a fixed number of cycles
<lekernel>
the exchange signals would be autogenerated for those, and the dataflow system would also try to completely remove those exchange signals and replace them with static scheduling and an FSM
<lekernel>
but for more complex cases, you can also go for the dynamic scheduling model that lets you deal directly with those signals (and disable the dataflow system optimizations)
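(Sketching the exchange protocol as described, with made-up helper names: a source endpoint carries strobe_out/ack_in next to its payload, a sink carries strobe_in/ack_out, and a transfer happens on a cycle where strobe and ack are both asserted.)

```python
# Sketch of the exchange-protocol signals described above (helper names are
# made up): sources get strobe_out/ack_in, sinks get strobe_in/ack_out.
def source_signals(name, payload_width):
    return {"%s_payload" % name: payload_width,
            "%s_strobe_out" % name: 1,
            "%s_ack_in" % name: 1}

def sink_signals(name, payload_width):
    return {"%s_payload" % name: payload_width,
            "%s_strobe_in" % name: 1,
            "%s_ack_out" % name: 1}

# A transfer takes place on a clock cycle where both sides agree.
def transfer(strobe, ack):
    return strobe and ack

print(source_signals("pixel_out", 25))
print(transfer(strobe=True, ack=False))   # -> False, the source must hold its data
```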
<wpwrak>
i mean just the endpoint access. i guess you could have "blocking" and "non-blocking" there. though they may end up being the same, with "it's still busy" translating to "error/ignore" in one case and "go something/defer" in the other
<wolfspraul>
if a hardware description language is non-synthesizable, what's the point?
<wolfspraul>
doesn't that make it a pure software thing?
<wolfspraul>
or is it meant to anticipate future advances in what is manufacturable?
<lekernel>
wolfspraul: writing PhDs and doing test benches
<wolfspraul>
(synthesizable I mean)
<lekernel>
those are the main applications of TLM, SystemVerilog and SystemC
<wpwrak>
wolfspraul: it could still be useful for testing certain model properties
<wolfspraul>
sure, but it's pure software then, and due to it not being synthesizable (ever?) it cannot become hardware, right?
<wpwrak>
wolfspraul: e.g., the Spin language is, afaik, not used for code (or whatever) generation.
<wpwrak>
err. sorry, promela. spin is the model checker
<wolfspraul>
oh sure, I understand. it's just not hardware then, and I guess never (?)
<wpwrak>
that would depend on language characteristics. in theory, everything is synthesizable. in practice, you'd draw a line somewhere :)
<stekern>
well AFAIK systemverilog wasn't solely designed for verification (as systemc is), it just hasn't caught on
<wolfspraul>
ok but I was surprised about terminology
<wolfspraul>
how can a 'hardware description language' be non-synthesizable?
<wolfspraul>
it then describes a theoretical hardware, I guess
<stekern>
well it's describing hardware, but only for simulation purposes
<wolfspraul>
yes, so by definition that hardware can never be hardware, no?
<lekernel>
you'd be surprised what design committees and academia are capable of doing ;)
<wolfspraul>
I'm just surprised about terminology, that's all. I'm learning.
<wolfspraul>
maybe hardware includes 'theoretical hardware' - as long as everybody knows that so be it...
<wpwrak>
well, if that's their goal, their time is limited, and nobody takes over after them, ...
<lekernel>
and yes - 'theoretical hardware' ...
<wpwrak>
the problem with academia is that many a phd project ends abruptly once that phd has been awarded. the problem of industrial committees is that their horrid designs often do see the light of day :-)
<wolfspraul>
but isn't it highly confusing that a 'hardware description language' is capable of describing something that cannot become hardware?
<wpwrak>
that would depend on what exactly the "hardware" in there constitutes. could range from "it simply hasn't been implemented" to "we incorporate features and constraints typically found in hardware"
<wpwrak>
there's also the issue of which subset of the language you'd actually consider synthesizing. and which subset of the chip's feature set you'd use.
<wpwrak>
make each too small and it becomes pointless
<wpwrak>
make each too big and you're crushed by complexity
<wpwrak>
and then, some languages have the problem that some of their properties just don't make sense. so the usable subset is extracted and then extended. pascal was a nice example for that ;-)
<lars_>
wpwrak: for the young folks among us: which properties of pascal didn't make sense?
<wolfspraul>
well it's a good thing for me to learn anyway
<wolfspraul>
that 'hardware' description language may well describe something that cannot exist in hardware :-)
<wpwrak>
strings, the i/o subsystem, for larger projects the lack of separate compilation, then of course operator precedence
<wpwrak>
wolfspraul: i don't think it's like that. most likely, there simply is no direct way from the description to, say, verilog
<wpwrak>
wolfspraul: of course, at the extremes, you will always find that you can code something that cannot be implemented. be it in verilog, c, whatever.
<wpwrak>
lars_: you still see the pascal bug of operator precedence when people write, in C, things like ((foo == 1) || (bar == 2)) instead of simply (foo == 1 || bar == 2)
<wpwrak>
in pascal, these parentheses were indeed needed. then people who grew up with pascal moved to C, without bothering to properly learn the language. so they still wrote things in pascal style. and then others, learning from example, copied that ...
<wpwrak>
another bad thing pascal was at least partially responsible for is the shunning of multiple exit points in loops and functions. pascal has "goto" but the use of that was maligned at that time. and it didn't have more structured mechanisms, like "break", "continue", or "return" (except the implicit one at the end of the function)
mumptai [mumptai!~calle@brmn-4db77c3e.pool.mediaWays.net] has joined #milkymist
<wpwrak>
so people wrote code that had endlessly nested ifs. If precondition1 Then If precondition2 Then If precondition3 Then finally-to-the-thing; Else error3; Else error2; Else error1;
<wpwrak>
since the only exit was at the end of that function, you could never be quite sure nothing else would happen in a given branch, unless you checked all the way to the very end
<wpwrak>
and of course, if there was something else to do after that monster if, you'd set a boolean that basically said "skip all the rest" and then put ifs around all the other things
<wpwrak>
similar for doing the equivalent of "break". it was then While real-condition And Not getmeoutofhere Do Begin stuff If error Then getmeoutofhere := True; ... If Not getmeoutofhere Then ...; End
<wpwrak>
all very messy
<lars_>
and people used this?
<wpwrak>
you mean the construct or the language ?
<lars_>
the language ;)
<wpwrak>
oh yes. it was extremely popular as an educational language (for forcing you to program "properly") and then also as a language for PC-class machines (CP/M 80, then PC-DOS, etc.)
<wpwrak>
it was competing with BASIC at that time. many implementations of BASIC didn't have any WHILE or such. just GOTO. you can imagine what that did in a language aimed at beginners :)
<wpwrak>
(beginners) e.g., all the "home computers" came with BASIC. in the 1980es, borland made turbo pascal. it ran on CP/M80 and it was *fast*. not necessarily the code it generated (i.e., i once wrote a compiler that did better), but the compilation speed was breathtaking
<wpwrak>
in that era, compilers were heavy and slow beasts
<wpwrak>
and C compilers for CP/M were either prohibitively expensive or barely usable. sometimes both.
<lars_>
i see
* kristianpaul
learnt turbo c back in college
<wpwrak>
turbo c came long after turbo pascal
<wpwrak>
also, ms-dos didn't have virtual memory, so it was very easy to crash the system with bad pointers (that's an issue less common in pascal)
<wpwrak>
of course, the meme that languages like C, which give you fairly free use of pointers, are therefore dangerous probably comes from that era
<wpwrak>
already with an MMU, you catch most pointer issues. and for the last decade or so we've had tools like valgrind that get rid of the remaining problems as well
<wpwrak>
today, the main risk are programs with buggy code paths that aren't visited in testing. of course, there are tools for avoiding that as well ...
<kristianpaul>
you mean like python? in which there is still the risk of hidden bugs, as there is no compilation at all
<wpwrak>
yeah. some of the "safe" languages like python have this sort of trap. talk about replacing one problem with another :)
<kristianpaul>
good point
<lars_>
so the logical conclusion is to use managed code at kernel level and use unmanaged code for userspace applications
<wpwrak>
"managed code" ?
<kristianpaul>
yes, what does that mean?
<kristianpaul>
non interpreted?
<lars_>
it's what microsoft calls languages like c#
<wpwrak>
and what would be unmanaged then ?
<lars_>
c
<wpwrak>
;-)
<wpwrak>
naw, C is fine for everything. in the kernel you need massive reviews anyway.
<lars_>
which are often not done
<kristianpaul>
everything, yay !
<wpwrak>
in linux ? well, in general it is. perhaps not perfectly all the time, but people do have a look
<kristianpaul>
rtems counts as kernel btw?
<wpwrak>
yup
<wpwrak>
there you have an example of weak reviews. they're using regression tests, which is a good idea, but they don't seem to have good coverage
<lars_>
i've become a big fan of coccinelle recently
<lars_>
or for example there is a patch which finds users of GFP_KERNEL where GFP_ATOMIC should have been used
<wpwrak>
that NULL pointer check could actually go into gcc :)
<wpwrak>
"warning; de facto constant expression" :-)
<lars_>
i've mainly been using coccinelle for automated refactoring, e.g. changing a function's signature. it's much nicer than grep and sed
<lars_>
or replacing open coded versions of macros or functions
<lars_>
it's good for people like me who are a bit OCDish with code style issues sometimes.
<wpwrak>
hehe ;-) so i guess coccinelle is good at observing proper indentation
<lars_>
it can do that too
<lars_>
but what i meant was e.g. using const for things which should be const
<wpwrak>
that's tricky to identify, though
<lars_>
depends
<wpwrak>
well, you could automate it. tentatively add a "const" at the first place where there could be one, adapt all the callers, see if you get errors/warnings. if yes, don't do const. if no, keep it. then proceed with the next.
<lars_>
if you know that a function takes a const parameter
<lars_>
you can find all calls to this function and check whether the variable which is passed as the parameter is ever modified. if not make it const
<wpwrak>
coccinelle can find such things ? wow
<lars_>
yes
<wpwrak>
that's pretty complex. does the code have to be -Wshadow-clean ?
<wpwrak>
and does it see what goes on in macros ?
<lars_>
well, if you include the macro definition, yes
<wpwrak>
e.g., #define POKE(p) (*(p) = 0)
<wpwrak>
good
<lars_>
but by default includes are not followed
<lars_>
because it increases execution time quite a bit
<wpwrak>
yeah, i can imagine
elldekaa [elldekaa!~hyviquel@abo-168-129-68.bdx.modulonet.fr] has joined #milkymist
kristianpaul [kristianpaul!~kristianp@cl-498.udi-01.br.sixxs.net] has joined #milkymist
kristianpaul [kristianpaul!~kristianp@unaffiliated/kristianpaul] has joined #milkymist
<wpwrak>
(NOR) passed 10'000. this time with a full check (including non-fatal corruption, which i forgot the last time).
<lekernel>
great! and thanks for the tests!
<wpwrak>
yeah, i think it's time to schedule the last rites for the bug ;)
<kristianpaul>
awesome :)
rejon [rejon!~rejon@72-161-255-93.dyn.centurytel.net] has joined #milkymist
<GitHub154>
[migen/master] fhdl: automatic signal name from assignment - Sebastien Bourdeauducq
<lekernel>
dirty, but got it to work =]
<lars_>
dirty indeed. if all objects which get signals assigned were of a certain base class, you could override __setattr__
<lekernel>
I also want to declare local signals in functions, and there isn't any way to do this with py3k (except with a preprocessor, which is also quite messy)
<lars_>
maybe you can modify locals() __setattr__ ;)
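(lars_'s __setattr__ idea, sketched with a hypothetical base class, not migen code: attribute assignment on objects of that class names the signal after the attribute, which covers self.foo = Signal() but not plain local variables in functions.)

```python
# Sketch of the __setattr__ idea (hypothetical, not migen code): any object
# deriving from this base class names the signals assigned to its attributes.
class Signal:
    def __init__(self, name=None):
        self.name = name

class AutoNamed:
    def __setattr__(self, attr, value):
        if isinstance(value, Signal) and value.name is None:
            value.name = attr
        object.__setattr__(self, attr, value)

class UART(AutoNamed):
    def __init__(self):
        self.tx_busy = Signal()

print(UART().tx_busy.name)   # -> "tx_busy"
# Local signals inside a function still need another trick (e.g. the stack
# inspection sketched earlier), which is where it gets messy.
```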