Topic for #qi-hardware is now Copyleft hardware - http://qi-hardware.com | hardware hackers join here to discuss Ben NanoNote, atben / atusb 802.15.4 wireless, and other community driven hw projects | public logging at http://en.qi-hardware.com/irclogs
guanucoluis has joined #qi-hardware
nikescar has quit [Read error: Connection reset by peer]
nikescar has joined #qi-hardware
xiangfu has joined #qi-hardware
nikescar has quit [Read error: Connection reset by peer]
nikescar has joined #qi-hardware
urandom__ has quit [Quit: Konversation terminated!]
<kyak>
mth: jz-3.5, Windows 7, infamous error code 10 :(
<kyak>
i'll be waiting for 3.6 then
<kyak>
mth: perhaps it makes sense to revert the RNDIS commit, since it doesn't work anyway (giving people a chance to use the CDC driver on Windows). Also, qi_lb60_defconfig should be changed to add DEVTMPFS support, otherwise it freezes at boot
<mth>
I only changed the a320 defconfig, not the qi_lb60 defconfig
<kyak>
oh, sorry
<kyak>
larsc: could you make this change to qi_lb60_defconfig?
jluis has quit [Ping timeout: 245 seconds]
jluis has joined #qi-hardware
<qi-bot>
[commit] Paul Cercueil: Fix a bug where only some parameters were read from the links files. (packages) http://qi-hw.com/p/gmenu2x/876f2cf
<qi-bot>
[commit] Paul Cercueil: Mount the OPK packages in order to execute the binary included. (packages) http://qi-hw.com/p/gmenu2x/f6c19d0
jluis|work has joined #qi-hardware
pcercuei has quit [Quit: dodo]
jekhor has joined #qi-hardware
jluis has quit [Ping timeout: 245 seconds]
jluis has joined #qi-hardware
<wpwrak>
roh: hmm, pretty ambitious
<wpwrak>
roh: seems that there are two main directions: form printing and heavy duty printing. the former (i love puns :) is the one most likely to yield devices affordable to "the masses". the heavy duty stuff is a lot more difficult.
<wpwrak>
you can probably also grow heavy duty machines from light duty parts
<wpwrak>
an ad hoc name :) i mean 3d printing where the form is the most important result, not mechanical strength or special material properties
<wpwrak>
i think OSE will need a lot of iterations :) and i don't quite see the point. they're solving a non-problem. we already have quite capable industries to do all these things.
<wpwrak>
that sort of stuff may be interesting some day when it comes to colonizing distant planets. but that's still a while out :)
<whitequark>
wpwrak: well, I like the idea of self-sustainability
<whitequark>
though it is years away from being achieved
<whitequark>
I also dislike the idea of cloud computing for some reason, and all cool kids love cloud computing, so...
<wpwrak>
i don't see the point of cloud computing either :)
<wpwrak>
it feels like 1960s mainframe thinking transplanted into the internet age
<viric>
:)
<viric>
I think that for me "all cool kids love ..." is an indicator of a bad thing, usually
<wpwrak>
indeed :) it means it stopped being avantgarde a decade ago
<wpwrak>
which kinda fits, thinking of it ;-)
<roh>
wpwrak: well.. what i still miss is 'hacking injection molding'
<roh>
i am just still too stupid and moneyless to actively figure it out
<roh>
my yardstick is: if it can be done autonomously and unattended, e.g. as a vending machine, it's not impossible to do in a hackerspace/homebrew lab under semi-professional conditions
<roh>
and since there is/was mold-o-rama ...
<whitequark>
wpwrak: well, there are two kinds of "cloud computing"
<whitequark>
the first is "let's put ten underutilized virtual servers on a single piece of hardware", which was done by IBM in the 1960s and is really good
<whitequark>
the second is "let's deprive users of their data and force them to use our remote interface"
<whitequark>
which was also done by IBM in the 1960s but is nowhere near as good
<whitequark>
speaking of which, I'm currently implementing a piece of software using 1970s tech, and I'm doing that because there are no widespread implementations of it, except maybe the JVM in a limited fashion
<whitequark>
well, JVM/V8
<whitequark>
both are Lars Bak's Strongtalk implementations gone wrong :)
paul_boddie has joined #qi-hardware
<paul_boddie>
roh: Hacking injection moulding would be cool, but surely you always return to the problem of making the moulds and tooling, and that is another CAD problem.
<paul_boddie>
whitequark: I saw that you were doing a Ruby compiler or something similar. How is that coming along?
<roh>
paul_boddie: true
<roh>
paul_boddie: on the other hand... the hacker world may have its shortcomings, but there is also niceness... the openness, the sharing of knowledge.. the 'unconventional' or 'unorthodox' concepts being tried and results being published (sometimes) - it's simply more fun
<roh>
also i realized that there seems to be more 'conservative' engineering in open source. if stuff is figured out to a certain degree of reliability, it seldom changes without need. but until then there is much more creativity than in what i've seen of the 'old world'
<paul_boddie>
I think people underestimate the amount of "locked in" knowledge in traditional enterprises, too. If you dismantle them, there's no guarantee that something else can just fill the gap left by them.
<roh>
yep. we could see that in the last few years with natural disasters in asia... goods which are only produced in a few places became unavailable, and that made prices explode for some things (e.g. hard disks)
<paul_boddie>
I think it's important that knowledge be documented in such a way that an endeavour can be repeated. Some people don't like that because it threatens their position - they need their "secret sauce" - and others insist that patents achieve this, which is a complete joke. But the people who deal with obsolete systems have plenty of lessons to teach about this kind of thing, and I don't think society will always be able to afford to ignore them.
<roh>
true
<roh>
you've seen the lunar lander tape recovery project?
<roh>
somebody stored the original magnetic tapes from back then, and now they recover the data to compare to current measurements
<roh>
quite a reverse engineering endeavour
<paul_boddie>
Yes, another lesson about maintaining the expertise to do something. I guess NASA is a classic case of loss of organisational memory.
jekhor has quit [Ping timeout: 245 seconds]
<roh>
paul_boddie: yeah. sad to see.. they were leading the field in developing organisational things at one point in history
<roh>
well.. some stuff.. can be forgotten.. how to build nukes for example ;)
<roh>
maybe it's important to develop ethics at the same speed as technology
<paul_boddie>
The lessons from that story are that even large, powerful organisations lose track of how to do things, and that even a supposedly perfected process can still have undocumented parts.
<whitequark>
roh: (conservative OSS) look at the present state of so-called linux desktop
<roh>
some of that makes sense. the rest will not make it in the long run
<whitequark>
roh: exactly what I wanted to say
<whitequark>
still, I won't call it "conservative". "no change for the sake of change", maybe, but that's more down to simple human laziness
<whitequark>
you won't do bullshit all day unless a manager forces you :)
<paul_boddie>
I think there's a continuous opposition between conservative and crazy, well-researched and reinvented, old and new.
<whitequark>
I find it quite funny that we had devfs, then we had udev, and now we have devfs again
* whitequark
waits for the next iteration
wej has quit [Ping timeout: 260 seconds]
<paul_boddie>
People have no sense of history, that's why. It's part of a more general depressing phenomenon where you can end up arguing with someone about something that probably happened before they were born, and they have the nerve to doubt that any such thing ever happened.
<whitequark>
paul_boddie: I'd say it is more due to the fact that the OSS development model is close to an evolutionary one
<paul_boddie>
It's like all those people who don't think that Microsoft ever did anything wrong. Clearly they were born fairly recently and never caught up, or they never bothered to pay attention during, say, the 1990s.
<whitequark>
people do stuff; good stuff evolves, forks and lives on.
<whitequark>
Microsoft still does the same wrong thing as they did in the past
<whitequark>
they never stopped
<paul_boddie>
whitequark: Did you ever do anything a while ago and then see someone much more recently announce something very similar as the hot new thing? People are just lazy but want their fame and glory anyway.
<whitequark>
paul_boddie: that's perfectly normal
<whitequark>
every programming language of the last 20 years has reiterated the histories of ALGOL and Lisp, each in its own twisted fashion
<whitequark>
which doesn't mean they did anything wrong. Both ALGOL and Lisp have proven to be unusable by the general public. Reiterating their useful features in this fashion gives us better and better languages
<paul_boddie>
About Microsoft, yes, they never stopped, but people either have the view that all the major legal trouble in the 1990s was unmerited (or didn't happen because they never knew about it) and so there's no *real* dirt, or that Microsoft is a changed company. MS is just better at covering their tracks these days, although still incompetent at that, of course.
wej has joined #qi-hardware
<paul_boddie>
About language evolution, I agree. Witness all the people using Lisp who can't understand why people use those "dirty imitations".
<whitequark>
the problem with Lisp is that it's too powerful
<whitequark>
lambda calculus, the purest form of computation, can represent everything, but cannot represent anything actually useful :)
<whitequark>
and with Lisp it's basically impossible to have large scale projects due to the fact that everyone tends to invent their own slightly different variations of already proven concepts
<whitequark>
I've heard that back 50 years ago, function calls were a design pattern
<whitequark>
prologue, epilogue, arguments on stack...
<whitequark>
if Lispers designed C, every programmer would have to define his own ABI
<whitequark>
slightly incompatible but effectively having the same power as every other one
<whitequark>
this is also C++ is very wrong.
<paul_boddie>
I can see that, actually. Concepts like functions were probably like everything else: a tradeoff that worked for certain use-cases, but would they work satisfactorily in general, and is it wise to eliminate support for the other cases?
<whitequark>
*why
<whitequark>
paul_boddie: the language designer's main rule is: simple things should be simple, and complex things possible
<whitequark>
and the notion of "simple things" changes with time
<whitequark>
in 1970, procedures were simple
<whitequark>
by 2000, closures were already simple
<whitequark>
(I'm 19 years old, so the dates might be a bit off.)
<whitequark>
you don't need to define your own way to make closures in a language; they should be built in.
<whitequark>
the problem is, everyone dislikes _something_ in any existing language
<whitequark>
and if the language allows everything to be overridden, then it will be, and the code will become pretty unusable
<paul_boddie>
I think closures are a luxury, myself. You can model what you need from them in other ways, and they cause complications for the language designer and implementer. The Lisp crowd lobbied hard for closures in Python, and I don't really think that the resulting support is worth having.
<paul_boddie>
And it's not worth listening to the Lisp crowd, anyway. They won't use your language after making their demands.
<whitequark>
they add a great deal to the language. I can't really imagine writing something without the expressive power of closures
<whitequark>
besides that, language designers and implementors aren't the people anyone needs to care about.
<paul_boddie>
How are Python closures not proper? I don't remember, myself.
<whitequark>
well, there are no anonymous functions in Python
<paul_boddie>
There are no multi-statement anonymous functions, you mean.
<whitequark>
there is so-called "lambda" keyword, which defines a "function" which can only have a certain subset of expressions inside
<whitequark>
and which isn't actually very useful
<paul_boddie>
lambda provides a single expression, yes, not one or more statements.
<whitequark>
the best part of Ruby's closures is that they're very easy to use and manipulate
<whitequark>
meth { i_am_a_closure }. done.
<paul_boddie>
Also, you bind functions to names upon definition in Python, but that doesn't mean that you can only have one such function for every binding.
<whitequark>
therefore you are encouraged to write composable code, which happens to be much cleaner and easier to read
<whitequark>
File.open("something") { |io| io.write "hello" }, for example, automatically closes the handle
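The File.open pattern above can be sketched as a plain block-taking method; `with_resource` and its fake handle are invented names for illustration, not real API:

```ruby
# Hypothetical sketch of the block-based resource pattern: the method
# owns acquisition and cleanup, the caller's block (a closure) only
# uses the resource while it is valid.
def with_resource
  handle = "fake-handle"   # stand-in for e.g. an open file
  yield handle             # run the caller's block with the resource
ensure
  # cleanup goes here; `ensure` runs even if the block raises
end

with_resource { |h| puts h }   # prints "fake-handle"
```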
<paul_boddie>
Although those functions probably still have some notion of a name, they don't need to be available through that name any more. You can make lists of parameterised functions all of which probably think they have a name, but it's irrelevant.
<whitequark>
iterators and lazy generators are implemented via the same simple syntax
<whitequark>
(1..2).each { |x| puts x } # iterator
<whitequark>
(1..2).map { |x| x * x }.each { |x| puts x } # two chained ones
<whitequark>
gen = (1..2).map { |x| x * x }.each # create a generator and pass it around
<whitequark>
or:
<paul_boddie>
Well, not having to name those blocks is convenient and means that you can write them inline, but they don't give you anything over and above a reference to a function.
<whitequark>
sum = line.split(" ").map(&:to_i).map(&:abs).reduce(:+)
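The chains quoted above are ordinary runnable Ruby; a sketch with an assumed input line, showing what each step produces:

```ruby
# Assumed input; the chain itself is the one quoted above.
line = "3 -4 5"
sum = line.split(" ").map(&:to_i).map(&:abs).reduce(:+)
puts sum   # prints 12 (3 + 4 + 5)

# Calling .each without a block returns an Enumerator that can be
# passed around and consumed one element at a time.
gen = (1..2).map { |x| x * x }.each
puts gen.next   # prints 1
puts gen.next   # prints 4
```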
<whitequark>
paul_boddie: yeah, this is one of the most common PL-related fallacies
<whitequark>
syntax doesn't actually give you anything if your language is Turing-complete
<whitequark>
the way you define the syntax affects the way people will use your language heavily
<whitequark>
it doesn't actually make sense to argue about whether syntax adds something to the language. it doesn't. syntax adds something to usability, or maybe not, and syntax might make writing good code easier. or, again, might not.
<whitequark>
even Java has closures already (Java 8)
<whitequark>
and if you've ever tried to write code with callbacks, you'll easily understand why
<whitequark>
speaking about that, void (*cb)(void*) is NOT an acceptable way to implement closures.
<paul_boddie>
Well, I use a variety of constructs that different groups of other Python programmers frown upon in different ways, but I think that some of these syntax refinements work against their own motivations.
<whitequark>
indeed, there are lots of possible but not very clever things to do in Ruby either
cladamw has joined #qi-hardware
<whitequark>
5 years ago it was cool to extend standard classes, it isn't anymore. There are better conventions now
<whitequark>
the best of these conventions are now being integrated into the core language
<whitequark>
or, for example, Ruby has recently gained syntactic keyword arguments
kristianpaul has quit [Quit: Lost terminal]
<whitequark>
1.8 used an "options hash" and a syntactic sugar which allowed you to omit {} for the last function argument if it was a hash
<whitequark>
1.9 added a sugar which mapped `a: expr' to `:a => expr'
<whitequark>
2.0 finally allows you to write `def meth(kwarg: default_value)'
<whitequark>
i.e. it does hash decomposition and mandatory argument checking in the core
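The stages described above can be sketched side by side; the method names are invented for illustration:

```ruby
# 1.8 style: a trailing options hash (the {} may be omitted at the call site)
def greet_18(name, opts = {})
  "hello, #{name}#{opts[:punct] || "!"}"
end

# 2.0 style: real keyword arguments, decomposed and defaulted by the core
def greet_20(name, punct: "!")
  "hello, #{name}#{punct}"
end

puts greet_18("world", :punct => "?")   # prints "hello, world?"
puts greet_20("world")                  # prints "hello, world!"
```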
<paul_boddie>
Is this not like Python's **kw support?
<whitequark>
yeah, it now works like in Python, except that no one removed positional arguments
<paul_boddie>
Oh, and as I understand it, closures are orthogonal to having anonymous blocks. You can still "close over" the environment even if you have to give a temporary name to the thing doing so.
<paul_boddie>
Or as Wikipedia states, "The term closure is often mistakenly used to mean anonymous function."
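A sketch of that distinction in Ruby: the lambda below is bound to a name, yet it is still a closure because it captures its environment.

```ruby
# A *named* closure: `increment` has a name binding, but what makes it
# a closure is that it closes over `counter` from the enclosing scope.
counter = 0
increment = lambda { counter += 1 }
increment.call
increment.call
puts counter   # prints 2
```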
<whitequark>
paul_boddie: you are correct
jekhor has joined #qi-hardware
<whitequark>
I meant blocks, indeed.
<paul_boddie>
In fact, you're not wrong to point out that the naming of things in Python is a potential conceptual hurdle for people if only because languages like Java special-case things like methods. This can lead to flaws in people's reasoning about the behaviour of programs, even to the extent of how one should refer to and reason about program units.
<paul_boddie>
In other words, people carry over assumptions like "this is function f" instead of thinking "this function currently known as f".
<whitequark>
yes
<whitequark>
Java requires you to create a class and a method to keep a chunk of code
Jurting has quit [Remote host closed the connection]
<whitequark>
Python requires you to create a method (well, named function to be precise)
<whitequark>
Ruby just allows you to write a chunk of code.
<larsc>
so does PHP ;)
<whitequark>
larsc: PHP does not have closures at all :)
<whitequark>
its "lambdas" are named and non-GC'd methods
<larsc>
whitequark: sorry, just saw the last 2 lines
<whitequark>
ahh
<whitequark>
I see your point then :)
<whitequark>
well, you have toplevel code in Ruby, and ERB (a PHP-like interleaved script interpreter) is in the standard library
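ERB does ship in the standard library; a minimal sketch of the PHP-style interleaving:

```ruby
require "erb"

# Text with embedded Ruby; <%= %> interpolates an expression from the
# binding passed to #result.
name = "world"
template = ERB.new("hello, <%= name %>!")
puts template.result(binding)   # prints "hello, world!"
```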
<paul_boddie>
Python would have anonymous blocks if there had been agreement on syntax, but nobody ever reached agreement.
<whitequark>
but for some reason no one writes in Ruby like in PHP...
<whitequark>
paul_boddie: well, either there was not enough pressure to add closures
<whitequark>
or that's a typical flaw of design by committee
<paul_boddie>
There *are* closures. There just aren't anonymous blocks.
<whitequark>
sorry, I did that again. Ruby doesn't have non-anonymous closures, so the terms are mixed up there
<paul_boddie>
Closures were added in something like Python 2.2. I don't remember the precise details, but I could look it up.
<whitequark>
I know
<paul_boddie>
Personally, I think that you have two different schools of thought where one school prizes closures and the other says, "Well we can more or less do that more cleanly with structures/objects/explicit state." The former group taunts the latter because they've been writing Lisp since forever, and the latter probably uses languages where closures would add complexity to their implementation that they wouldn't be able to live with.
<whitequark>
paul_boddie: every modern language has closures :)
<paul_boddie>
Clever answer. :-)
<whitequark>
every JVM one, every CLR one, Lua, Ruby, Python, C++ [C++11], JS, whatnot
<whitequark>
so I guess the former group won :)
<whitequark>
seriously, closures aren't hard to implement compared to, for example, continuations
<whitequark>
and continuations are a huge win in some interesting cases
<paul_boddie>
You should go on LWN and start an argument about programming languages on some article about Android. It would give people some entertainment. :-)
<whitequark>
meh, I prefer to start arguments on my blog
<paul_boddie>
But, anyway, how is that compiler of yours coming along?
<whitequark>
50% of my dayjob time is allocated to it :)
<whitequark>
well, there is some progress, but nothing particularly interesting
<whitequark>
I'll probably write an article in a few days
<whitequark>
on the architectural choices
<paul_boddie>
As I recall, you intended to eliminate some of the run-time complexity in favour of making optimisations in the generated program. Is that the idea?
<whitequark>
this is more of a side effect
<whitequark>
every existing Ruby implementation (or, in fact, most language implementations I'm aware of) compiles the source to some IR which captures the semantics of the underlying machine
<whitequark>
i.e. C++ compiles to x86, Java compiles to JVM
<whitequark>
if a compiler then tries to optimize the code, it obviously works on that IR
<whitequark>
I'm writing an implementation where IR accurately captures the semantics of Ruby, not some existing VM like LLVM
<whitequark>
the point is, LLVM doesn't know much about the Ruby semantics and therefore it's unable to do some interesting optimizations
<whitequark>
it doesn't have enough information in its IR to reason about the code
<paul_boddie>
Didn't Parrot already try this? ;-)
<whitequark>
it _can_ insert run-time guards to check for types, etc, but JITs are generally slow and very expensive
<whitequark>
you won't use a JIT on an ARM with 8K of RAM
<whitequark>
but everything changes if I make an IR specifically for Ruby
<whitequark>
I can derive a lot of information about types, or about control flows
<whitequark>
I can trivially inline those anonymous blocks where it's safe to do so (i.e. where the block environment does not outlive the function it was defined in)
<whitequark>
I can inline method calls if I can guarantee that the methods won't be redefined
<whitequark>
the current implementations assume that every bit of the Ruby semantics should be accessible at every point in time
<whitequark>
i.e. they assume that you will redefine methods at runtime just because you can
<whitequark>
this requires you to use JIT in your implementation, and even with JIT, some things become way more expensive than they should be
<whitequark>
arithmetic, for example
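The redefinition problem described above can be shown in a few lines; the class and method names are invented:

```ruby
# Any method can be redefined at runtime, so a compiler may only inline
# a call if it can prove the definition stays stable.
class Greeter
  def hello; "hi"; end
end

g = Greeter.new
first = g.hello    # a naive compiler might want to inline this call...

class Greeter
  def hello; "hello again"; end   # ...but the method just changed
end

second = g.hello
puts first    # prints "hi"
puts second   # prints "hello again"
```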
<paul_boddie>
What you wrote on your blog looked a lot like what someone wrote for Python, but that was more about evaluating the program expressions (the guy didn't say exactly how he did it), and then producing a simplified program. But I guess the two ideas are equivalent as he could easily have been describing a simplified generated program.
<whitequark>
well, that's how optimization works :)
<whitequark>
the whole point is to produce an IR which can easily be operated on
<paul_boddie>
Yes, but it's the assumptions that underlie the optimisations that are the important part.
<whitequark>
Rubinius produces LLVM IR, which is easily manipulated with LLVM
<whitequark>
but there isn't a way to say to LLVM: "hey, this function will only ever receive Class or Module arguments, optimize for them"
<whitequark>
basically the same with Java
<whitequark>
and JRuby
<paul_boddie>
The trend in various circles is to not bother with guaranteeing anything before the program is run but instead relying on run-time observations.
<whitequark>
it's a time-memory tradeoff
<paul_boddie>
I don't necessarily agree with that at all, mostly because, as you say, that has a lot of overhead and rules out 8K RAM environments.
<whitequark>
runtime observations pay for themselves when you have unlimited memory
<whitequark>
i.e. JVM heap reaches obscene sizes
<paul_boddie>
What interests me a lot more is the analysis, and not just for the optimisations.
<whitequark>
yes. there are more interesting things to do when you have this IR
<whitequark>
basically this IR is a normalized representation of Ruby
<whitequark>
without all the complexity of its syntax, and with ease of manipulation
<paul_boddie>
It's interesting that in your blog, you start with bytecode and derive something like an AST. Why not start with the AST?
<whitequark>
it was over a year ago, and I thought that Rubinius' bytecode would suit my needs as a source better
<whitequark>
it's not
<whitequark>
I'm using ASTs now
<whitequark>
this is a problem in itself. There are at least five separate Ruby parsers, and they either suck at completeness, output format, or runtime requirements.
<whitequark>
s,either,two of the three,
<whitequark>
my implementation also has a truly meta-circular architecture
<whitequark>
i.e. it's a Ruby VM written in Ruby
<whitequark>
if I wanted to write a fully-fledged, general-purpose implementation (which I don't), I could eventually just compile it with itself
<whitequark>
you won't get most of the runtime optimizations, but as Squeak's history shows, it could still be beneficial in itself
<paul_boddie>
Heh! At last, someone actually writes a Ruby VM in Ruby instead of *claiming* to do so. :-)
<whitequark>
hehe
cladamw has quit [Quit: Ex-Chat]
<whitequark>
it would still have a separate GC because I cannot do everything, but honestly I don't consider automatic memory management to be part of a programming language implementation
<whitequark>
it should have become an OS routine years ago
<lindi->
whitequark: hmm, if you make explicit assertions for the arguments then llvm could do compile-time optimizations too?
<whitequark>
lindi-: 1) it requires you to have the aforementioned IR to derive the assertions
<whitequark>
but I probably won't have time to play with the board
<whitequark>
I'm not a SIMD guy really :)
kristoffer has quit [Quit: Leaving]
<larsc>
I think their price tag is very ambitious.
<whitequark>
indeed.
<larsc>
the ZED board on which they want to base their design already costs twice as much
<wpwrak>
lindi-: at least the core is something sane. when playing with the psoc 1, i had to use assembler. and i had to write my own assembler, too. http://m8cutils.sourceforge.net/