There are a couple of papers I've been reading about fusing functional programming and the relational model, along with the influential "Can Programming Be Liberated from the von Neumann Style? A Functional Style and Its Algebra of Programs".
There's also Dijkstra's "On the cruelty of really teaching computing science". Firstly, he states that people are by nature conservative and often use previous experience as the basis for learning something new: "...radical novelties are so disturbing that they tend to be suppressed or ignored, to the extent that even the possibility of their existence in general is more often denied than admitted."
Computing really is quite different, yet people continue to use the wrong metaphors and analogies: "The practice is pervaded by the reassuring illusion that programs are just devices like any others, the only difference admitted being that their manufacture might require a new type of craftsmen, viz. programmers. From there it is only a small step to measuring "programmer productivity" in terms of "number of lines of code produced per month". This is a very costly measuring unit because it encourages the writing of insipid code, but today I am less interested in how foolish a unit it is from even a pure business point of view. My point today is that, if we wish to count lines of code, we should not regard them as "lines produced" but as "lines spent": the current conventional wisdom is so foolish as to book that count on the wrong side of the ledger.
Besides the notion of productivity, also that of quality control continues to be distorted by the reassuring illusion that what works with other devices works with programs as well. It is now two decades since it was pointed out that program testing may convincingly demonstrate the presence of bugs, but can never demonstrate their absence. After quoting this well-publicized remark devoutly, the software engineer returns to the order of the day and continues to refine his testing strategies, just like the alchemist of yore, who continued to refine his chrysocosmic purifications."
The solution is to teach students correctly: the basics are to be found not in Pascal or C or Java but in logic and mathematics: "Right from the beginning, and all through the course, we stress that the programmer's task is not just to write down a program, but that his main task is to give a formal proof that the program he proposes meets the equally formal functional specification. While designing proofs and programs hand in hand, the student gets ample opportunity to perfect his manipulative agility with the predicate calculus. Finally, in order to drive home the message that this introductory programming course is primarily a course in formal mathematics, we see to it that the programming language in question has not been implemented on campus so that students are protected from the temptation to test their programs. And this concludes the sketch of my proposal for an introductory programming course for freshmen."
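To make the "proofs and programs hand in hand" idea a little more concrete, here is a small sketch of my own in Haskell (the names sumSpec and sumProg and the lemma are mine, not Dijkstra's): the specification is written as an obviously correct definition, the program as a more efficient one, and the inductive proof that they agree is kept right next to the code.

```haskell
module SumSpec where

-- Specification: the obvious structurally recursive sum.
sumSpec :: [Integer] -> Integer
sumSpec []       = 0
sumSpec (x : xs) = x + sumSpec xs

-- Program: a tail-recursive version using an accumulator.
sumProg :: [Integer] -> Integer
sumProg = go 0
  where
    go acc []       = acc
    go acc (x : xs) = go (acc + x) xs

-- Proof obligation: for all finite xs, sumProg xs == sumSpec xs.
--
-- Lemma: go acc xs == acc + sumSpec xs, by induction on xs.
--   Base:  go acc []     = acc             = acc + 0 = acc + sumSpec []
--   Step:  go acc (y:ys) = go (acc + y) ys
--                        = (acc + y) + sumSpec ys   -- induction hypothesis
--                        = acc + (y + sumSpec ys)   -- associativity of (+)
--                        = acc + sumSpec (y:ys)
--
-- Hence sumProg xs = go 0 xs = 0 + sumSpec xs = sumSpec xs.

main :: IO ()
main = print (sumProg [1 .. 10], sumSpec [1 .. 10])  -- (55,55)
```

The point, in Dijkstra's spirit, is that the recursive structure of the program and the inductive structure of the proof are developed together, with the predicate-calculus-style reasoning driving the shape of the code rather than testing after the fact.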
2 comments:
Looks like you're coming around to the Haskell school of thought. You can see that these specific papers, as well as many of the comments you've noted, were the basis for the Haskell language.
One of the things you'll see is the statement that Haskell was created for research purposes. Don't be fooled by this into thinking that it can't be applied to real world situations:
Frag is an FPS written in Haskell.
Quake Haskell is a rewrite of Quake in Haskell (not in English, sorry).
An informal view of functional programming in software prototyping.
A pattern seems to be emerging, at least from my point of view, of where the new ideas that are influencing the mainstream are coming from.
In programming they seem to be coming mostly from functional programming, and similarly RDF/Semantic Web is influencing databases and the like.