I'm a research engineer at Mozilla working on the Rust compiler. I have history with Firefox layout and graphics, and programming language theory and type systems (mostly of the OO, Featherweight flavour, thus the title of the blog). http://www.ncameron.org @nick_r_cameron
Tuesday, December 21, 2010
Things I'll miss about Wellington
People's Coffee - excellent coffee, even by NZ standards
The Engine Room - a great place for climbing training
The Rak - not the best climbing spot on earth, but the one I've spent the most time at, and I've grown to really like some of the problems
Prana - vegetarian cafe in Newtown - awesome
Cafes in general - including Sweet Mother's, Baobab, Cafe Deluxe, Midnight Espresso, Espressoholic, Fidel's, Lido, O Sushi
The art gallery (I went to Roundabout the other day and it was excellent, as have many other exhibits)
The Embassy - such a cool cinema
Our house - it might be small, but it is lovely, it is my wife's and my first home together, and it has an amazingly sunny deck
Two surprisingly good hip-hop nights at the San Fran Bathhouse (Jean Grae, Talib Kweli, Pharoahe Monch, Mr Len)
The Powerhouse - surely the best place on earth to lift weights (no frills, all hardcore)
Bars - Havana, Monterey, Good Luck, Southern Cross (some weird and cool live music nights), Alice, Motel
All my friends and colleagues - of course!
Links in JOT posts
JOT blog repost: OOPSLA day 3 (finally)
Homogeneous Family Sharing - Xin Qi
Xin talked about extending sharing from classes to class families in the J& family of languages. Sharing is a kind of bidirectional inheritance, and is a language-level alternative to the adapter design pattern. The work includes formalism, soundness proof, and implementation using Polyglot. Dispatch is controlled by the view of an object, the view can be changed by a cast-like operation.
I didn't quite get shadow classes, but I think they are like further bound classes in Tribe.
Finally, their families are open, as in open classes, so the programmer can add classes to families post hoc.
Mostly Modular Compilation of Crosscutting Concerns by Contextual Predicate Dispatch - Shigeru Chiba
Shigeru presented a halfway language between OOP and AOP called GluonJ. The idea is that it should be a more modular version of aspects (I think). However, it was found to be not as modular to check and compile as standard OOP. The language supported cross-cutting concerns with predicate dispatch and an enhanced overriding mechanism.
Ownership and Immutability in Generic Java - Yoav Zibin
Yoav talked about work that combined ownership and immutability in a single system using Java's generics. It is nice work, but I'm afraid I was too busy being nervous about being next up to write any notes.
Tribal Ownership - Nick Cameron (me!)
I talked about work with James Noble and Tobias Wrigstad on using a language with virtual classes (Tribe) to support object ownership (i.e., ownership types without the extra type annotations) for free (that is, no additional programmer syntax overhead). I really like this work, it all seems to come together so neatly, which I find pretty satisfying. I really do think virtual classes are extraordinarily powerful and yet easy enough for programmers to understand. Hopefully, they'll make it into a mainstream language before too long...
A Time-Aware Type System for Data-Race Protection and Guaranteed Initialization - Nicholas Matsakis
Nicholas introduced a language (Harmony) where intervals of 'time' are represented in the type system to make the language time-aware. This can be used to prevent race conditions in concurrent programs and for other uses (including some non-concurrent ones), such as allowing new objects time to establish their invariants. Intervals are scoped and an ordering may be specified by the programmer; the runtime or compiler may reorder execution subject to this ordering. Checking is modular and is flow insensitive.
Wednesday, November 17, 2010
OOPSLA day 2 (belated) (JOT repost)
NOTE: I've come back to my notes about the last two days of OOPSLA; it's two weeks since the conference ended, and my memory is already kind of hazy, so the quality of these last two posts might be less than ideal... And another week and a half passed before I even finished the first of them; still, better late than never, eh?
Creativity: Sensitivity and Surprise - Benjamin Pearce
Benjamin gave the oddest invited talk I've ever seen. He talked about various aspects of creativity over a large set of photographs, including some of his own. The photos were beautiful and made a great show. Not entirely sure what it has to do with programming, languages, systems, or applications, except at the most abstract level. Still an interesting talk, and it seemed to go down very well with the audience too.
Specifying and Implementing Refactorings - Max Schaffer
Automatic refactoring is popular, and correct in the common cases, but specifications are imperfect. The current `best practice' (e.g., in Eclipse) is to use a bunch of preconditions, but this is not ideal for automatic tools because it is difficult to identify all necessary preconditions; so refactoring sometimes fails, even if all the preconditions are satisfied.
The authors previously suggested specifications based on dependencies and breaking refactorings down into smaller pieces. In this work, they show that this idea actually works for proper refactorings. The dependencies are static and semantic, e.g., constraints on synchronisation and name binding. The authors specified and implemented 17 refactorings.
What can the GC compute efficiently? - Christoph Reichenbach
Christoph presented a system which checks assertions when the garbage collector is run. These assertions are about objects and relations between objects in the heap. This is a pretty efficient way to check heap assertions because the heap must be traversed anyway to do GC. There is a single-touch property - i.e., each assertion can only touch each object once - so checking the assertions is very fast. Their assertion language can describe reachability, dominance, and disjointness, and assertions can be combined with the usual logical operators. Interestingly, garbage collection must be re-ordered to check for reachability and dominance.
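I can't reproduce their assertion language here, but here is a toy sketch (all names and structure are mine, not theirs) of the piggy-backing idea: a mark-style traversal visits each live object exactly once anyway, so a reachability query can be answered as a side effect of that single pass.

```python
def mark_and_check(roots, assertions):
    # Toy mark phase: trace the heap once from the roots and, along the way,
    # answer "is this object reachable?" queries without touching any object twice.
    # assertions: list of (label, target_object) pairs.
    reached = set()
    stack = list(roots)
    while stack:
        obj = stack.pop()
        if id(obj) in reached:
            continue
        reached.add(id(obj))                       # each object is visited only once
        stack.extend(getattr(obj, "children", ()))
    return {label: id(target) in reached for label, target in assertions}


class Node:
    def __init__(self, *children):
        self.children = list(children)

leaf = Node()
root = Node(Node(leaf))
orphan = Node()
print(mark_and_check([root], [("leaf live", leaf), ("orphan live", orphan)]))
# {'leaf live': True, 'orphan live': False}
```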
Type Classes as Objects and Implicits - Bruno Oliveira
This work `encodes' Haskell type classes in Scala using generics and implicits (the latter being a Scala feature that enables the programmer to omit some parameters). My understanding of the work was that type classes can be done using only generics, but implicits are required to make the `encoding' usable by a programmer. There is a whole lot of other complex-Scala-type-system stuff - I have notes about type members and dependent method types, but I can't remember why...
The interesting thing is that you end up with a really, really powerful language feature: as well as type classes, you can encode session types, which I find incredible (although according to the paper, you can do this with Haskell type classes).
Supporting Dynamic, Third-Party Code Customizations in JavaScript using Aspects
The authors are motivated by the popularity of JavaScript, both on the web and for customising browsers. Such scripts typically rely heavily on code injection, that is, inserting new code into existing scripts. This is a pretty ugly process all round - it's as non-modular as you can imagine and implemented in totally unchecked and unsafe ways (mostly monkey patching). The authors propose doing it with aspect-style weaving instead, although, apparently, they claim it's not really aspects. Weaving is done by the JIT. Their empirical results show that their approach is sufficiently expressive for most uses.
Saturday, November 06, 2010
Papers
I still have a couple of blog posts to write up about the last two days at OOPSLA, these should be coming shortly (I was on holiday for a week after OOPSLA and have been kind of busy since I got back).
Thursday, October 21, 2010
OOPSLA day 1 (JOT repost)
Registration-Based Language Abstractions - Samuel Davis
Samuel presented a method for adding language constructs to a language. These constructs are outside of the language, but also outside of the source code, so each programmer can have their own personal version of a programming language and the tool will present code using the right constructs. It seems like a very sophisticated macro system to me, but with better tool support (I don't mean this in a derogatory way, the system is obviously more powerful and useful than macros, I just mean it as a simile).
I attended, enjoyed and found interesting two talks - Pinocchio: Bringing Reflection to Life with First-class Interpreters presented by Toon Verwaest, and Lime: A Java-Compatible and Synthesizable Language for Heterogeneous Architectures presented by Joshua Auerbach. I'm afraid I can't say much about either of them, but they were good talks and I'll try to read both papers.
From OO to FPGA: Fitting Round Objects into Square Hardware? - Jens Palsberg
A talk on compiling high-level languages to FPGAs; the challenge is to compile a standard OO program to an FPGA. Currently, code written in a small subset of C can be compiled to FPGAs, but hand-coded FPGA code is better (faster, less area, lower energy consumption). The general technique presented is to compile from Virgil to C and then to FPGAs. Unfortunately, the C subset is so small (no pointers, etc.) that objects cannot be compiled in the usual way.
The authors used a mix of existing compilation techniques with some new ideas of their own. Essentially they compile objects to sparse integer arrays, but must then expend a lot of effort in compressing these tables.
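The paper has the details of their scheme; purely as a hedged illustration of the kind of thing involved, here is one classic way to compress sparse dispatch rows (row displacement): slide each sparse row along a shared array until its occupied slots fall into free holes.

```python
def row_displace(rows):
    # rows: one sparse dict per class, mapping method index -> implementation name.
    # Returns a single shared table plus an offset per class; lookup for class i,
    # method m is table[offsets[i] + m].
    table, offsets = {}, []
    for row in rows:
        offset = 0
        while any(offset + col in table for col in row):
            offset += 1                      # slide right until there is no collision
        for col, impl in row.items():
            table[offset + col] = impl
        offsets.append(offset)
    return table, offsets

classes = [
    {0: "A.foo", 3: "A.bar"},
    {1: "B.baz"},
    {0: "C.foo", 2: "C.qux"},
]
table, offsets = row_displace(classes)
print(offsets)                               # [0, 0, 2]
print(table[offsets[2] + 0])                 # "C.foo"
```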
They have experimental results which show slightly better performance for their tool chain than for the hand-tuned version (for the non-OO case). In the OO case, it is harder to compare (no-one else has done it), but by interpreting performance results from CPU execution, they reason that their tool gives good results here too.
An interesting challenge which emerged in the questions is producing an intermediate language for compilation to FPGAs that preserves parallelisation, as opposed to C, which 'flattens' away any parallel code into sequential code.
Panel - Manifesto: a New Educational Programming Language
For the last session of the day, I attended the panel session on a proposed new programming language, aimed at first year university students. The language is called Grace (http://gracelang.org); it is proposed to be a community effort, with a semi-open development process, and this panel was an effort to get the community behind it. Grace will be a general purpose (as opposed to domain specific) language, designed for novices (so no fancy type system), and designed for programming in the small (so no fancy module system). It will not be industrial strength, therefore it will not need to be backward compatible, and should have low overhead for small programs (no "public static void main").
The proposers argued that the time is right: Java will be good for the next few years, but is getting big and a bit long in the tooth. Alex Buckley (Java "theologist", also on the panel, but not associated with Grace) did not disagree, but did say that Java would have a lot of the features discussed in a few years time (which means it might not look so old but will be even bigger).
The proposers (James Noble, Andrew Black, and Kim Bruce) have ambitious goals: Grace should be multi-platform, multi-paradigm (it should support teaching with or without objects, with or without types, in a functional or procedural style), it should be useful for teaching first and second years how to program, and for data structures courses. With Smalltalk never far below the surface, it was declared that everything would be an object, although it was not stated what was meant by "everything". The proposers proposed that Grace have a powerful extension/library system for adding in things like concurrency, basically because we don't know the best way to do concurrency right now. This seems a big ask: one thing the concurrency community mostly agrees on is that concurrency cannot be added on afterwards, it must be baked in holistically.
It sounds to me like a great idea - an academic, community-based teaching language should be much better suited for purpose than a professional programming language. But, to be honest, the session did not have very much buzz. The panel itself was obviously excited about the project, the audience less so. There were no great questions from the floor, or any really exciting debate. The lengthiest discussion was about the relative merits of the PLT group's book/language/curriculum. On the other hand, no one really disagreed that there was a gap in the market for such a language. I'm interested to find out if the proposers got encouraging words after the session. (Disclaimer: I skipped the last half hour to attend a research talk, so the session might have lit up then and I would have missed it.)
Tuesday, October 19, 2010
Dynamic Languages Symposium (DLS) (JOT repost)
Almost unbelievably, the wifi works really well. It's been a while since I've been at a conference with really good wifi, but the Nugget seems to have cracked it, fingers crossed that it'll last.
Invited talk - Smalltalk Virtual Machines to JavaScript Engines: Perspectives on Mainstreaming Dynamic Languages - Allen Wirfs-Brock
The theme of this talk is how to get dynamic languages into the mainstream. The talk started well with some interesting general points on implementing dynamic languages, and ended well with some observations on the current generations of Javascript interpreters, but most of the talk was a retrospective of Smalltalk.
An early point was that performance is important for dynamic language uptake. As much as language designers and programming guides state that design (of the language or program) must come first, if a language's runtime performance is not good enough for a given task, it will not be used. Another early point was that virtual machine implementors got blinded by the metaphor - a VM is not a machine, it is a language implementation, and must be coded like one.
Allen gave an impressive Smalltalk demo, running Smalltalk from 1984 on a machine of the day, which practically ran on steam. It was an interactive graphical demo and the performance was very impressive (in fact, the computer was pretty high powered for the time, but it was impressive nonetheless).
More of Allen's observations: holistic design gives high performance; tweaks to an existing VM will not get you very far (therefore, he is not a fan of trying to extend the JVM to dynamic languages); optimising fast paths is fine, but don't let the exceptional paths get too slow, as it is probably these that make the language special; methodologies are required to enable language adoption.
Most of the Smalltalk stuff was fairly typical, but the analysis of its death was more interesting: Smalltalk was going all guns blazing in '95, but was effectively dead by '97. His analysis was that Smalltalk was a fad, never a mainstream language (which sounds right to me; not that it was a bad language, mind, but its influence in academic language research seems much higher than its influence in real life). One reason for this demise is that the 'Smalltalk people' expended way too much energy on GUI systems that nobody actually used, and not enough energy on real problems.
Another interesting analysis was on why Java succeeded, reasons given included: familiar syntax, conventional tools, the network effect, etc. It seems to me that people always try to find excuses for Java's success (although those points are obviously true); maybe Java was actually a good language that fit the needs of the time better than other languages?
A slight tangent was that Java is essentially a manually dynamically typed language; that is, casts are manual dynamic typing.
We then got back into the good stuff. Javascript was assumed to be inherently slow, then V8 (Google) showed that Javascript could be fast. Fast Javascript is important for the web, which means computing in general nowadays. You only need to look at the competition between browsers to see that Javascript performance is important. This reminded me that I think that Javascript engines are possibly the coolest and most interesting language engineering happening at the moment, and sadly it is not happening in academia (Allen suggested we need a research browser, which would be nice, but seems unlikely to come about).
Some of Allen's observations on Javascript VMs: most teams are still on their first or second tries (earlier in the talk, Allen stated that it takes three goes to get good at VMs) - things are going to get much better; performance is still low compared to Smalltalk in '95(!); Sunspider is not a great benchmark and is holding back proper evaluation and development; Javascript is harder to make fast than Smalltalk (because it is more dynamic), so new ideas are needed to get more speed; Allen wondered why all the Javascript VMs use so much memory; the Javascript engine needs to be part of an holistic browser design to get good performance; Javascript seems to be the mainstream; Javascript performance is at the beginning, not the end.
The talk ended by reminding us that ambient/ubiquitous computing is here, and suggested that dynamic languages were going to be part of that era. He didn't explain why, however.
Meanwhile, at PLATEAU - GoHotDraw: Evaluating the Go Programming Language with Design Patterns - Frank Schmager
Frank Schmager presented work he has done with James Noble and me on evaluating the Go programming language using design patterns. This is a novel idea for evaluating programming languages, and hopefully a new tool in the language evaluation toolbox. Apparently he gave a very good talk, go Frank! (sorry for that pun)
PLATEAU invited talk - The Fitness Function for Programming Languages: A Matter of Taste? - Gilad Bracha
For the second session, I attended Gilad Bracha's invited talk at PLATEAU. Gilad always gives interesting and entertaining talks, and this was no exception.
"There are two kinds of languages - those that everyone complains about and those that aren't used"
--- Bjarne Stroustrup
Gilad's talk was about how to judge a language. He argued that there was more to a language's success than popularity, that in fifty years' time we will look back and certain languages will be admired, and others won't. Furthermore, success is really mostly down to the network effect (or the sheep effect, as Gilad called it); and the most successful languages follow the Swiss Army Knife approach (aka the kitchen sink approach, aka the postmodern approach) to language design, which, it is generally agreed, is not elegant or 'great'. So is it possible to define what makes a language great? Or is it just a matter of taste?
There were a couple of tangents on parser combinators and first-class pattern matching in Newspeak (Gilad's language project).
Some criteria for greatness (or lack of it) were suggested: how much syntactic overhead is there in the language (such as unnecessary brackets, semicolons), does the language force attention on the irrelevant (e.g., on the low level in a non-systems language), how compositional is the language. Gilad asked if languages can be judged as theories (where programs are models of the theory), criteria here were consistency, comprehensiveness, and predictive value (which for languages means how hard is it to write a given program).
An interesting observation was that the most common actions in a language have no syntax (or actually no syntactic overhead), e.g., function application in functional languages, method call in Smalltalk.
Another observation on evaluating languages is that we often try to measure how hard a language is to learn. Gilad argued that ease of learning is not an indicator of greatness. He used Newton's calculus as an analogy - it is widely hated for being difficult to learn, but is truly a great contribution to science.
Finally, Gilad stated that good aesthetics makes good software, that strict criteria for evaluating a language are not ideal, and that quality is more important than market share.
There was a big debate in the question session afterwards, covering how to judge a language, how to review programming language papers and the review process in general, cognitive theories, and even maths vs. art.
Proxies: Design Principles for Robust Object-oriented Intercession APIs - Tom Van Cutsem
Back at the DLS, Tom talked about implementing intercession (that is, intercepting method calls) in Javascript. Unlike some scripting languages, this is surprisingly difficult in Javascript. He described a technique to do it using proxy objects.
Contracts for First-Class Classes - T. Stephen Strickland
The last talk of the day was on contracts in Racket, specifically on and using first-class classes. (By the way, Racket is the new name for PLT Scheme, I assumed I was the only one who didn't realise this, but a few other people admitted their confusion later. Why did they change the name?) Not being a Lisp fan, I found the examples very hard to read - too many brackets! Anyway, Stephen (or T?) described Eiffel-style contracts (pre- and post-conditions). These can be added using first-class classes (in the same way that methods can be added to classes). He showed that object contracts were still required in some circumstances (as well as class contracts), and showed how these were implemented using class contracts on new classes and proxies to make the old and new classes work together.
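I won't try to reproduce the Racket machinery, but the flavour of Eiffel-style pre- and post-conditions attached to methods can be sketched in Python with a decorator (a rough analogue of wrapping a class in a contract, not Racket's first-class-class mechanism; all names here are mine).

```python
import functools

def contract(pre, post):
    # Attach an Eiffel-style pre- and post-condition to a method.
    def decorate(method):
        @functools.wraps(method)
        def wrapper(self, *args):
            assert pre(self, *args), f"precondition of {method.__name__} violated"
            result = method(self, *args)
            assert post(self, result), f"postcondition of {method.__name__} violated"
            return result
        return wrapper
    return decorate

class Account:
    def __init__(self):
        self.balance = 0

    @contract(pre=lambda self, amount: amount > 0,
              post=lambda self, result: result >= 0)
    def deposit(self, amount):
        self.balance += amount
        return self.balance

acc = Account()
print(acc.deposit(10))   # 10
# acc.deposit(-5)        # would raise an AssertionError from the precondition
```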
Monday, October 18, 2010
FOOL (JOT repost)
DeepFJig - Modular composition of nested classes - Marco Servetto
FJig is a formalisation of the Jigsaw language, which focuses on mixins and 'class' composition. This work extends FJig with an extension to nested classes. I believe that virtual nested classes are the most significant change to the OO paradigm currently being proposed. Being able to abstract and inherit at the level of class families, rather than just individual classes, solves an awful lot of problems with the usual class model. Thus, I'm happy to see another take on the idea.
From what I gathered, DeepFJig only supports 'static' nesting of classes; this makes the type system simple (no path dependent types or exact types required), but at the expense of a lot of the benefits of virtual classes. The interesting thing here is that by combining nested classes with the mixin composition operators found in FJig, you get a great deal of expressivity - Marco showed how to encode generics (in a virtual types style) and aspects (yes, aspects, as in AOP). The latter probably means that the language is too expressive for some software engineering purposes, but it wasn't clear from the talk how much you can sabotage classes, as you can usually do when using aspects.
Lightweight Nested Inheritance in Layer Decomposition - Tetsuo Kamina
Another nested classes paper, this time extending the 'Lightweight Nested Inheritance' (J&) concept. LNI uses paths of class names and exact class names to identify types. This work extends that with generics so that types can be more precisely specified. However, it seems that you only bite back some of the expressivity lost when compared with systems such as VC and Tribe (which have dependent path types). So, it is a slightly more heavyweight lightweight version of nested class types. The interesting aspect is that type variables can be used as type constructors for path types, but not for generic types, i.e., X.C is OK, but X<C> is not.
Mojojojo - More Ownership for Multiple Owners - Paley Li
Paley presented work he has done with James Noble and me on Multiple Ownership type systems. Multiple Ownership was proposed at OOPSLA '07, but the formalisation was unwieldy. This work presents a simpler, more elegant, and more powerful formalisation of the Multiple Ownership concept.
Traditional ownership type systems give each object a single owner; this organises the heap into a tree, which is great for reasoning about programs. Unfortunately, real programs rarely fall nicely into a runtime tree structure, so more flexible ways to organise the heap are required. Multiple Ownership allows each object to be owned by multiple owners, thus organising the heap into a DAG.
Mojojojo (if you don't get the name, Google for the Powerpuff Girls) adds a powerful system of constraints over the structure of the heap, generics, existential quantification, and a host of small improvements to the formal system, resulting in something a lot nicer than MOJO. Paley gave a great talk, and I recommend you all read the paper (totally unbiased opinion, of course :-) ).
Interoperability in a Scripted World: Putting Inheritance & Prototypes Together - Kathryn E. Gray
More work on integrating typed and untyped languages, which seems to be very fashionable right now. This work focuses on making Java and Javascript work together, rather than focusing on type checking. The most interesting bit is making prototyping and inheritance work together in the same world.
I'm sorry I cannot write more about this paper, because it sounds really interesting, but I was a bit distracted at the beginning of the talk, and never quite got back into the flow. I'll be reading the paper later though...
Adding Pattern Matching to Existing Object-Oriented Languages - Changhee Park
Changhee talked about adding pattern matching to Fortress (which reminds me to check on what is happening with Fortress nowadays). In fact one of the more interesting bits of the talk was the generalisation - the requirements on a language such that it can support pattern matching in the way described.
The general idea of the work is to support ADT-style decomposition of types by subtype, using a typecase expression and function parameters, and decomposition of objects into their fields, similarly to how tuples are decomposed in Python etc. What I thought was missing was a discussion of whether or not you would actually want to do this: you are breaking object-based encapsulation, which most languages avoid.
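For a feel of the field-decomposition style being proposed, Python has since grown a match statement that breaks objects into their fields in roughly this spirit; a rough illustration of my own, not Fortress syntax:

```python
from dataclasses import dataclass

@dataclass
class Point:
    x: float
    y: float

def describe(shape):
    # Decompose an object into its fields by matching on its class,
    # much like unpacking a tuple.
    match shape:
        case Point(x=0, y=0):
            return "the origin"
        case Point(x=x, y=y):
            return f"a point at ({x}, {y})"
        case _:
            return "something else"

print(describe(Point(0, 0)))    # the origin
print(describe(Point(1, 2)))    # a point at (1, 2)
```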
Sunday, October 17, 2010
Registration
Looking through the program is pretty exciting, there seems to be a lot of good-sounding papers and invited talks. The organisers also seem to have managed the scheduling well - despite three concurrent sessions at most times, there is not a single clash between talks I'd like to attend; Thursday's invited talk does seem to clash with lunch, however, not sure how well that is going to work out.
First Impressions of Reno and OOPSLA/SPLASH (JOT repost)
It should be an interesting year for OOPSLA: it has undergone re-branding from OOPSLA to SPLASH (a re-arrangement of the OOPSLA letters, minus OO (because who programs with objects any more?), and appended with "for Humanity" (cringe)). The research paper selection process has changed too: they introduced `white-ball' papers (each member of the PC can select one paper to be accepted without argument), and there were slightly more papers accepted than in previous years (including mine, so I can't really complain; Thursday afternoon, if you're interested). The payment structure has changed too: you have to register and pay for individual workshops; I can't comprehend why - the best thing about workshops is wandering between them.
Anyway, after twenty-odd hours on a plane from NZ, we started our descent into Reno; we got a bird's-eye view of the Nugget (the conference venue and hotel) as we came in - sandwiched between the expressway and a railway yard, it did not look good. Reno airport was like a gateway into hell, slot machines everywhere and a backdrop of billboards for "gentleman's clubs".
The conference venue is almost comically grim. The main floor is a sea of slot machines and haggard looking people. There are a lot of cowboy hats around, and not in an ironic way. No-one looks happy to be here, mostly people look desperate, or just plain chewed up. People smoke a lot, indoors, which seems a bit odd in 2010. There is a patched motorcycle gang drinking in the lobby (seriously, this is not an exaggeration).
If I had to describe Sparks, and the Nugget, in a word, it would be "grim". I don't think I have ever been so disappointed in the location of a conference. I hope (and expect) the conference itself to be excellent, it will have to be to justify enduring this place for a week. On the bright side lots of interesting people are arriving, and the free wifi at Starbucks has become a natural hub...
OOPSLA report
Thursday, August 19, 2010
More writing
Also, back to teaching again in two weeks. Teaching type systems again for six weeks. The first time I've taught the same stuff again, so hopefully I will be better and more polished, and it should be less work.
Monday, August 02, 2010
On advertising, the software industry, and Bill Hicks
I watched "American: The Bill Hicks Story" at the weekend. It was an excellent film and well worth watching, especially if you are a Bill Hicks fan. How is this relevant to programming languages? It's not. But it is vaguely relevant to the state of the software industry, bear with me...
One of Bill Hicks' 'things' is with marketing and advertising; see, for example, the above quote. Now I wouldn't quite agree with that, but I do agree that advertising and marketing make my life considerably worse, and I really can't stand them. I think that working in these industries cannot, morally, be justified. Whilst I am a supporter of capitalism (it might not be a great economic system, but it is definitely the least worst that anybody has come up with so far), I detest the current climate of consumerism and think it is a great detriment to most societies. Advertising/marketing is a key feeder of consumerism and thus, I don't like it.
Now the great thing about a lot of software right now is that it is free. Some is really free, made by volunteers out of the goodness of their hearts, but most is funded by advertising. The most obvious and successful example being Google. Furthermore, a whole lot of non-software information is funded by advertising, for example, most of the internet. So I can't close my eyes and wish that all the advertising in the world would go away, because then so would most of the software I use and websites I read. And companies I might one day like to work for.
So is there a moral problem with using such ad-funded software and websites? I use adblock, so of course I don't actually see any ads, and certainly never click any. So I'm kind of getting a free ride (oh, how I wish there was adblock for TV and real life). So I'm not funding the ads in any way. But this opens up a new problem: am I just freeloading on those who are less tech-savvy than I am and who do not use adblock? Well, they don't pay for the ads, even if they click them, so not in a monetary sense. Although I guess they put up with the inconvenience (who knows, perhaps they even like the ads). And at the end of the day it's a slice of the price of the goods that we all buy, so in a way there is a consumption tax that is used to fund free websites and software, which is kind of nice. And also hundreds of engineers and investors, which is not so cool. And of course, as a tax on consumption to fund information, it must be highly inefficient, since the advertisers and marketing people and managers at Google on obscene salaries all take their cut. It may still be more efficient than an actual tax administered by a government, though.
Friday, July 30, 2010
Teaching programming
Here is how I would design a curriculum if I were king (or head of programme or whatever):
1st year: assembly (8086, none of this RISC crap) and Haskell in parallel (in fact I would start with machine code and lambda calculus for the first few weeks).
2nd year: C and Python, again in parallel
3rd year: C++ and Java.
You could swap Haskell for some other pure, lazy, functional language if you like, and Javascript, Ruby, or Perl for Python, and Java for C#, and you probably don't need C++ and Java in the third year, one or the other would do.
And of course, this wouldn't work in real life --- you would scare off most of the students in the first year and it would only appeal to the very smartest and geekiest of students. But I think it is a good order: students would get some key concepts (recursion, pointers, machine organisation) early, which they can apply when they learn the later languages. The problem with learning Java first and then the fun stuff is that you don't learn your lessons that way - programming in C helps to make you a better Java programmer, but not vice-versa.
I think this organisation would give a good appreciation of real-life computers (too many students have no idea about how their Java programs relate to bits and bytes); and how to think about programming in a 'smart' way (higher order functions etc.)
A final advantage, the languages match the scale of the exercises: small programs in assembly/Haskell up to large programs in Java, just the way it was all designed.
I honestly think this would be the best way of teaching programming to good students. Unfortunately, a CS degree is about more than just programming, and students get a say in how they are taught, so it will never catch on.
Wednesday, July 21, 2010
Python
I didn't find it a hard language to learn; I guess it is designed that way, but being familiar with Groovy and Javascript probably helps. I've only written small programs, but here are some thoughts:
I love the syntax and general feel, it feels very 'right'. Most things are done the way I would do them. I like indentation much better than braces and don't miss semi-colons one bit. It is also much more readable than Perl. The only things I object to are colons to introduce block statements and pass statements rather than empty blocks; both seem to go against the minimalist syntactic philosophy, and both seem unnecessary.
In fact I love the language. It seems very good at what it sets out to do, much like C and unlike C++. There are very few rough edges, and lots of things that make you go "nice!".
I like the support for first-class functions, lambdas, map, filter, etc. Syntax for lists, slices, list comprehensions, generators, etc. I thought slices could have been used instead of the range() expression.
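A few of those features made concrete, with my own toy examples:

```python
nums = [3, 1, 4, 1, 5, 9, 2, 6]

# first-class functions and lambdas with map/filter...
evens_squared = list(map(lambda n: n * n, filter(lambda n: n % 2 == 0, nums)))

# ...or the same thing as a list comprehension
also_evens_squared = [n * n for n in nums if n % 2 == 0]

# slices, including a step, versus building an index range explicitly
first_three = nums[:3]
every_other = nums[::2]
every_other_by_index = [nums[i] for i in range(0, len(nums), 2)]

print(evens_squared, first_three, every_other)
```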
The jury is still out on yield.
Not using types is liberating in small programs. But my Groovy experience is that I quickly want to use them for documentation once the programs get larger ("just use variable names" you say; OK, I want compiler-checked documentation). Optional/hybrid types are the obvious solution. Maybe it is because I'm too used to C/Java programming.
The inefficient implementation scares the hell out of me. But, intellectually, I know it doesn't matter for 99% of applications, but still... Also I would prefer the implementation was not as obvious to the programmer (in docs as well as reflective code).
I like the duck typing approach to objects. It feels so much more lightweight than implementing interfaces. It makes me think that interfaces could be supported in a typed language much more easily. Not sure how, but something like Donna Malayeri's blend of nominal and structural types could be helpful.
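A minimal example of what I mean by the lightweight feel of duck typing (no interface declared anywhere):

```python
class Duck:
    def quack(self):
        return "quack"

class Impersonator:
    def quack(self):
        return "qu... quack?"

def make_it_quack(thing):
    # No interface to implement: anything with a quack() method will do.
    return thing.quack()

print(make_it_quack(Duck()), make_it_quack(Impersonator()))
```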
I find the way fields of classes are handled strange: adding fields when they are assigned to fits nicely with the way variables are used, but feels very uncomfortable for fields. I think this is Java/C++ conditioning at work, and I haven't written programs that make use of this facility yet, so I am reserving judgement. It may well be a good idea, but it gives me the creeps.
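This is the behaviour I mean - fields spring into existence on first assignment, inside or outside __init__:

```python
class Config:
    def __init__(self):
        self.host = "localhost"    # field created by its first assignment

c = Config()
c.port = 8080                      # a brand-new field, added from outside the class
print(c.host, c.port)
print(vars(c))                     # {'host': 'localhost', 'port': 8080}
```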
I like the 'everything is public' philosophy, although I fear that those of a more software engineering bent would be horrified.
In conclusion, I like Python a lot. Of the scripting/dynamic programming languages I've looked at, it is my favourite. I hope to have an opportunity to use it for something serious soon.
And I still have the feeling that there is a huge space for a programming language with some of the dynamic features of a language like Python, combined with a more heavyweight class system (virtual classes) and optional typing to make a very pleasant general purpose language.
Women and Computer Science
This was in part set in motion by a talk on the subject by Barbara Crump at the NZCSRSC.
Anyway, I have lots of thoughts and opinions, but mainly for now I am reading and pondering. Musings on the subject will have to wait until I feel sufficiently informed, which may take some time...
Tuesday, June 29, 2010
TOOLS Europe day 1
The day kicked off with Oege de Moor's invited talk. I was not that excited about the prospect, but it was actually a really interesting talk - his tool is very useful and does some interesting things, all whilst being language independent. And the internal language is a pretty cool combination of logic/relational query language and OO.
My talk was at 2:30, and went pretty well. I should probably have had more and better examples; I realised as I talked that I was expecting the audience to keep rather a lot in their heads - why is this so obvious when giving the talk, but impossible to comprehend when practising?
Erik Ernst followed with an interesting talk on virtual types/parametric types/their relationship. Lots more interesting virtual classes stuff to think about.
Johan Östlund finished off the session with a talk about Welterweight Java; the talk filled in a lot of background about the various -weight Java calculi that I wasn't aware of, but probably should have been. As nice as Welterweight Java looks, I hope to never have to use it - my languages are already too big, and the smaller the better: if I could work purely in the untyped lambda calculus I would.
Friday, June 25, 2010
ECOOP day 3
Gilad Bracha talked about modules as objects in Newspeak and was awarded the best paper award. More nested classes stuff, this time to support modules. Late binding of names apparently gives nested and virtual classes and mixins straight up. Abolishing the global namespace means imports have to be passed in when top level classes are instantiated. At the root of it all, classes must be passed in by the IDE or some other tool. This leads to automatic sandboxing of code.
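To illustrate just the 'imports are passed in' idea (my own toy sketch, nothing like Newspeak syntax): if a top-level class receives its dependencies at instantiation time, whoever instantiates it decides what it can reach, which is where the sandboxing comes from.

```python
class FakeSocket:
    # A sandboxed stand-in that a test harness or IDE might supply.
    def __init__(self, host):
        self.host = host
    def read(self):
        return f"<canned response from {self.host}>"

class NetworkModule:
    # A 'module as object': instead of importing a socket library from a global
    # namespace, the class it depends on is passed in when the module is created.
    def __init__(self, socket_class):
        self.socket_class = socket_class
    def fetch(self, host):
        return self.socket_class(host).read()

net = NetworkModule(FakeSocket)    # sandboxed by construction
print(net.fetch("example.org"))
```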
I liked the talk on inline caching, I love this kind of compiler optimisation stuff, it brings out my inner geek. But, it is far enough out of my area, that I need to read the paper to get the most out of that one, so no comments, sorry.
And that is the end of ECOOP for another year, now onto Malaga and TOOLS...
Wednesday, June 23, 2010
ECOOP day 1
Anyway, Shriram gave another talk about Javascript --- this time about a core calculus; interestingly, they had actually implemented the de-sugarer and operational semantics, so they could execute Javascript using their formalism.
I really wanted to see Gavin Bierman's talk on dynamic types for C# --- he is always a great speaker. Unfortunately I went on a mission to buy some new t-shirts at lunchtime (due to the relative pricing of a new t-shirt vs. having a single t-shirt laundered at my hotel) and got lost. As compensation to myself I had a delicious blueberry gelato.
I attended Andrew Kennedy's summer school session on unit types in F#; it was a good session with lots of interesting theoretical stuff about unit types (I couldn't believe there was so much to say about them). I guess, though, I would have preferred more on these interesting details (Andrew skipped quite a few slides), and a bit less on F# itself.
I would have liked to have seen the talks on "Verifying Generics and Delegates" and (especially after Shriram's plug yesterday) "Recency Types for Analysing Scripting Languages", but these clashed with the summer school. This was a shame, I believe they could have scheduled this better (there are other clashes later in the week too).
Tuesday, June 22, 2010
Worst wifi connection ever
FTfJP invited talk - "Electrifying Javascript"
Actually uses flow analysis, but a simple (and intra-procedural) kind, looking only at tags. The type system uses tagchecks, and the flow analysis inserts tagchecks.
A slogan: "Types on the outside, flows on the inside".
The type checker assumes function parameters are annotated with types. The final piece of the puzzle is that they dynamically infer these types, which can then be used for type checking (!)
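I'm guessing at the mechanics, but the 'dynamically infer the annotations' step might look something like this toy sketch: run the code, record the tags that actually flow into each parameter, and hand those to a later checking pass.

```python
import functools

observed = {}   # function name -> set of argument tags seen at runtime

def record_tags(fn):
    @functools.wraps(fn)
    def wrapper(*args, **kwargs):
        observed.setdefault(fn.__name__, set()).update(type(a).__name__ for a in args)
        return fn(*args, **kwargs)
    return wrapper

@record_tags
def area(width, height):
    return width * height

area(3, 4)
area(2.5, 4.0)
print(observed)   # {'area': {'int', 'float'}} -- tags a checking pass could then assume
```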
The whole process seems to work pretty well - good results, but over a relatively small code base.
[Note to self: reasonable plug for recency types, tomorrow]
Correction
maspeghi
Towards a Semantic Model for Java Wildcards
I think we got a lot of interest, and certainly we had some good, interested questions.
ECOOP 2010
Maribor is a very nice city, and I find it quite nice to be somewhere 'foreign' - different language, different architecture, etc. Quite the change after a year in Australasia.
There has been quite a lot of grumbling about the location - too hard to get to, too hard to get around, not enough organisation. And some of these comments are fair, but I found the journey not too stressful, and the organisers are pretty organised (although some info has been pretty late in arriving). And I think it is great to have the conference somewhere less usual; it is a good opportunity to see a lovely country which I would otherwise never have visited.
My hotel is also lovely, particularly nice is the wide variety of saunas in the basement - it's a miracle I've made it to the conference really. What is less lovely is that access is by gondola only, once an hour until 10 - which is pretty restrictive.
Finally, I was not expecting many people to be here, from what I heard; but actually, there are many people I know, and it is very nice to catch up...
Tuesday, May 25, 2010
Papers
These papers should be appearing shortly on my research page.
Thursday, May 13, 2010
Impact
Robert O'Callahan (Mozilla) gave a talk at Vic recently (blog link), in essence, on the impact of one's work in the world. I missed the talk, but read the slides and had a chat with Rob about this, and it got me thinking. A lot.
What impact does my work have? Well as a theoretical researcher, I'm used to the answer being none. And I've been pretty comfortable with that, I get satisfaction from the intellectual challenge, rather than the real-world impact. If one of my ideas trickles down into something useful one day, then that is a bonus.
But, thinking about this a bit more, impact is like risk - it's a two-dimensional idea. In the case of impact, there is magnitude (how many people, how much you impact each person) and 'positivity' (I can't think of a better name; I mean how positive the effect is). I've always thought in terms of magnitude, implicitly assuming I would be making things better, not worse (no intention of doing research for the military, for example). But I think it is important to think in terms of both - one should strive to make one's impact better, as well as bigger.
Not sure what any of this means in practical terms, but it is a musing in the spirit of the blog's name.
UK elections - postscript
Fingers crossed for some good things from the new boys (talking of which, it really is boys, where are all the powerful female ministers?), and hopefully proper voting reform at some stage...
Thursday, April 29, 2010
UK Elections
Who to vote for? No-one looks convincing. Labour are tired and have made a total mess, plus there is that unpleasant Tony Blair taste left over. The size of the British state is getting ridiculous and the infringements on civil liberties frightening.
Voting Tory would be like selling one's soul to Satan (remember the '80s?), plus Cameron looks like Blair 2.0. The Liberal Democrats have some good policies (civil liberties, immigration) and actually have a chance of getting some votes this year, but have some stupid policies too (abolishing tuition fees, no nuclear power, etc.). Perhaps a Lib Dem/Tory coalition might work. What we really need is a proper liberal party - in both economic and social terms. And one with the guts to radically reform the health service (take power away from the GMC, reform doctor training and the doctors'/GMC monopoly) and education (let universities charge what they like, abandon the stupid target of getting stupid people through universities, fix primary/secondary schools a bit (although God knows how to do that)).
So The Economist is probably right: as repulsive as it seems, the Tories are probably the best bet for a vote. Now that is a sorry state of affairs; glad I don't actually live in the UK right now.
Wednesday, April 14, 2010
You wait months for a good seminar...
Following Monday's seminar, Robert was scheduled to talk again on Tuesday, this time at the NZCSRSC (student research conference). It sounded like an interesting talk - "How to change the world" is not a modest title. And having perused the slides I am very disappointed not to have attended - it looks like there were a lot of good things, and thinking about your work in terms of impact is an interesting perspective, and one which I have not thought of since working in PLT.
Anyway, I couldn't attend because I was attending another seminar, this one from Rustan Leino on the Dafny language/verifier and other verification stuff going on at Microsoft Research. It was a very interesting seminar with a very good demo of verification in action. A pretty impressive number of bugs was caught and all proofs were done automatically - it was kind of like magic! It really addressed the problem I have with a lot of these things, which is that no-one wants to prove their programs. The annotation overhead is still quite high, and I think only a minority of programmers will be able to write the specifications. But, with a little practice, I can imagine writing code with specs in Dafny in a practical way.
Rustan also summarised some of the other verification efforts he is associated with, and they have achieved some pretty cool stuff.
"Dynamic framing" sounds cool, and I should try and understand it better.
Overall, I got the feeling that right there was the future of 'safe' programming languages, or at least a sneak peek of it. And, since it's at Microsoft, it actually has a fighting chance of making it into the real world!
Monday, April 12, 2010
Who would have thought...
One nice thing is that short iterations seem to work better for standards too, just like software. I think this is one of those fundamental computer science truths, kind of like "you can solve any problem with an extra indirection": for any task, shorter iterations are better.
Also, an interesting parsing problem - "herebedragons": it's obvious what is intended, but how should you parse it? And make a tree that can be manipulated by Javascript etc. And we all thought syntax had been solved thirty years ago. In fact, the web seems to be a whole new world of interesting computer science problems (see the post on Javascript VMs). Now what would be an interesting research topic?