Tuesday, March 18, 2014

Stefan Bucur's thoughts on ASPLOS'14

I am a PhD student at EPFL working on scaling automated testing techniques, such as symbolic execution, to large real-world software systems. This year, I got the chance to present my work on generating symbolic execution engines for interpreted languages at ASPLOS.

I think my peers have already done a great job of covering the event, so I won't reiterate it. I'll just mention that I was happy with how my talk went and with the great feedback I received from the people I met. As an ASPLOS first-timer, I was pleased to meet a community that fits my research area so well.

Instead, I will express here my own view on the topic debated in a session on the last day of the conference: “Resolved: Specialized architectures, languages, and system software should largely supplant general-purpose alternatives within the next decade.” I really enjoyed the discussion, and it was fun to see how it polarized the audience, to the point that the chair had to cut questions short as we ran out of time.

Since I didn't get the chance to express my opinion to the audience there, I'm doing it here: I think both the specialization and generalization camps deserve the fine bottles of wine offered as a prize, because both trends are bound to stay in computer architecture. The reason is simple: any computer system roughly operates at two boundaries: the physical boundary -- the hardware artefacts that support its existence -- and the human interface layer, in which I include both end users and developers. These two boundaries create the opposing forces of specialization and generalization.

On the one hand, physics is "heterogeneous" by nature. You can't drive a Lamborghini over a corn field -- you'd have to use a tractor; there are different types of vehicles for the different types of terrain that exist on the planet. In the same way, a good computer architecture needs to work around, and take advantage of, every physical peculiarity in order to make computation more efficient.

On the other hand, the human interface layer should be as uniform as possible, because human minds do not handle complexity well. Ideally, a developer would be happy to write the entire system in, say, Python. But in practice, there are no good compilers that turn Python into an optimal hardware+software system, and the language becomes hard to maintain at large scale. This is the point where the generalization and specialization forces collide.

These two forces collide and reconcile in any system. What differs is only the abstraction level at which this happens, and history has shown that this point shifts over time for any particular system, driven by technological trends. The question is not whether we should aim for one or the other, but how we can automatically generate optimal hardware and software that translate high-level human intent. There's a lot of research work ahead of us :)
