Hacker News

Doesn't the C compiler do all sorts of optimization magic that sometimes makes its output hard to predict? C is also not the only language that succeeded; C++ and Java are doing pretty well, and I've used them quite a lot without delving into the sausage factory of GCC/javac. Scala is doing pretty well too, with its 20-stage compilation process!

I disagree that anything trying to abstract away network communication will fail; I doubt that every engineer at Google is a full-stack expert who single-handedly sets up a massive-scale service by hand every day. More likely, they have developed a set of somewhat-reusable abstractions that they can build upon for a variety of services. Hadoop is another example where the whole network thing is abstracted away. Amazon S3/CloudFront is yet another. Difficult, but not impossible.

Perhaps Opa is trying to be the minimal portable abstraction over a full-stack service. I know GWT tried to be the same thing and failed, but one failure doesn't mean you give up, and I'm glad they're still trying.



It's true that a C compiler does that now, but it didn't 40 years ago. Platforms that start really complex don't have a hope of lasting -- they collapse before they get to the stage of old and hairy.

I think this is an important point -- systems that last have to start really simple.

The Web was laughably simple at first. HTTP 1.0 was roughly: open a TCP connection, send a URL, get the document back, close the connection. It was WAY simpler than many contemporary hypertext solutions.
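To make "laughably simple" concrete, the whole protocol fits in a few lines of socket code. This is a sketch, not a faithful period client (the helper names are mine, and the Host header only became mandatory later, in HTTP 1.1):

```python
import socket

def build_request(host: str, path: str = "/") -> bytes:
    # An HTTP/1.0 request: one request line, headers, then a blank line.
    return f"GET {path} HTTP/1.0\r\nHost: {host}\r\n\r\n".encode("ascii")

def fetch(host: str, path: str = "/", port: int = 80) -> bytes:
    # Open a TCP connection, send the request, read until the server
    # closes the connection -- that close is how 1.0 signals "done".
    with socket.create_connection((host, port)) as sock:
        sock.sendall(build_request(host, path))
        chunks = []
        while chunk := sock.recv(4096):
            chunks.append(chunk)
    return b"".join(chunks)
```

Compare that to negotiating sessions, link databases, and bidirectional links in the contemporary hypertext systems it displaced.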

Unix was also WAY simpler than its contemporaries. Linux is old and hairy now, but that's just how things age without falling apart. They have to accommodate many different people's (sometimes broken) mental models under one roof. A system like Opa seems like it can only accommodate the mental model of its creators, and thus won't age gracefully.

C++, despite being huge, has some modesty: it respected people's existing C code and didn't "cover it up". Plenty of people still write "C with classes", and that's actually a feature. Opa seems like it "covers up" what's underneath.

Java is kind of an exception to the adoption curve because it had huge marketing behind it like no other language did. But I think Java 1.0 was still pretty darn simple. It was a very small language. I'll grant that Java did try to cover up the OS. I think this limited its widespread application, but it's admittedly still massively popular for certain things. You couldn't do async I/O in Java for a while, nor could you do things like make a Windows shortcut on the desktop.

Hadoop, being based on the MapReduce abstraction, does indeed pretty successfully let the programmer ignore the network. The fairly large restriction of having to express the computation as two pure functions -- map and reduce -- is what allows this (retries don't affect correctness, etc.). In a way this proves the point. You can't write arbitrary procedural/stateful code (in Opa or any other language) and distribute it over the network at scale. I'll go as far as to claim that this problem is unsolvable in a fundamental sense, like the halting problem is unsolvable.
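To show what that restriction looks like, here's a toy word count in the MapReduce style, with a single-process stand-in for the framework's shuffle/distribution machinery (the names map_fn/reduce_fn and mapreduce are illustrative, not Hadoop's actual API):

```python
from collections import defaultdict
from itertools import chain

# The two pure functions the programmer writes. No shared state,
# no I/O: running map_fn twice on the same line is harmless, which
# is exactly what makes retries on node failure safe.
def map_fn(line):
    return [(word, 1) for word in line.split()]

def reduce_fn(key, values):
    return (key, sum(values))

# Everything below is "the framework": in Hadoop this involves the
# network, sorting, and fault tolerance; here it's one process.
def mapreduce(lines, map_fn, reduce_fn):
    groups = defaultdict(list)
    for key, value in chain.from_iterable(map(map_fn, lines)):
        groups[key].append(value)
    return dict(reduce_fn(k, vs) for k, vs in groups.items())

counts = mapreduce(["to be or not to be", "to be is to do"], map_fn, reduce_fn)
# counts["to"] == 4, counts["be"] == 3
```

The point is that the programmer never touches the `mapreduce` part; the purity restriction is the price of that, and it's a restriction most arbitrary stateful code can't meet.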

Now maybe Opa introduces some restrictions in their model that help with distribution that I don't know about, but in general I am skeptical of the "write all your code in this one clean language with our nice model and we'll figure out the rest automatically with our hyper-optimized advanced technology". We've heard it before.

I honestly don't know why as a programmer you would want to ignore the distinction of code running on the client or the server. I'm all for sharing (some) code, as Node.js allows, but it should be obvious in any application whether a code path is running on the client or the server, and you shouldn't need a compiler to figure it out. That kind of coupling is crazy.



