Wednesday, April 13, 2011

The naked server

Twitter have abandoned Rails for some parts of their application. I didn't find their description of the move to be fully informative. For example: just what do they mean when they say "we launched a replacement for our Ruby-on-Rails front-end"? Predictably, the move got a few comments on the Ruby/Rails newsgroup that I subscribe to. The general consensus seemed to be "so what?" I think we might all be missing the bigger picture. Perhaps this is just a logical step in the progression of web applications? If so, what does it mean for their future? To explain my thoughts, some history:
  • Way back when, pages viewed in a browser were fairly simple, being mostly static files that were dished up from directories by a dedicated server application.
  • Then we needed to make those static pages more dynamic, and to reflect knowledge related to the identity of the person who was viewing those pages. However, browsers were geared to rendering static pages, so we controlled what we could and put the smarts on the server, building frameworks that allowed us to customize each individual page for the viewer. Our first efforts at these frameworks were quite awful, IMHO, but as time went by we got better at it.
  • Now web browsers have a common embedded scripting language (ECMAScript), and can create and change whole pages dynamically, on the fly.
See the flow here? The complexity started on the server, and then tunneled down those tubes to the browser. Currently it is spread across the browser and the server, with JavaScript frameworks and server-side frameworks being combined to produce an application. Do we still need those server-side frameworks to compose and build web pages? Surely we can now start to strip the server side of the equation back? In this vision of the future we would create the bones of the site using HTML and CSS, mix in JavaScript, and serve these as static files to the browser, which then makes the magic happen. The browser fetches any data required from the server in an easy-to-digest format, such as JSON. The server loses the complexity that it gained to build dynamic web pages. I'm going to call this vision of the future the naked server. If correct, I predict two things:
  • JavaScript/HTML/CSS is going to grow dramatically in importance (and that seems to be happening).
  • node.js is going to become popular. Because if you know JavaScript, and you simply want to fetch some data from the server, or write some back, why not use the same language and, possibly, files, on both sides of the tubes?
Over the next few weeks I am going to see if I can write a simple naked server application. Not being a JavaScript guru yet, I think I might stick to my knitting, and use naked servlets for the server side of the equation. Then I plan to add to my knowledge and to build a node.js cloud based application. This should be fun!

5 comments:

Daniel said...

The javascript requests dynamic content from the server? How does it know what to request?

martin-paulo said...

JavaScript is initially downloaded in the form of a static file.
It is then interpreted by the browser to make the page in the browser a dynamic page. The script knows what to request because that initial download has the addresses of the server endpoints that it needs to get information from.
Of course that fetched information can contain further information about new server endpoints...
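A sketch of what that initially downloaded script might look like, with the endpoint addresses baked in. The /api/user endpoint and the user fields are invented for illustration, and XMLHttpRequest is the browser API of the day; the HTML-building step is kept as a separate function to show what the server no longer has to do:

```javascript
// Browser-side sketch: the statically downloaded script carries the
// addresses of the server endpoints it needs (invented examples here).
var ENDPOINTS = { user: '/api/user' };

// Build the personalized HTML that a server-side framework used to build.
function render(user) {
  return '<h1>Hello, ' + user.name + '</h1>';
}

// Fetch the data and update the page in place.
function refresh() {
  var xhr = new XMLHttpRequest();
  xhr.open('GET', ENDPOINTS.user);
  xhr.onload = function () {
    document.body.innerHTML = render(JSON.parse(xhr.responseText));
  };
  xhr.send();
}
```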

Daniel said...

Ahh so the javascript in the browser can also do things like control & make decisions about things like optional content and timing. More pull, less push. I guess that could be useful.

martin-paulo said...

Yeah: and the complex server-side frameworks required to create HTML that has been personalized for a particular user and their work flow are no longer needed...

martin-paulo said...

Ah yes: others are starting to agree: http://peteryared.blogspot.com/2011/07/javascript-one-language-to-rule-them.html