Monday, March 16, 2015

PluralSight, Part I

List of things mentioned:

Express.js

  • response.send sends JSON if you give it a JS literal (like an array, or {foo: 'bar'})
  • some deprecation issues: installed body-parser to read in urlencoded params (see the sketch below)
  • more precisely, req.param() is deprecated and replaced by req.params, req.body, and req.query
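
A minimal sketch of how these fit together (assuming Express 4 with the body-parser middleware; the route and field names are made up):
var express = require('express');
var bodyParser = require('body-parser');
var app = express();

// parse application/x-www-form-urlencoded bodies into req.body
app.use(bodyParser.urlencoded({ extended: false }));

// e.g. POST /users/42?verbose=1 with a form field name=Bob
app.post('/users/:id', function(req, res) {
  var id = req.params.id;          // route parameter -> "42"
  var verbose = req.query.verbose; // query string    -> "1"
  var name = req.body.name;        // urlencoded body -> "Bob"
  res.send({ id: id, verbose: verbose, name: name }); // sent as JSON
});

app.listen(3000);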

Codeschool: Express.js, Part I

The first part of this course overlaps with the previous node.js course, and it's all a pretty gentle exposition of the express API. Some reminders:

  • npm install express@4.9
  • response.send() is the same as response.json() when sending arrays.
  • response.redirect(301,'/parts')
  • place html in /public folder (no use of ejs, yet)
  • response.sendFile(__dirname + '/public/index.html');
  • static middleware: app.use(express.static('public'));
  • what is middleware? (rough sketch below)
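
Partial answer to my own question, as a sketch: middleware is just a function that sees every request before the route handlers and either responds or calls next() to pass the request along. Not from the course, just an illustration:
var express = require('express');
var app = express();

// a hand-rolled logger: app.use registers it as middleware for every request
app.use(function(request, response, next) {
  console.log(request.method + ' ' + request.url);
  next(); // hand the request over to the next middleware / route
});

// express.static is itself middleware, serving files from ./public
app.use(express.static('public'));

app.get('/parts', function(request, response) {
  response.send(['part one', 'part two']);
});

app.listen(3000);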

Sunday, March 15, 2015

Project Part I: Starting out

The first thing to do is to install node and npm, obviously, but assuming that is done, create a folder and put a git repository in it (git init, set up GitHub, add a .gitignore for swap files, etc.).
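
A minimal .gitignore for a node project looks something like this (node_modules shows up once the npm installs start):
node_modules/
*.swp
*.swo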

Next, run npm init, which will populate package.json for you. Once that's done you can start installing packages. I'm not too sure which packages I need yet, but at the very least:

$ npm install express --save
Since I'm going to use Heroku, here's their version of hello world, which I'm calling app.js:
var express = require('express');
var app = express();

app.set('port', (process.env.PORT || 5000));
app.use(express.static(__dirname + '/public'));

app.get('/', function(request, response) {
  response.send('Hello World!');
});

app.listen(app.get('port'), function() {
  console.log("Node app is running at localhost:" + app.get('port'));
});
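Heroku also reads package.json, so it is worth checking that express ended up under dependencies (the --save flag should have done that); their guide also suggests pinning a node version under "engines". Roughly (the version numbers here are just examples):
{
  "name": "my-app",
  "version": "0.0.1",
  "dependencies": {
    "express": "^4.9.0"
  },
  "engines": {
    "node": "0.12.x"
  }
}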
Follow the getting-started steps on the Heroku website, but most importantly, create a file called Procfile containing this one-liner (it tells Heroku what to run at startup):
web: node app.js
You can also run the server locally using
$ foreman start web
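To actually get the app online, the Heroku side boils down to roughly this (assuming the Heroku toolbelt is installed and we are inside the git repo):
$ heroku login
$ heroku create
$ git add -A && git commit -m "hello world"
$ git push heroku master
$ heroku open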

... and that's it, should be up and running online.

Codeschool: Real Time Web, Part IV

Sockets I/O

(skipping for now) We're building a chat server. Use socket.io to handle the real-time input/output:
$ npm install --save socket.io
and on the html page that uses it, pull in the socket.io client and open a connection (roughly):
<script src="/socket.io/socket.io.js"></script>
<script>var socket = io.connect('http://localhost:8080');</script>
and here is the server code:
var express = require('express');
var app = express();

var server = require('http').createServer(app);
var io = require ('socket.io')(server);

io.on('connection',function(client){
  console.log("Client connected");
});
server.listen(8080);

Persisting Data

To keep a log of previous chat messages, we can store them in an array:
var messages = [];
var storeMessage = function(name, data){
  messages.push({name: name, data: data});
  if (messages.length > 10){
    messages.shift(); // keep only the 10 most recent messages
  }
}
We should call storeMessage every time a message is broadcast (a sketch of how that fits together follows the next snippet). Plus, whenever there is a new connection, we need to give that client all the previous chat messages:
io.sockets.on('connection',function(client){
  ...
  client.on('join',function(name){
    messages.forEach(function(message){
      client.emit("messages",message.name + ": " + message.data);
    });
  });
});
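Putting the pieces together: every incoming chat message has to be broadcast and stored. A sketch of how I imagine it fits (the 'messages' event name and the client.nickname bookkeeping are my assumptions, not verbatim from the course):
io.sockets.on('connection', function(client){
  client.on('join', function(name){
    client.nickname = name; // remember who this client is
    messages.forEach(function(message){
      client.emit("messages", message.name + ": " + message.data);
    });
  });

  client.on('messages', function(data){
    var name = client.nickname;
    client.broadcast.emit("messages", name + ": " + data); // relay to everyone else
    storeMessage(name, data);                              // and keep it for future clients
  });
});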
So, how do we persist the chat messages? DATABASE! Unfortunately they use Redis instead of PostgreSQL. Anyway, in the hope that this becomes relevant, here is a rewrite of the storeMessage function using redis:
var redis = require('redis');
var redisClient = redis.createClient();

var storeMessage = function(name, data){
  var message = JSON.stringify({name: name, data: data});
  redisClient.lpush("messages", message, function(err, response){
    redisClient.ltrim("messages", 0, 9); // keep only the 10 most recent messages
  });
}
and the join listener:
client.on('join', function(name){
  redisClient.lrange("messages", 0, -1, function(err, messages){
    messages = messages.reverse(); // lpush stores newest first, so reverse for chronological order
    messages.forEach(function(message) {
      message = JSON.parse(message);
      client.emit("messages", message.name + ": " + message.data);
    });
  });
});

Codeschool: Real Time Web, Part III

Express is a web framework for node.js, i.e. we can finally build a web server (easily). To install it, use

$ npm install --save express
The --save adds the module to our dependency list. Here is a simple app to get us started:
var express = require('express');
var app = express();

app.get('/', function(request, response){
  response.sendFile(__dirname + "/index.html");
});
app.listen(8080);

ejs

ejs is the templating package. By default, Express will look for templates under the 'views' directory.
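Presumably ejs also has to be installed and, optionally, registered as the view engine. A minimal sketch of the setup as I understand it:
$ npm install ejs --save
and in the app:
app.set('view engine', 'ejs');  // optional: lets you call response.render('tweets') without the extension
app.set('views', './views');    // also optional -- this is the default location anyway

Here is a more complicated example using ejs: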

var request = require('request');
var url = require('url');

// route definition
app.get('/tweets/:username', function(req, response){
  var username = req.params.username;

  var options = {
    protocol: "http:",
    host: 'api.twitter.com',
    pathname: '/1/statuses/user_timeline.json',
    query: { screen_name: username, count: 10 }
  };

  var twitterUrl = url.format(options);
  //request(twitterUrl).pipe(response);
  request(twitterUrl, function(err, res, body) {
    var tweets = JSON.parse(body);
    response.locals = { tweets: tweets, name: username };
    response.render('tweets.ejs');
  });
});
where the template (tweets.ejs, under views/) looks something like

<h1>Tweets for @<%= name %></h1>
<ul>
  <% tweets.forEach(function(tweet){ %>
    <li><%= tweet.text %></li>
  <% }); %>
</ul>
To call this, use the url:
curl -s http://localhost:8080/tweets/eallam

Here is another example, supposing that we are going to call Twitter's search API, e.g. http://search.twitter.com/search.json?q=codeschool. It uses the 'request' module.

var request = require('request');
var url = require('url');
var express = require('express');

var options = {
  protocol: "http:",
  host: "search.twitter.com",
  pathname: '/search.json',
  query: { q: "codeschool" }
};

var searchURL = url.format(options);
var app = express();

app.get('/', function(req, response){
  request(searchURL).pipe(response);
});

app.listen(8080);


Codeschool: Real Time Web, Part II

Modules

Suppose we created a module called custom_hello.js:

var hello = function(){
  console.log("hello!");
}
module.exports = hello;
then to use it in another file, we write
var hello = require('./custom_hello');
hello();
Shorthand notation (exports.hello is short for module.exports.hello):
exports.hello = function(){
  console.log("hello!");
}
and, correspondingly, to use it:
require('./custom_hello').hello();
In this way, you can choose which functions to export:
var foo1 = function() { ... }
var foo2 = function() { ... }
module.exports.foo1 = foo1
which makes foo2 a private function.

npm

npm is the package manager for node. You can find the registry of modules at http://www.npmjs.org. To install a module, use

$ npm install <module_name>
which will install said module into the node_modules directory in your application root.

It is also a good idea to have a package.json in your app root directory, which lists the dependencies your app needs, for example:
{
  "name": "My App",
  "version" : "1",
  "dependencies": {
    "connect": "1.8.7" 
  }
}
Running npm install will install all the dependencies. Each dependency may of course have its own dependencies, i.e. its own package.json file.

Thursday, March 12, 2015

Codeschool: Real Time Web, Part I

The only thing I had done before watching the codeschool course was to download and run the installer from https://nodejs.org. It installs node and npm (the package manager for javascript). You don't actually need to do this up front, but it's always nice to see something actually working.

Main points:

  • non-blocking
  • event loop - checking for events continuously (e.g. request, connection, close)

Here is a non-blocking code example:
  var fs = require('fs');

  fs.readFile('/etc/hosts', function(err, contents) {
    console.log(contents);
  });
  console.log('Doing something else');
or equivalently
  var callback = function(err, contents) {
    console.log(contents);
  };
  fs.readFile('/etc/hosts', callback);
The Hello World, which you can run with node hello.js:
  var http = require('http')

  http.createServer(function(request,response) {
    response.writeHead(200, {
      'Content-Type': 'text/html'
    });
    response.write("Hello!");
    setTimeout(function(){
      response.write("I am done.");
      response.end()
    }, 5000);
  }).listen(8080);

Events

Many objects in node emit events; for example, a read stream created with fs.createReadStream emits 'data' events, and obviously we can listen for these events. Objects that emit events inherit from the EventEmitter class.
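A quick sketch of what listening for those events looks like (the file path is just an example):
  var fs = require('fs');

  var stream = fs.createReadStream('/etc/hosts');
  stream.on('data', function(chunk){
    console.log('read ' + chunk.length + ' bytes');
  });
  stream.on('end', function(){
    console.log('finished reading');
  });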

We can create a custom EventEmitter:

  var EventEmitter = require('events').EventEmitter;
  var logger = new EventEmitter();
If you want to listen for error events:
  logger.on('error',function(message){
    console.log('Error: ' + message);
  });
And to trigger the event:
  logger.emit('error','An error');

In fact, earlier, when we called http.createServer(function(request, response) {...}), it returned an http.Server object, which is an EventEmitter set up to listen for request events (it is all in the node.js documentation). It is possible to write it this way:

  var server = http.createServer();
  server.on('request', function(request, response){
    response.writeHead(200);
    response.write("Hello, this is dog");
    response.end();
  });
which is how you explicitly bind a listener to the server object. In fact, you can attach more than one listener to the same event:
var http = require('http');
var server = http.createServer();

server.on('request', function(request, response) {
  response.writeHead(200);
  response.write("Hello, this is dog");
  response.end();
});

server.on('request', function(request, response) {
  console.log("New request coming in...");
});

server.on('close', function(){
  console.log("Closing down the server...");
});

server.listen(8080);
In the code above, whenever a request event fires, two things happen: a response is sent and a line is written to the console.

Streams

Streams are about how data is transferred back and forth. Streams can be readable, writable, or both. For example, in the arguments to http.createServer, request is a readable stream and response is a writable stream. These streams stay open until we close the connection.

How do you read from a stream (e.g. how do you read the request)? The request object is an EventEmitter, and it emits two events: 'readable', when there is data ready to be read, and 'end', when the client finishes sending data. Here is some example code:

http.createServer(function(request, response){
  response.writeHead(200);
  request.on('readable', function(){
    var chunk = null;
    while (null !== (chunk = request.read())) {
      response.write(chunk);
    }
  });
  request.on('end', function() {
    response.end();
  });
}).listen(8080);
(the .write function implicitly calls toString() on the chunk). However, there is an even simpler way, using pipe:
http.createServer(function(request,response) {
  response.writeHead(200);
  request.pipe(response);
}).listen(8080);

How is that useful? Well, the second main example is about reading and writing files:

var fs = require('fs'); // require the filesystem module

var file = fs.createReadStream("readme.md");
var newFile = fs.createWriteStream("readme_copy.md");

file.pipe(newFile);
You can, for example, do request.pipe(newFile) on the server, and then when you run
$ curl --upload-file readme.md localhost:8080
it will upload readme.md into newFile. Here is a complete implementation of a file upload with progress updates:
http.createServer(function(request, response){
  var newFile = fs.createWriteStream("readme_copy.md");
  var fileBytes = request.headers['content-length'];
  var uploadedBytes = 0;

  request.on('readable', function() {
    var chunk = null;
    while (null !== (chunk = request.read())) {
      uploadedBytes += chunk.length;
      var progress = (uploadedBytes / fileBytes) * 100;
      response.write("progress: " + parseInt(progress, 10) + "%\n");
    }
  });

  request.on('end', function() {
    response.end('uploaded\n'); // close the response once the upload is done
  });

  request.pipe(newFile);
}).listen(8080);
To do list: play around with http://gulpjs.com.