Node.js Interview Questions

Are pre-interview jitters bothering you? These top Node.js interview questions and answers should give you some relief. To boost your knowledge and confidence in a short time, we have curated a list of basic and advanced Node.js interview questions. Start preparing today!


Advanced

npm is the world’s largest software repository. Globally, open source developers use npm to share and borrow packages. For example, you need to install Node and npm before getting the packages necessary for Angular development. Packages bring modularity to code development.

npm consists of three distinct components:

1) Website – to discover packages and their dependencies (by searching), set up profiles for authentication, and manage versions.

2) CLI (Command Line Interface) – runs in a terminal and is the way developers interact with npm.

3) Registry – a large public database of JavaScript software and the meta-information surrounding it.

Some of the uses of npm are listed below, followed by a short CLI sketch:

1) Manage multiple versions of code and code dependencies.

2) Update applications easily when the underlying code is updated.

3) Adapt packages of code for your apps, or incorporate packages as they are.
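
Here is a minimal sketch of those uses with the npm CLI (the package name express is only an illustration; any package works the same way):

npm install express          # install the latest version of a package
npm install express@4.17.1   # pin a specific version of the package
npm update                   # update packages to the versions allowed by package.json
npm ls express               # inspect which version is currently installed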

Blocking program

  1. Executes in sequence
  2. Makes the logic easier to implement

Example: Let us create a text file named blk.txt with the following content.

blk.txt

 A technology that has emerged as the frontrunner for handling Big Data processing is Hadoop.

Create a javascript code as follows and save as blk.js

var fs = require("fs"); // import fs module
var data = fs.readFileSync('blk.txt'); // blocks until the whole file is read
console.log(data.toString());
console.log('Program Ended');

Execute the code. The result is 

A technology that has emerged as the frontrunner for handling Big Data processing is Hadoop.

Program Ended

In this example, the program blocks until it has read the file; only then does it proceed to the next statement and end.

Non-blocking program

  1. Does not execute in sequence
  2. If a program needs the data it reads, the processing should be kept within the same block (the callback) so that execution remains sequential

Example: Use the same input file defined for blocking code example.

Create a javascript code as follows and save as nblk.js

var fs = require("fs"); // import fs module
fs.readFile('blk.txt', function (err, data) { // asynchronous read; callback runs later
    if (err) return console.error(err);
    console.log(data.toString());
});
console.log("Program Ended");

Execute the code. The result is

Program Ended

A technology that has emerged as the frontrunner for handling Big Data processing is Hadoop.

This example shows that the program doesn’t wait for the file read and prints “Program Ended” first. Meanwhile, the program continues to read the file without blocking.

Node.js uses events heavily. Most Node.js APIs are built on an event-driven architecture.

Every action on a computer is an event.

Example: File opening is an event.

Objects in Node.js can fire events; for example, a createReadStream object fires events when opening or closing a file.

Example: to read a stream of characters from an existing file trnk.txt

var fs = require("fs"); // import fs module
var rdstream = fs.createReadStream('trnk.txt'); // create a readable stream
rdstream.on('open', function () { // listen for the 'open' event
    console.log("File Opened");
});

Executing the above code, you will get the following result:

File Opened

You can create, fire, and listen for your own events using the events module and its EventEmitter class.

The events module and the EventEmitter class are used to bind events to event listeners:

  1. Import the events module
  2. Create an EventEmitter object
  3. Create an event handler

To fire an event, use eventEmitter.emit('eventName')

To bind an event handler to an event, use eventEmitter.on('eventName', eventHandler)

Example: Refer the following example.

// Import the events module
var events = require('events');
// Create an EventEmitter object
var eventEmitter = new events.EventEmitter();
// Create an event handler
var myEventHandler = function () {
    console.log('I have completed');
};
// Assign the event handler to an event
eventEmitter.on('complete', myEventHandler);

// Fire the complete event
eventEmitter.emit('complete');

Following is the result after executing the code.

Result: I have completed

To fire an event, use the emit method.

To assign (bind) an event handler, use the on method.

The buffer module provides a way of handling streams of binary data.

Node.js implements buffers using the Buffer class.

Typically, data is moved so it can be processed: read and used to make decisions. But there is a minimum and a maximum amount of data a process can handle at a time. So if the rate at which data arrives is faster than the rate at which the process consumes it, the excess data needs to wait somewhere for its turn to be processed.

On the other hand, if the process consumes data faster than it arrives, the data that arrives early needs to wait for a certain amount of data to accumulate before being sent out for processing.

That “waiting area” is the buffer: a small physical location in your computer, usually in RAM, where data is temporarily gathered, waits, and is eventually sent out for processing during streaming.

Example: You can see buffering in action when you read an e-book (say, 500 pages with graphics) in Google Books. If the internet connection is fast enough, the stream fills the buffer quickly and sends it out for processing, then fills another one, until the stream is finished.

If your internet connection is slow, Google Books displays a loading icon, which means it is gathering or expecting more data. When the buffer has been filled and processed, Google Books shows the page; while you view it, more data continues to arrive and wait in the buffer.

No. The Buffer class is global (part of the global modules), so you can use it without importing anything.
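
A minimal illustration: the following code runs without any require call, because the Buffer class is available globally.

// No require needed: Buffer is a global class
var buf = Buffer.alloc(4); // allocate a zero-filled 4-byte buffer
buf.write("abcd"); // write a string into the buffer
console.log(buf); // <Buffer 61 62 63 64>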

- Converting a buffer to JSON

- Reading from buffers

- Concatenating buffers

- Comparing buffers

- Copying buffers

A JSON conversion example follows; the other operations are sketched after it.

Example:

How to convert the binary equivalent of the string xyz to JSON format:

Create a JavaScript file with the following code.

var buf = Buffer.from('xyz'); // create a buffer from the string
console.log(buf.toJSON()); // convert the buffer to its JSON representation

In the first line, buf is the variable and Buffer is the Buffer class. The toJSON method converts the buffer as shown in the result below.

Execute the code. Following is the result:

{ type: 'Buffer', data: [ 120, 121, 122 ] }
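
The remaining buffer operations from the list above can be sketched as follows (the buffer contents are illustrative):

var a = Buffer.from('ab');
var b = Buffer.from('cd');
// Concatenating buffers
var joined = Buffer.concat([a, b]);
console.log(joined.toString()); // abcd
// Comparing buffers: -1 because 'ab' sorts before 'cd'
console.log(Buffer.compare(a, b));
// Copying buffers
var target = Buffer.alloc(2);
a.copy(target);
console.log(target.toString()); // ab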

Duplex: Duplex streams are streams that implement both the readable and writable interfaces.

Examples:

  1. TCP sockets
  2. zlib streams (compression library streams)
  3. crypto streams (cryptography streams)

Transform: A type of duplex stream where the output is in some way related to the input. Like duplex streams, transform streams implement both the readable and writable interfaces. A minimal sketch follows the examples below.

Examples:

  1. zlib streams (compression library streams)
  2. crypto streams (cryptography streams)
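
Here is a minimal sketch of a custom transform stream, assuming an illustrative transformation that simply uppercases its input:

var stream = require('stream'); // import the stream module

// A transform stream whose output is the uppercased input
var upper = new stream.Transform({
    transform: function (chunk, encoding, callback) {
        callback(null, chunk.toString().toUpperCase());
    }
});

upper.on('data', function (chunk) {
    console.log(chunk.toString()); // HELLO
});
upper.write('hello');
upper.end();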

Piping is

- a mechanism where the output of one stream is provided as input to another stream

- normally used to get data from one stream and pass it to another stream

There is no limit on piping operations.

Example: Create a text file dataSinput.txt with the following content.

Monday is first day of week

Tuesday is second day of week

Wednesday is third day of week

After executing the following code, you can view the contents in the output file.

var fs = require("fs"); // import fs module
// create a readstream to read our input data file dataSinput.txt
var readStream = fs.createReadStream("F://dataSinput.txt");
// create a writestream (initially empty) which is the destination for the transferred data
var writeStream = fs.createWriteStream("F://dataSoutput.txt");
// pipe takes all data from the readstream and pushes it to the writestream
readStream.pipe(writeStream);

Output in dataSoutput.txt can be seen as

Monday is first day of week

Tuesday is second day of week

Wednesday is third day of week

Chaining is

- a mechanism to connect the output of one stream to another stream, creating a chain of multiple stream operations.

- normally used with piping operations.

Example: Create a text file dataSinput.txt with the following content.

Monday is first day of week

Tuesday is second day of week

Wednesday is third day of week

After executing the following code:

var fs = require("fs"); // import fs module
var zlib = require("zlib"); // import zlib module
// create a readstream to read our input data file dataSinput.txt
var readStream = fs.createReadStream("F://dataSinput.txt");
// create a gzip transform stream that compresses the data piped through it
var czlib = zlib.createGzip();
// create a writestream (initially empty) which is the destination for the compressed data
var writeStream = fs.createWriteStream("F://dataSoutput.txt.gz");
// chain the pipes: readstream -> gzip -> writestream
readStream.pipe(czlib).pipe(writeStream);
console.log("File Compressed");

You get the result “File Compressed”, and the compressed file dataSoutput.txt.gz as output, which contains the text file dataSoutput.txt.

Yes. Every method of the fs module supports both synchronous and asynchronous forms.

Asynchronous methods take a completion callback as their last parameter, and the first parameter of that callback is reserved for an error.

It is better to use the asynchronous methods, as they never block the program during execution, whereas synchronous methods do.
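
A minimal sketch contrasting the two forms using fs.writeFile (the file name demo.txt is just an illustration):

var fs = require("fs"); // import fs module

// Synchronous form: blocks the program until the write finishes
fs.writeFileSync('demo.txt', 'hello');

// Asynchronous form: the callback's first parameter is reserved for the error
fs.writeFile('demo.txt', 'hello', function (err) {
    if (err) return console.error(err);
    console.log("Write finished");
});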

Import the fs module and declare a buffer, then:

  1. Open the file using the fs.open method.
  2. Execute fs.ftruncate to truncate the opened file, providing the descriptor of the opened file and the length after which the file will be truncated.
  3. Read the truncated file after successful truncation using the fs.read method and the buffer.
  4. Close the file using the fs.close method.

Example: Create a text file named trnk.txt with Knowledgehut tutorials as the text in it.

Create a javascript code as follows. Save it as trnkexmpl.js

var fs = require("fs"); // import fs module
var buf = Buffer.alloc(1024); // define a 1 KB buffer

fs.open('trnk.txt', 'r+', function (err, fd) {
    if (err) {
        return console.error(err);
    }
    console.log("File opened");
    // Truncate the open file to 12 bytes
    fs.ftruncate(fd, 12, function (err) {
        if (err) {
            return console.log(err);
        }
        console.log("File Truncated");
        console.log("Going to read same file");
        fs.read(fd, buf, 0, buf.length, 0, function (err, bytes) {
            if (err) {
                return console.log(err);
            }
            // Print only the bytes that were read
            if (bytes > 0) {
                console.log(buf.slice(0, bytes).toString());
            }
            // Close the opened file
            fs.close(fd, function (err) {
                if (err) {
                    console.log(err);
                }
                console.log("File closed successfully");
            });
        });
    });
});
 

Execute the code. You get the following result:

File opened

File Truncated

Going to read same file

Knowledgehut

File closed successfully

  1. Import the fs module.
  2. Delete the file using the fs.unlink method, passing the name of the file to be deleted as a parameter.

Create a text file “demo_file_del.txt” as the file to be deleted.

var fs = require("fs"); // import fs module
console.log("going to delete demo_file_del.txt file");
fs.unlink('demo_file_del.txt', function (err) { // call the unlink method
    if (err) {
        return console.error(err);
    }
    console.log("File deleted successfully");
});

Result: 

going to delete demo_file_del.txt file

File deleted successfully

demo_file_del.txt was deleted.

Beginner

  1. A server-side platform built on Google Chrome’s JavaScript V8 engine.
  2. An open source, cross-platform runtime environment for developing server-side and networking applications. Node.js runs on various platforms such as Windows, Linux, and macOS.


It is not advisable to use Node.js for CPU-intensive applications.

This is because Node.js is designed around using a single thread very efficiently. Its event-based model dispatches code fragments when specific events occur. Those code fragments are supposed to execute very quickly and then return control to Node.js, which then dispatches the next event.

If one of those code fragments performs a long-running task, no more events will be dispatched and the whole system appears to hang.
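
A minimal sketch of this behaviour, where a busy loop simply stands in for CPU-intensive work:

// Schedule an event for 100 ms from now
setTimeout(function () {
    console.log("timer fired"); // prints only after the busy loop below finishes
}, 100);

// A long-running code fragment: it blocks the single thread for ~5 seconds,
// so no other events (including the timer above) can be dispatched
var end = Date.now() + 5000;
while (Date.now() < end) { /* busy wait */ }
console.log("busy loop done");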

Microsoft, PayPal, Uber

Microsoft Visual Studio Code uses Node.js as a platform.

PayPal migrated its application from a Java backend to Node.js and achieved benefits such as increased productivity, fewer lines of code, and fewer files.

Uber was among the first to put Node.js into full production. Its mobile application process needed a highly scalable option due to increasing user demand. Uber’s dispatching system runs on Node.js. Uber achieved benefits such as faster code deployment, effective error analysis (without restarts), and high scalability.

Here is the list of important features of Node.js:

No buffering: Node.js applications never buffer any data; they simply output the data in chunks.

Extremely fast: Node.js is built on Google Chrome’s JavaScript V8 engine, so its library is very fast in code execution. JavaScript, which is used to build client-side applications, runs on the same V8 engine, which makes Node.js applications fast.

I/O is asynchronous and event-driven: All APIs of the Node.js library are asynchronous, i.e. non-blocking. A Node.js based server never waits for an API to return data; it moves on to the next API after calling it, and the event notification mechanism of Node.js helps the server get the response from the previous API call. This is another reason Node.js is very fast.

Highly scalable: Node.js is highly scalable because its event mechanism helps the server respond in a non-blocking way.

  1. I/O-bound applications.
    Anything ranging from reading/writing local files to making an HTTP request to an API.
  2. Data streaming applications.
    For example, stock broker trading applications that take prices and send them in real time to customers or to other applications for further processing.
  3. Data-intensive real-time applications, such as chat applications like WhatsApp, Facebook, and LinkedIn.
  4. JSON API based applications like Twitter.
  5. Single-page applications like Angular applications.
    Angular uses npm (the Node.js package manager) to access its libraries. To speed up Angular development you need the CLI (Command Line Interface), which in turn requires Node and npm.

Streams are a way of handling the following:

  1. reading/ writing files
  2. network communications
  3. any kind of end-to-end information exchange.

A stream is the movement of data from one point to another.

When a program reads a file of a single page (three to four lines), the file is read into memory from start to finish and then processed. If the file is an e-book of 500+ pages, it takes a lot of memory and time to load before processing can even start. This is where streams make a difference.

Using streams, you read the file piece by piece, processing its content without keeping it all in memory.

The following are the advantages of streams (a short sketch follows this list):

  1. Memory efficiency: You don’t load large amounts of data into memory before processing it. A stream reads a block of data, processes it immediately, and continues the same way until the end of the file.
  2. Time efficiency: It takes far less time to start processing, since you can process data as soon as you have it rather than waiting for all the data to be available.
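
A minimal sketch of piece-by-piece reading (the file name bigfile.txt is just an illustration):

var fs = require("fs"); // import fs module
var rs = fs.createReadStream('bigfile.txt'); // read the file as a stream
var bytes = 0;
// The 'data' event fires once per chunk, not once for the whole file
rs.on('data', function (chunk) {
    bytes += chunk.length; // process each chunk immediately
});
rs.on('end', function () {
    console.log("Processed " + bytes + " bytes without loading the whole file");
});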

The following code statement refers to the stream module:

const stream = require('stream');

The following are the types of streams in Node.js:

1) Readable streams: A stream from which you can receive data but to which you cannot send it. When you push data into a readable stream, it is buffered until a consumer starts to read it.

    Example: HTTP requests to the server. Based on the HTTP request, the server sends an HTTP response to the client, which is a readable stream. Another example: RSS feeds posted by remote servers are read-only streams on the HTTP clients.

Module used: fs.createReadStream()

2) Writable streams: A stream to which you can send data but from which you cannot receive it.

    Example: Data entered by the user on HTTP clients goes to the server as HTTP requests, where the data is written.

Module used: fs.createWriteStream()

3) Duplex Streams: Streams that are both readable and writable.

Example: TCP sockets

Module used: net.Socket

4) Transform streams: A type of duplex stream where the output is in some way related to the input. Like duplex streams, transform streams implement both the readable and writable interfaces.

Description

Many reputed companies are hunting for good web developers, and if you are passionate about web development and planning to pick Node.js as a career-building tool, you are already on the right track! Make the best use of your time and be thorough with these Node.js interview questions and answers. Our basic and advanced Node.js interview questions are answered by industry experts so that you can prepare better for your upcoming interview. These top Node.js interview questions will save you preparation time and will help your interviewer see your deep knowledge of Node.js.

We have listed the frequently asked questions with answers that are clear and simple to remember. The answers you find here are prepared by industry experts.

All our Node.js interview questions are up to date, and our aim is to keep you updated with the latest interview questions. These Node.js interview questions and answers will help you crack your interview and pursue your dream career as a Node.js developer.

Practice these interview questions. Be confident, gear up. All the best!
 
