
What is Node JS Process?

How to use the global process module

Node.js is built in C++ with the V8 engine embedded in it, providing an environment to execute JavaScript outside the browser. Many of the accessible global methods are actually wrappers around methods which make calls to core C libraries.

Node.js has many built-in global identifiers that are available and known to developers. There are also some which are accessible at the module level via module inheritance.

A few of the global objects are:

global - a namespace accessible across the application. Setting a property on this namespace makes it accessible throughout the running process.
process - a module which provides interaction with the current Node.js process.
console - a module mostly used for logging information or errors. It wraps the STDIO functionality of a process.
setTimeout(), clearTimeout(), setInterval(), clearInterval() - these can be categorized as timer functions.

Some of the globals accessible via module inheritance are module, exports, __filename, __dirname, require() etc.

In this article, we attempt to understand more about the 'process' object and its details with examples. The 'process' object is a global which is an instance of EventEmitter and can be accessed directly. It can also be accessed explicitly via require:

const process = require('process');

The process object has a property 'argv', which is an array containing the arguments passed to the node process. Create a simple index.js file and log process.argv:

console.log(process.argv)

Type 'node index.js' in the terminal. On pressing Enter, the following output should be displayed:

[
  '/Users/***/.nvm/versions/node/v12.20.1/bin/node',
  '/Users/***/index.js'
]

Now pass in some other parameters, say 'node index.js test', and you should see those parameters displayed as well.

Also note that process has 'process.stdout' and 'process.stderr', which help us send messages to the standard output channel and, when there is an error, to the standard error channel. In fact, console.log internally does process.stdout.write(msg + '\n'), so console.log('Hello world') is the same as process.stdout.write('Hello world' + '\n').

Files can be read as a stream and piped to process.stdout. For example, replace the content of index.js with the code below:

var fs = require('fs')
fs.createReadStream(__filename).pipe(process.stdout);

Running 'node index.js' should display the content of the file.

Another interesting thing about Node.js is that when it is done, or doesn't have anything left to do, it exits the process. Let's understand this with an example:

setTimeout(() => {
  process.stdout.write('Executed after 1000 ms' + '\n');
}, 1000)

This one waits for 1 second, outputs 'Executed after 1000 ms' and terminates the process. If we want the process to run forever, we can replace the setTimeout with setInterval, which executes the callback after every interval; then the only way to exit is by pressing Ctrl+C, or if the process crashes.

To get a quick walkthrough of the properties, methods and events on the process object, add 'console.log(process)' to index.js and run node index.js. Most of them are self-explanatory from their names:

version: 'v12.20.1', // current version of node
versions: {…}, // insight into node and its core components, like the V8 engine version
arch: 'x64',
platform: 'darwin',
release: {…}, // details of the node source and the LTS version
moduleLoadList: [...], // modules available with node
binding: [Function: binding],
_events: [Object: null prototype] {
  newListener: [Function: startListeningIfSignal], // whenever a new listener is added
  removeListener: [Function: stopListeningIfSignal], // an existing listener is removed
  warning: [Function: onWarning],
  SIGWINCH: [Function]
},
_eventsCount: 4,
_maxListeners: undefined,
domain: null,
_exiting: false,
config: { target_defaults: {…}, variables: {...} },
cpuUsage: [Function: cpuUsage],
resourceUsage: [Function: resourceUsage],
memoryUsage: [Function: memoryUsage],
kill: [Function: kill],
exit: [Function: exit],
openStdin: [Function],
getuid: [Function: getuid],
geteuid: [Function: geteuid],
getgid: [Function: getgid],
getegid: [Function: getegid],
getgroups: [Function: getgroups],
allowedNodeEnvironmentFlags: [Getter/Setter],
assert: [Function: deprecated],
features: {…},
setUncaughtExceptionCaptureCallback: [Function: setUncaughtExceptionCaptureCallback],
hasUncaughtExceptionCaptureCallback: [Function: hasUncaughtExceptionCaptureCallback],
emitWarning: [Function: emitWarning],
nextTick: [Function: nextTick],
stdout: [Getter],
stdin: [Getter],
stderr: [Getter],
abort: [Function: abort],
umask: [Function: wrappedUmask],
chdir: [Function: wrappedChdir],
cwd: [Function: wrappedCwd],
initgroups: [Function: initgroups],
setgroups: [Function: setgroups],
setegid: [Function],
seteuid: [Function],
setgid: [Function],
setuid: [Function],
env: {…}, // environment details for the node application
title: 'node',
argv: [
  '/Users/srgada/.nvm/versions/node/v12.20.1/bin/node',
  '/Users/srgada/index.js'
],
execArgv: [],
pid: 29708,
ppid: 19496,
execPath: '/Users/srgada/.nvm/versions/node/v12.20.1/bin/node',
debugPort: 9229,
argv0: 'node',
mainModule: Module {…} // details of the main starting file or module; deprecated in recent versions, use require.main instead
}

Let's take a look at a few of the properties which are most used:

pid – the process id
platform – the platform, e.g. linux or darwin
version – the node version
title – the process name; by default it is node and can be changed
execPath – the executable process path
argv – the arguments passed

Some common methods are:

exit – exits the process and accepts the exit code as an argument.
cwd – gets the current working directory; to change it we use chdir.
nextTick – as the name suggests, it queues the callback to run right after the current operation completes, before the event loop continues; that is why it fires even before a setTimeout with a 0 ms delay.

process.nextTick(() => {
  console.log('Got triggered in the next iteration of event loop');
});
setTimeout(() => {
  console.log("Even after nextTick is executed");
}, 0);
console.log("First text to be printed");

Output:

First text to be printed
Got triggered in the next iteration of event loop
Even after nextTick is executed

Events: To log or perform any cleanup before exiting the process, we can hook into the 'exit' event, which is raised when process.exit is invoked or when the event loop has nothing left to do.

process.on('exit', () => {
  console.log('Perform any clean up like saving or releasing any memory');
});

'exit' is fired after the event loop has terminated. As a result, we can't perform any async work in the handler.
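As a quick, minimal sketch of that point (not from the original article): any asynchronous work scheduled inside the 'exit' handler is simply never executed, while synchronous statements still run.

process.on('exit', (code) => {
  // the event loop has already stopped, so this timer callback never runs
  setTimeout(() => {
    console.log('This line is never printed');
  }, 0);
  // synchronous work is still fine
  console.log(`Exiting with code ${code}`);
});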
So if you want to perform some work like saving content to a database, you can hook into 'beforeExit', which is emitted when Node.js has emptied its event loop and is about to exit:

process.on('beforeExit', code => {
  // Can make asynchronous calls
  setTimeout(() => {
    console.log(`Process will exit with code: ${code}`)
    process.exit(code)
  }, 1000)
});
process.on('exit', code => {
  // Only synchronous calls
  console.log(`Process exited with code: ${code}`)
});
console.log('After this, process will try to exit');

Another event, 'uncaughtException', as the name suggests, is raised when there is an unhandled exception in the application. By default, whenever an unhandled exception occurs, the Node.js application logs the stack trace and exits.

process.on('exit', () => {
  console.log('Perform any clean up like saving or releasing any memory');
});
process.on('uncaughtException', (err) => {
  console.error('An unhandled exception is raised. Look at stack for more details');
  console.error(err.stack);
  process.exit(1);
});
var test = {};
// raises an exception.
test.unKnownObject.toString();

Output:

An unhandled exception is raised. Look at stack for more details
TypeError: Cannot read property 'toString' of undefined
    at Object.<anonymous> (/Users/srgada/index.js:10:20)
    at Module._compile (internal/modules/cjs/loader.js:999:30)
    at Object.Module._extensions..js (internal/modules/cjs/loader.js:1027:10)
    at Module.load (internal/modules/cjs/loader.js:863:32)
    at Function.Module._load (internal/modules/cjs/loader.js:708:14)
    at Function.executeUserEntryPoint [as runMain] (internal/modules/run_main.js:60:12)
    at internal/main/run_main_module.js:17:47
Perform any clean up like saving or releasing any memory

Similar to 'uncaughtException', there is a newer event called 'unhandledRejection'. It is raised when a promise is rejected and no handler is attached to the rejection.

In both cases it is expected that the application will crash and should not continue, because the application might be in an undefined state. If you are wondering why someone would hook into these events, it is to perform synchronous cleanup of allocated resources (e.g. file descriptors, handles, etc.) before shutting down the process.

Note: the 'beforeExit' event is not fired when there is an 'uncaughtException' or when process.exit is called explicitly.

Signal events: Signals are events sent by the operating system to the Node.js process. The most common among them are SIGTERM and SIGINT; both relate to process termination. Signals are not available on worker threads. Let's look into an example for SIGINT:

setInterval(() => {
  console.log('continued process');
}, 1000);
process.on('SIGINT', signal => {
  console.log(`Process ${process.pid} has been interrupted`)
  process.exit(0)
});

In the terminal, execute node index.js. This is a continuous process without any exit criteria because of setInterval. Press Ctrl+C, and the 'SIGINT' event is raised to the node application and captured in the handler. Because of the process.exit call in the handler, the process exits.
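SIGTERM, typically sent by process managers and container runtimes when they stop a service, can be handled the same way. Below is a minimal graceful-shutdown sketch (the HTTP server and port 3000 here are only illustrative, not part of the article's example):

const http = require('http');

const server = http.createServer((req, res) => res.end('ok'));
server.listen(3000);

process.on('SIGTERM', () => {
  console.log(`Process ${process.pid} received SIGTERM, shutting down gracefully`);
  // stop accepting new connections, then exit once in-flight requests finish
  server.close(() => process.exit(0));
});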
Node.js runs your JavaScript on a single thread, and in some cases you may want specific logic to run in a child process rather than in the main one, so that if a crash happens the main process stays alive. Taking the previous example of displaying the content of the index.js file, let's do it this time with the help of the 'child_process' module:

var exec = require('child_process').exec;
exec('cat index.js', (err, stdout, stderr) => {
  console.log(stdout);
});

Note: cat is a binary available on macOS and Linux. This may vary based on your operating system.

'spawn' on child_process is similar to exec, but it gives more granular control over how the processes are executed. Let's spin off a child process from the parent process and pass data from the child back to the parent:

var spawn = require('child_process').spawn;
if (process.argv[2] === 'childProcess') {
  console.log('Child process is executed');
} else {
  var child = spawn(process.execPath, [__filename, 'childProcess']);
  child.stdout.on('data', (data) => {
    console.log('from child:', data.toString());
  })
}

Overview: spawn is given the process to execute as its first argument and the arguments to pass as the second. Since we want to spin off another node process, we rely on process.execPath. __filename is the name of the current file, i.e. index.js, and the extra argument is 'childProcess'. When the child process is spawned (effectively node index.js childProcess), the condition on process.argv[2] is satisfied and the child writes its output, which reaches the parent process. The parent captures the child's stdout through the 'data' event and prints it to its own stdout.

Output: from child: Child process is executed

As we learnt earlier, stdout is a stream, so instead of listening to the 'data' event we can pipe child.stdout to the parent's stdout directly. Replace the child.stdout.on('data', ...) listener with:

child.stdout.pipe(process.stdout);

Another shorthand is to pass a third options argument to spawn, which removes the need to pipe at all: it inherits all stdio from the parent and effectively does the piping for you.

var child = spawn(process.execPath, [__filename, 'childProcess'], {
  stdio: 'inherit'
});

Note that each child process is self-contained; data is not shared between child processes or with the parent. What if you want to transfer data, or control the child process, say to terminate it if needed? The stdio option of spawn is an array covering stdin, stdout and stderr; passing null uses the defaults. We can pass a fourth item, 'pipe', which opens an extra file descriptor for sending data from the child process:

var spawn = require('child_process').spawn;
if (process.argv[2] === 'childProcess') {
  var net = require('net');
  var pipe = new net.Socket({ fd: 3 });
  pipe.write('Terminate me');
} else {
  var child = spawn(process.execPath, [__filename, 'childProcess'], {
    stdio: [null, null, null, 'pipe']
  });
  child.stdio[3].on('data', (data) => {
    if (data.toString() === 'Terminate me') {
      console.log('Terminated child process');
      child.kill();
    }
  });
}

From the above snippet we can see that the child creates a socket over the extra file descriptor shared with the parent and sends data on it. The parent listens on that pipe and performs the required operation, i.e. terminates the child process.

Conclusion

Node.js is single-threaded and non-blocking, and works great as a single process. But what if we want to scale up and build a distributed application? Regardless of how performant the server is, a single thread can only support a limited load.
To overcome this, Node.js has to work with multiple processes and even on multiple machines.
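One common way to spread work across processes on a single machine, not covered above and shown here only as a minimal sketch, is the built-in cluster module, which forks one worker per CPU core and lets them share a listening port:

const cluster = require('cluster');
const http = require('http');
const os = require('os');

if (cluster.isMaster) {
  // fork one worker per CPU core
  os.cpus().forEach(() => cluster.fork());
  // restart a worker if it dies, keeping the service available
  cluster.on('exit', (worker) => {
    console.log(`worker ${worker.process.pid} died, forking a new one`);
    cluster.fork();
  });
} else {
  // every worker runs its own server; incoming connections are distributed across workers
  http.createServer((req, res) => {
    res.end(`handled by pid ${process.pid}\n`);
  }).listen(3000);
}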

Sumanth Reddy

Author

Full-stack, UI architect with 14+ years of experience in web, desktop and mobile application development and strong JavaScript/.NET programming skills.


Strong experience in the Microsoft tech stack and certified as an OCI Associate.


Go-to person for application integration and for building mobile apps, including zeroing in on the right tech stack. Mainly experienced in engineering-based IIoT products and marketing automation products deployed on-premise and in the cloud.

Posts by Sumanth Reddy


What is NPM in Node JS?

If you have ever worked with Node or any JavaScript framework, then you have already worked with NPM directly or indirectly. Let's get into the details to understand more about NPM, and learn how to install the latest versions and manage different versions using NVM. In later articles we can discuss mastering basic tools like adding, updating or removing a package, and touch on some advanced features like cache, audits, scripting and more.

What is NPM?

To build an application with JavaScript or with frameworks like Angular, React, Express or Node.js, we rely on a package manager to install its dependencies. This package manager is called NPM (Node Package Manager).

Note: Dependencies are the packages that are used in our project and listed in package.json.

NPM comes with Node.js and is pre-installed with it. package.json is the file containing the details of your project, such as who created it and which versions of node and packages your project depends on. A package is basically a set of files combined together to serve a specific function. If we visit the npm website, we can search for packages based on the functionality we need. For example, if you search for 'date format' with support for multiple locales, we get 1171 packages (as on the date this article was written), with the topmost package being 'moment'. In short, if you are thinking of a piece of functionality to build, there is a high chance that a package for it is already available on NPM. As of January 2021, the registry holds 1,493,231 packages.

NPM is used for building lightweight projects that can be easily shared across multiple development teams without the dependencies themselves being shared. It allows free use of resources and installs the dependencies only when needed.

Install npm

Node/npm can be installed on Mac, Windows or Linux. Let's go through the steps involved in installing it on Mac; the approach is similar for the other operating systems. For Linux, refer to the official instructions.

Navigate to the Node.js download page and you will see the download section. Under the downloads, the "other downloads" link lists the different operating systems; download the one specific to your operating system. Double-click on the node-v**.**.*.pkg file to install node. It is a simple wizard with straightforward steps.

npm is installed with Node.js. If you are wondering why we are installing node instead of NPM, this is because the Node.js installer bundles NPM along with it.

How to check that you have node and npm installed

The most common way to check whether node or npm is installed is from the terminal. Open the terminal and type 'node -v'; you should see the same version that was displayed in the installation wizard. The same applies to npm, i.e. you can type 'npm -v' in the terminal.

On versions - a note on npm versions, Node versions and Long-Term Support

When we download Node.js from the official site, we have two options, LTS and Current, pointing to different versions, with Current on the higher version. LTS stands for 'long-term support', while Current is the version that has the latest features and offers support for 6 months. After 6 months, odd-numbered releases become unsupported and even-numbered ones are given LTS status, with support for 30 months. So for all production applications one should rely on Active LTS or Maintenance LTS releases; Current can be used for trainings or by source contributors, i.e. library authors.
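Before moving on to version managers, here is an illustration of the package.json file described earlier. This is only a minimal sketch; the project name, versions and the 'moment' dependency are examples, not part of the original article:

{
  "name": "my-sample-app",
  "version": "1.0.0",
  "description": "A small example project",
  "main": "index.js",
  "scripts": {
    "start": "node index.js"
  },
  "dependencies": {
    "moment": "^2.29.1"
  }
}

Running 'npm install' against a file like this downloads the listed dependencies into the project's node_modules folder.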
Use a Node.js version manager

Imagine you are working on an enterprise application for an organization which uses a specific Node LTS version, and at the same time on another app (it could be your pet project) for which you prefer the latest version. How can we have two different versions of Node on the same machine?

To achieve this we have NVM, a Node.js version manager. The official project supports Mac and Linux; Windows users can use a separate NVM port. Follow the installation steps to install NVM on your machine.

To verify that NVM is installed correctly, open the terminal and type 'nvm --version'. Type 'nvm list' to display all the Node.js versions installed on your machine; for now you should see only one version. Say you want to install an older version of Node.js, say 12: type 'nvm install 12' in the terminal and it will install Node.js 12 for you. Now type 'nvm list' to see both versions of node available for use. To switch to a specific version, type 'nvm use 12', and to check that it is the active one, type 'node -v'. Now you are good to go ahead with your project on that specific version of Node.js.

Conclusion

NPM is one of the world's largest software registries. Source contributors and developers from across the world use npm to share and consume packages, and many organizations use npm for private development as well. NPM has three components: the website, the Command Line Interface and the registry. We used the website to identify a package for 'date format' above; private packages can be set up alongside public ones as well. The Command Line Interface is run from the terminal and is what most developers use, and the registry is the public database of JavaScript software.

Node.js - Net Module

Node.js has a 'net' module which provides an asynchronous network API for creating stream-based TCP/IPC servers and clients. It can be accessed using:

const net = require('net');

To create a TCP/IPC based server, we use the createServer method:

var server = net.createServer();

The 'server' object is of type net.Server. Let's explore a few properties, events and methods on this class.

First and foremost, the method needed is 'listen', which asynchronously starts the server listening for connections and fires the 'listening' event:

server.listen(9000, () => {
  console.log('opened server on port: ', 9000);
});

To find out on which address a server is running, we can use the address() method on the net.Server instance. If we need to log the port on which the server is running, we can get this info without hardcoding it:

server.listen(9000, () => {
  console.log('opened server on %j', server.address().port);
});

The first parameter of listen is the port on which the server starts listening, and the second is a callback which gets called once it has started listening. A few of the common errors raised are:

ERR_SERVER_ALREADY_LISTEN – the server is already listening and hasn't been closed.
EADDRINUSE – another server is already listening on the given port/handle/path.

Whenever an error happens, an 'error' event is raised. We can hook into it and handle the errors accordingly:

server.on('error', (e) => {
  if (e.code === 'EADDRINUSE') {
    console.log('Address in use, retrying...');
    setTimeout(() => {
      server.close();
      server.listen(PORT, HOST);
    }, 1000);
  }
});

Whenever a client connects to this server, a 'connection' event is raised, and in the callback we get hold of the client socket for communicating data:

server.on("connection", (socket) => {
  console.log("new client connection is made");
});

The second parameter is a callback which receives the connection object; this client object is of type 'net.Socket'. To get details like the address and port, we can rely on the remoteAddress and remotePort properties respectively:

server.on("connection", (socket) => {
  console.log("Client connection details - ", socket.remoteAddress + ":" + socket.remotePort);
});

Let's assume we are developing an application server, like a bot, which needs to take input from clients and respond to them. We can get hold of the client object and send messages to it from the server. As soon as the client is connected, we can send a sample message confirming the successful connection:

server.on("connection", (socket) => {
  console.log("Client connection details - ", socket.remoteAddress + ":" + socket.remotePort);
  socket.write('SERVER: Hello! Connection successfully made.');
});

Now if any data is sent by the client, we can capture it on the server by subscribing to the 'data' event on the client socket object:

socket.on('data', (data) => {
  console.log(data.toString()); // since data is streamed in bytes, toString is used.
});

Some of the most commonly used events on 'net.Socket' are data, error and close. As the names suggest, 'data' is for listening to any data sent, 'error' is raised when an error happens, and 'close' is raised when a connection is closed, which happens once.
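Since 'data' hands you Buffer chunks by default, one small convenience worth knowing is setting the encoding on the socket so the chunks arrive as strings. This is only a minimal sketch and is not part of the server we build below:

server.on('connection', (socket) => {
  socket.setEncoding('utf8'); // 'data' now emits strings instead of Buffers
  socket.on('data', (chunk) => {
    console.log('received:', chunk); // no toString() needed
  });
});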
Here is an example in a server.js file:

const net = require('net');
var server = net.createServer();
server.on("connection", (socket) => {
  console.log("new client connection is made", socket.remoteAddress + ":" + socket.remotePort);
  socket.on("data", (data) => {
    console.log(data.toString());
  });
  socket.once("close", () => {
    console.log("client connection closed.");
  });
  socket.on("error", (err) => {
    console.log("client connection got errored out.")
  });
  socket.write('SERVER: Hello! Connection successfully made.');
});
server.on('error', (e) => {
  if (e.code === 'EADDRINUSE') {
    console.log('Address in use, retrying...');
    setTimeout(() => {
      server.close();
      server.listen(PORT, HOST);
    }, 1000);
  }
  else {
    console.log("Server failed.")
  }
});
server.listen(9000, () => {
  console.log('opened server on %j', server.address().port);
});

The 'net' module also has another class, net.BlockList. It helps in controlling or disabling inbound or outbound traffic based on rules for specific IP addresses, IP ranges or IP subnets. Here is an example snippet from the documentation:

const blockList = new net.BlockList();
blockList.addAddress('123.123.123.123');
blockList.addRange('10.0.0.1', '10.0.0.10');
blockList.addSubnet('8592:757c:efae:4e45::', 64, 'ipv6');

console.log(blockList.check('123.123.123.123')); // Prints: true
console.log(blockList.check('10.0.0.3')); // Prints: true
console.log(blockList.check('222.111.111.222')); // Prints: false

// IPv6 notation for IPv4 addresses works:
console.log(blockList.check('::ffff:7b7b:7b7b', 'ipv6')); // Prints: true
console.log(blockList.check('::ffff:123.123.123.123', 'ipv6')); // Prints: true

Now that we have the server up and running, we can build a client to connect to it and start exchanging bi-directional data. This client could be another Node.js application, a Java/C# application working with TCP sockets, an ASP.NET MVC application talking to the Node.js TCP server, or any other client application, as long as it supports TCP-based communication. Since we are talking about the 'net' module, let's build the client application with it as well; it supports TCP-based communication too.

The 'net' module has a factory function called 'createConnection' which immediately creates a socket and establishes a connection with the server running on the specified port. Let's create another client.js file and create a connection:

const net = require('net');
const client = net.createConnection({ port: 9000 }, () => {
  console.log('CLIENT: I connected to the server.');
});

The first parameter contains the details of the server. Since we are running the server locally, providing the port number is enough, as the default host for TCP connections is localhost. The second parameter is the callback called once the connection is successfully made with the server. The returned value is of type net.Socket, which we have learnt about earlier.

Let's hook into the 'data' event and log the information sent by the server:

client.on('data', (data) => {
  console.log(data.toString());
  client.end();
});

Here we are not persisting the TCP connection; we end it once we receive a message from the server. We can subscribe to the 'end' event and handle any cleanup needed.
client.on('end', () => {
  console.log('CLIENT: I disconnected from the server.');
})

The output on the client terminal should be:

CLIENT: I connected to the server.
SERVER: Hello! Connection successfully made.
CLIENT: I disconnected from the server.

The output on the server terminal will be:

new client connection is made ::ffff:127.0.0.1:51680
CLIENT: Hello this is client!
client connection closed.

In case we want the client instance to stay alive as long as the server is, we can comment out the 'client.end()' call. Any message typed in the terminal can then be processed and sent to the server. For reading text from the terminal we use the readline module. Here is a complete example:

const net = require('net');
const readline = require('readline');
const rl = readline.createInterface({
  input: process.stdin,
  output: process.stdout
});
const client = net.createConnection({ port: 9000 }, () => {
  console.log('CLIENT: I connected to the server.');
  client.write('CLIENT: Hello this is client!');
});
client.on('data', (data) => {
  console.log(data.toString());
  //client.end();
});
client.on('end', () => {
  console.log('CLIENT: I disconnected from the server.');
})
rl.on('line', (input) => {
  client.write(`CLIENT: ${input}`);
});

Both client and server can now communicate: when we type any text in the client terminal it is sent to the server, and the server can respond back to the client via its terminal.

Conclusion

Websockets help in creating a full-duplex connection for sending messages from client to server and vice versa; some of the real-time use cases you may be familiar with are chat apps, IoT devices and so on. The Node.js net module helps you easily create a server application which can communicate with any type of client, be it a web browser, mobile app, IoT device, Node.js client, or anything that speaks TCP, wherever the messaging needs to be bi-directional over streams. The 'net' module can also be used to communicate among child processes within a Node.js server.
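As a rough sketch of that last point, the same createServer and createConnection APIs accept a filesystem path instead of a port, which gives an IPC channel between processes on one machine. The /tmp/echo.sock path below is only an example (on Windows a named pipe path would be used):

const net = require('net');

// IPC server listening on a Unix domain socket path instead of a TCP port
const server = net.createServer((socket) => {
  socket.on('data', (data) => socket.write('echo: ' + data));
});

// note: delete a stale /tmp/echo.sock from a previous run if listen fails with EADDRINUSE
server.listen('/tmp/echo.sock', () => {
  // a client in the same (or another) process connects by path
  const client = net.createConnection({ path: '/tmp/echo.sock' }, () => {
    client.write('hello over IPC');
  });
  client.on('data', (data) => {
    console.log(data.toString()); // echo: hello over IPC
    client.end();
    server.close();
  });
});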

How to Create MongoDB Database in Node.js

MongoDB is an open-source database written in C++. It is a document database designed for ease of development and scaling: cross-platform, document-oriented and non-structured. This article is a quick guide on how to create a MongoDB database in Node.js. We will look at MongoDB and language compatibility, explore the Mongo shell, see how to import data into the database, how to use the MongoDB driver for Node.js, and perform Node.js based CRUD operations. We will conclude by building a simple Node.js Express app that uses MongoDB to store and serve content.

How MongoDB is different from relational databases

Relational databases are SQL-based and organised in tables, with each entry containing a specific set of columns of specific data types. Relational schemas can become complex even for simple domains, for example an employee who has more than one phone number, email ID, address or team, with all the related data stored in different tables. To retrieve the complete information for an employee, multiple tables need to be joined and queried. Another disadvantage is that to update or change the schema we need to modify multiple tables, and as requirements change more updates are needed, further complicating the design; this often makes a dedicated DB admin necessary.

Mongo is a NoSQL database and is object-based, containing an entire entity in a single document record. Unlike a relational database, a NoSQL db can be queried faster, and it is easy to get the complete details of a specific entity or document. Another feature of Mongo is that if no document matches a query, no error is returned. In a relational database a table contains the same set of fields for every row, whereas in MongoDB the schema is dynamic and each document need not conform to the same schema. Mongo is developer friendly, and the JSON representation makes it easy to parse.

MongoDB stores documents in collections. Collections are analogous to tables in relational databases, and databases hold collections of documents. In addition to collections, MongoDB supports:

Read-only Views (starting in MongoDB 3.4)
On-Demand Materialized Views (starting in MongoDB 4.2)

MongoDB data structure

Documents are JSON objects, stored as BSON (Binary JSON) [for the BSON spec, see bsonspec.org]. The value of a field can be any of the BSON data types, including other documents, arrays, and arrays of documents. For example, the following document contains values of varying types:

var mydoc = {
  _id: ObjectId("3423jksdhfi83p9fj90cd"),
  name: { firstName: "Sachin", lastName: "Tendulkar" },
  birth: new Date('Apr 24, 1973'),
  awards: ["Bharat Ratna", "Padma Vibhushan", "Padma Shri"],
  followers: NumberLong(30000000)
}

_id holds an ObjectId.
name holds an embedded document that contains firstName and lastName.
birth holds a value of the Date type.
awards holds an array of strings.
followers holds a value of the NumberLong type.

Note: Mongo supports nesting up to 100 levels. Learn more about BSON types at https://docs.mongodb.com/manual/reference/bson-types/

Setup MongoDB environment

For the macOS platform, download the tgz file and extract it. The easiest way to get the binaries in your path is to copy them to the /usr/local/bin directory. Once that is done, type 'mongod' (the mongo daemon) in a command prompt and you should see that it has started. Invoke another command prompt and type 'mongo' to start off in the shell.
To make sure everything is working, let's insert some dummy data and observe that it inserted one document:

db.users.insert({"name": "Sachin Tendulkar"})

The database is all set up.

For the Windows platform, download the msi and run the installer to complete the installation. Once the installation is done, you can find where it has been installed; the two important things in the bin folder are mongo and mongod. Copy the path and add it to the path in your environment variables. Once this has been done, we should be able to run both the Mongo server and the Mongo shell from the command prompt.

Invoke the command prompt and run 'md \data', and then create the data DB directory with 'md \data\db'. This is the folder Mongo looks for to store information about the database. Now type 'mongod' (the mongo daemon) in the command prompt and you can see it has started. Invoke another command prompt and type 'mongo' to start off in the shell. To make sure everything is working, insert some dummy data and observe that it inserted one document:

db.users.insert({"name": "Sachin Tendulkar"})

The database is all set up.

The recommended way to get started with the Node.js driver is by using NPM (Node Package Manager) to install the dependency in your project. After you've created your project with npm init, you can install the MongoDB driver and its dependencies with the command:

npm install mongodb --save

This will download the MongoDB driver and add a dependency entry in your package.json file. We will learn more of this as we go forward.

Compatibility

MongoDB compatibility (note: the driver does not support older versions of MongoDB):

Node.js driver >= 3.x: MongoDB 2.6, 3.0, 3.2, 3.4, 3.6, 4.0 and 4.2
Node.js driver >= 3.2.1: MongoDB 2.6, 3.0, 3.2, 3.4, 3.6 and 4.0
Node.js driver >= 3.1: MongoDB 2.6, 3.0, 3.2, 3.4, 3.6 and 4.0
Node.js driver >= 3.0: MongoDB 2.6, 3.0, 3.2, 3.4 and 3.6
Node.js driver >= 2.2.12: MongoDB 2.6, 3.0, 3.2 and 3.4

Language compatibility:

Node.js driver >= 3.x: Node.js v4.x, v6.x, v8.x, v10.x and v12.x
Node.js driver 2.2.x – 2.0.x: Node.js v0.10.x, v0.12.x, v4.x, v6.x, v8.x and v10.x
Node.js driver >= 1.4.18: Node.js v0.8.x, v0.10.x and v0.12.x
Node.js driver 1.4.x: Node.js v0.8.x and v0.10.x

A few salient points

When designing the schema for your database, a quick understanding of embedded documents vs references is helpful; you can refer to Coderwall for more information on this. Three other features that Mongo provides, indexing, sharding and replication, can make Mongo the right choice for your application:

Indexing – supports ad hoc queries; 64 indices per collection; single field, compound (multiple fields) and unique indexes.
Sharding – scalability; partitions data onto different machines; scales the application across smaller systems; auto-sharding is supported by Mongo but can be challenging to set up.
Replication – reliability (a primary and n number of secondary servers); maximizes uptime; replica sets; automatic failover.

Explore the Mongo shell

Note: all the examples and snapshots are taken on macOS.

Invoke a terminal and type 'mongod', which will display the listening port 27017. Since we are running locally, we need not secure or optimize it. Invoke another terminal (Cmd + T) and type 'mongo'; you should see that we are now connected to mongodb://127.0.0.1:27017. By default we are connected to the test database; type 'db' in the terminal to check it and observe 'test'.

Let us use a different database, say 'MyNewMongoDB':

use MyNewMongoDB

Output: switched to db MyNewMongoDB

To view all available dbs, type 'show dbs'. Note that we will not yet see the new db 'MyNewMongoDB'; the db is only created when the first document is created. Mongo stores data in databases which are separated into collections of documents. To create a new collection, we insert a document and the collection is created as a side effect.
Type:

db.cricketers.insert({"name": "Sachin"})

Now check 'show dbs' to view our new db, and type 'show collections' to check that our collection is created. As you can see, Mongo doesn't require any schema or setup work to start working with data.

The commands we used are very similar to what we have in SQL, but the mongo shell uses a JavaScript interpreter, so we can interact with the database using JavaScript expressions, making it easy to perform various tasks. print('test') will print test, and after var arr = ["one", "two", "three"], typing arr will print the array. Let us take this option to explore further, i.e. insert 1000 records into the db using a for loop:

for (var i = 0; i < 1000; i++) { db.cricketers.insert({"count": i}) }

The basic CRUD operations in the shell map as follows: Create --> insert; Read --> find or findOne; Update --> update; Delete --> delete.

Node.js based CRUD operations

Let us set up a simple application using Node.js and MongoDB to understand how to set up the driver and perform simple CRUD operations. Follow the steps below to set up the project:

mkdir learnMongoDB
cd learnMongoDB
npm init [you will be prompted for inputs related to package.json]
npm install mongodb --save [--save updates package.json]

Assuming the Mongo server is still running (if not, start it as mentioned earlier), open the learnMongoDB directory in your favourite IDE (I am using Visual Studio Code) and add a file app.js with the below snippet of code:

// import the MongoDB driver and get the client object.
const MongoClient = require('mongodb').MongoClient;
const assert = require('assert');

// Connection URL - can be obtained in the MongoDB server
const url = 'mongodb://localhost:27017';

// Database Name
const dbName = 'myproject';

// Create a new MongoClient
const client = new MongoClient(url);

// Use connect method to connect to the Server
client.connect(function(err) {
  assert.equal(null, err);
  console.log("Connected successfully to server");
  const db = client.db(dbName);
  client.close();
});

Observe that we are not providing the db name in the URL; instead we get the db instance after successfully connecting to the server. This way we can connect to different databases from the same client instance.

Note: we are using the assert module to check whether the connection is successful by validating the error object. It will be used for the other CRUD operations as well.

Run 'node app.js' and you should observe the log mentioning that we connected successfully to the server.

Insertion: Now that we have the db object, let's get hold of the collection and insert some objects into it. Create a function 'insertDocuments' to perform insertion into the collection, and call it in the callback of the successful connection:

// insertDocuments takes the db instance
const insertDocuments = function(db, callback) {
  // Get the documents collection
  const collection = db.collection('documents');
  // Insert some documents
  collection.insertMany([
    {a: 1}, {a: 2}, {a: 3}
  ], function(err, result) {
    assert.equal(err, null); // assertion to check if any error is there.
    assert.equal(3, result.result.n);
    assert.equal(3, result.ops.length);
    console.log("Inserted 3 documents into the collection");
    callback(result); // return the inserted documents
  });
}

// Use connect method to connect to the Server
client.connect(function(err) {
  assert.equal(null, err);
  console.log("Connected successfully to server");
  const db = client.db(dbName);
  // insert the documents and in the callback, close the client.
  insertDocuments(db, function(result) {
    console.log(result);
    client.close();
  })
});

In the callback of insertDocuments we get the result object, which looks as shown below:

{ result: { ok: 1, n: 3 },
  ops: [ { a: 1, _id: 5eeefd2a16f61641f58a9418 },
    { a: 2, _id: 5eeefd2a16f61641f58a9419 },
    { a: 3, _id: 5eeefd2a16f61641f58a941a } ],
  insertedCount: 3,
  insertedIds:
   { '0': 5eeefd2a16f61641f58a9418,
     '1': 5eeefd2a16f61641f58a9419,
     '2': 5eeefd2a16f61641f58a941a } }

As you can see, the result object gives an overview of the action. ops contains the objects added to the collection, and the other fields are self-explanatory.

Note: You might have noticed that a new field '_id' has been added, which is generated by MongoDB for unique identification. If we are not happy with the generated id, we can supply our own, as we did while importing the data.

Find: Let's check whether the documents we added can be retrieved using find. The collection object has a method called 'find' which takes a search query parameter to identify the objects; if it is empty, all documents are returned. Invoke findDocuments in the callback of the insertion to validate the records inserted. We can use a query filter to get specific items as well.

// findDocuments from the db instance
const findDocuments = function(db, callback) {
  // Get the documents collection
  const collection = db.collection('documents');
  // Finds all documents. If required, parameterize the query filter to get specific items.
  collection.find({}).toArray(function(err, docs) {
    assert.equal(err, null);
    console.log("Found the following records");
    console.log(docs)
    callback(docs);
  });
}

// Use connect method to connect to the Server
client.connect(function(err) {
  assert.equal(null, err);
  console.log("Connected successfully to server");
  const db = client.db(dbName);
  // insert the documents and in the callback, close the client.
  insertDocuments(db, function(result) {
    findDocuments(db, function() {
      client.close();
    })
  })
});

For example, if we want to find a specific set of documents based on a pre-defined query filter like {'a': 3}:

// Find specific documents with a query filter
const findDocuments = function(db, callback) {
  // Get the documents collection
  const collection = db.collection('documents');
  // Find some documents
  collection.find({'a': 3}).toArray(function(err, docs) {
    assert.equal(err, null);
    console.log("Found the following record(s)");
    console.log(docs);
    callback(docs);
  });
}

Updation: Now let's update a specific document. There are various methods available for updating, like update, updateMany, updateOne etc. We use the updateOne method to update the first document which matches the query.

Signature: updateOne(filter, update, options, callback)

Let's send the filter as { a: 2 }, i.e. the document whose field a equals 2, and the update as $set: { b: 1 } to avoid replacing the entire object, i.e. adding a new field b to the document with its value set to 1. We ignore options, which is optional, and pass a callback to capture the result.
const updateDocument = function(db, callback) {
  // Get the documents collection
  const collection = db.collection('documents');
  // Update document where a is 2, set b equal to 1
  collection.updateOne({ a: 2 }, { $set: { b: 1 } }, function(err, result) {
    assert.equal(err, null);
    assert.equal(1, result.result.n);
    console.log("Updated the document with the field a equal to 2");
    callback(result);
  });
}

// Use connect method to connect to the Server
client.connect(function(err) {
  assert.equal(null, err);
  console.log("Connected successfully to server");
  const db = client.db(dbName);
  // insert the documents and in the callback, close the client.
  insertDocuments(db, function(result) {
    findDocuments(db, function() {
      updateDocument(db, function() {
        client.close();
      })
    })
  })
});

Deletion: Let's use the deleteOne method to find and delete the first item whose field a has the value 3.

const removeDocument = function(db, callback) {
  // Get the documents collection
  const collection = db.collection('documents');
  // Delete document where a is 3
  collection.deleteOne({ a: 3 }, function(err, result) {
    assert.equal(err, null);
    assert.equal(1, result.result.n);
    console.log("Removed the document with the field a equal to 3");
    callback(result);
  });
}

// Use connect method to connect to the Server
client.connect(function(err) {
  assert.equal(null, err);
  console.log("Connected successfully to server");
  const db = client.db(dbName);
  // insert the documents and in the callback, close the client.
  insertDocuments(db, function(result) {
    findDocuments(db, function() {
      updateDocument(db, function() {
        removeDocument(db, function() {
          client.close();
        })
      })
    })
  })
});

Note: all the methods on the collection return a promise object if a callback is not passed. So to avoid the callback hell we see above, we can rely on promises and structure the code better.

Node express app with MongoDB to store and serve content

Now that we have a fair idea of how to work with MongoDB using the mongo shell and the Node.js driver, let us build a simple Express app that uses MongoDB to store and serve content.

Note: since this topic is more about MongoDB than Express, we will keep the Express app a basic version.

Let us comment out all the code in the app.js file from the previous example. We should be left with this:

// import the MongoDB driver and get the client object.
const MongoClient = require('mongodb').MongoClient;
const { equal } = require('assert');

// Connection URL - can be obtained in the MongoDB server
const url = 'mongodb://localhost:27017';

// Database Name
const dbName = 'myproject';

Invoke a terminal, navigate to the directory where the app.js file is located, and install express with the command:

npm install express --save

Import express and body-parser as shown below:

const express = require('express');
const bodyParser = require('body-parser');

Create an app instance and use bodyParser.json():

const app = express();
app.use(bodyParser.json());

Let the express app listen on port 3000 with a default route:

app.get('/', (req, res) => res.send('Express app with MongoDB!'));
app.listen(3000, () => console.log("App listening at http://localhost:3000"));

Consolidated code:

const express = require('express');
const bodyParser = require('body-parser');

// import the MongoDB driver and get the client object.
const MongoClient = require('mongodb').MongoClient;
const { equal } = require('assert');

// Connection URL - can be obtained in the MongoDB server
const url = 'mongodb://localhost:27017';

// Database Name
const dbName = 'myproject';

const app = express();
app.use(bodyParser.json());

app.get('/', (req, res) => res.send('Express app with MongoDB!'));
app.listen(3000, () => console.log("App listening at http://localhost:3000"));

In the terminal, run 'node app.js' and you should see the console log that the app is listening on port 3000. To check that the default route is working correctly, invoke Postman and perform a GET operation.

Now let us create a new route to retrieve a document based on field a and a parameterized value:

app.get('/api/documents/:name', (req, res) => {
})

:name is the parameter being passed, so let's retrieve it from the request parameters. Since we know it is numeric and not a string, let's parse it to an integer:

const name = parseInt(req.params.name);

Now let's connect to the MongoClient as earlier. This time we will not use the callback approach but rely on the promise being returned:

const client = await MongoClient.connect(url, { useNewUrlParser: true });

Since the returned object is a promise, we need to use the await keyword so that the promise is resolved or rejected before proceeding further.

Note: a promise object has two properties, state and result. The state is initially 'pending' and after execution becomes either 'fulfilled' (resolve) or 'rejected' (reject). Similarly, the result is initially undefined and then changes to a value [resolve(value)] or an error [reject(error)].

Let us get the db 'myproject' and find the first document matching the query filter from the collection using the findOne method:

const db = client.db(dbName);
const document = await db.collection('documents').findOne({ "a": name });

Once we get the document, let's return status 200 with the obtained document and close the client connection:

res.status(200).json(document);
client.close();

As we are using the await keyword, we need to add the async keyword to the callback. Let's also add a try-catch block to return an internal server error status in case of a connection failure:

app.get('/api/documents/:name', async (req, res) => {
  try {
    const name = parseInt(req.params.name);
    const client = await MongoClient.connect(url, { useNewUrlParser: true });
    const db = client.db(dbName);
    const document = await db.collection('documents').findOne({ "a": name });
    res.status(200).json(document);
    client.close();
  } catch (error) {
    res.status(500).json({ message: "Error connecting to db", error });
  }
})

Now we are good to restart our express app: press Ctrl + C and run node app.js again to see that the app is listening. Make sure the MongoDB server is up and running in the background. Switch to Postman and perform a GET operation; this should return the matching record in the collection as shown below:

{"_id":"5eeefd2a16f61641f58a9419","a":2,"b":1}

The same GET request can be performed for other documents by changing the name parameter. Similar to the GET operation, we can configure routes to perform the other CRUD operations on MongoDB to store and serve content.
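As noted earlier, the collection methods return promises when no callback is passed. A minimal sketch of the same insert/find/update/delete flow rewritten with async/await (assuming the same url, dbName and 'documents' collection as above) could look like this:

const { MongoClient } = require('mongodb');

async function run() {
  const client = await MongoClient.connect(url, { useNewUrlParser: true });
  try {
    const collection = client.db(dbName).collection('documents');

    // same operations as the callback-based helpers, but flattened with await
    await collection.insertMany([{ a: 1 }, { a: 2 }, { a: 3 }]);
    const docs = await collection.find({}).toArray();
    console.log('Found:', docs);

    await collection.updateOne({ a: 2 }, { $set: { b: 1 } });
    await collection.deleteOne({ a: 3 });
  } finally {
    await client.close();
  }
}

run().catch(err => console.error(err));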
Conclusion

In this guide we have gone over MongoDB and language compatibility, explored the Mongo shell, seen how to import data into the database, used the MongoDB driver and performed Node.js based CRUD operations. We have also looked at how to build a simple Express app with MongoDB to store and serve content. Armed with this knowledge, you are ready to take a deeper dive into MongoDB and Node.js. Here's wishing you the best!

Real coding requires real exercises. Practice live coding with our programming experts in live workshops now.