FSD Unit I & II

The document provides an overview of Full Stack Development, detailing both front-end and back-end technologies, including key frameworks and languages like Node.js, MongoDB, Express, React, and Angular. It explains the roles of users, browsers, web servers, and backend services in web applications, emphasizing the importance of JavaScript across all stacks. Additionally, it introduces the Node.js-to-AngularJS stack, highlighting its components and their functions in building modern web applications.


UNIT-I

Introduction to Full Stack Development:


Understanding the Basic Web Development Framework - User, Browser, Webserver, Backend Services;
Full Stack Components - Node.js, MongoDB, Express, React, Angular. JavaScript Fundamentals.
Node.js - Understanding Node.js, Installing Node.js, Working with Node Packages, Creating a
Node.js Application, Understanding the Node.js Event Model, Adding Work to the Event Queue,
Implementing Callbacks.

Full Stack Development refers to the development of both the front end (client side) and back
end (server side) portions of web applications.
Full Stack Web Developers
Full Stack web developers have the ability to design complete web applications and websites. They
work on the front end, back end, database, and debugging of web applications or websites.

Front-end Development
It is the visible part of a website or web application and is responsible for the user experience. The
user directly interacts with the front-end portion of the web application or website.
Front-end Technologies
The front-end portion is built using the languages discussed below:
 HTML: HTML stands for HyperText Markup Language. It is used to design the front-end
portion of web pages. HTML is a combination of hypertext and markup language: hypertext
defines the links between web pages, and the markup language is used to define the text of the
document within tags, which define the structure of the web page.
 CSS: Cascading Style Sheets, fondly referred to as CSS, is a simply designed language
intended to simplify the process of making web pages presentable. CSS allows you to apply
styles to web pages. More importantly, CSS enables you to do this independent of the HTML
that makes up each web page.
 JavaScript: JavaScript is a popular scripting language used to make sites interactive for the
user. It is used for everything from enhancing the functionality of a website to running games
and web-based software.
Front End Libraries and Frameworks
 AngularJS: AngularJS is an open-source JavaScript front-end framework that is mainly used
to develop single-page applications (SPAs). It is a continuously growing and expanding
framework which provides better ways of developing web applications. It changes static
HTML to dynamic HTML, can be freely used and changed by anyone, and extends HTML
attributes with directives, with data bound to the HTML.
 React.js: React is a declarative, efficient, and flexible JavaScript library for building user
interfaces. ReactJS is an open-source, component-based front end library responsible only
for the view layer of the application. It is maintained by Facebook.
 Bootstrap: Bootstrap is a free and open-source tool collection for creating responsive
websites and web applications. It is the most popular HTML, CSS, and JavaScript
framework for developing responsive, mobile-first web sites.
 jQuery: jQuery is an open source JavaScript library that simplifies the interactions between
an HTML/CSS document, or more precisely the Document Object Model (DOM), and
JavaScript. Elaborating the terms, jQuery simplifies HTML document traversing and
manipulation, browser event handling, DOM animations, Ajax interactions, and cross-
browser JavaScript development.
 SASS: It is the most reliable, mature, and robust CSS extension language. It is used to extend
the functionality of a site's existing CSS with features such as variables, inheritance, and
nesting.
 Some other libraries and frameworks are: Semantic-UI, Foundation, Materialize, Backbone.js, Ember.js, etc.
Other Important Points
 Work with text editors and their shortcuts and facilities, e.g. Visual Studio, Atom, Sublime, etc.
 Make the UI responsive using a grid system.
 Use Git and Git commands like init, add, and commit for version control and for working with a team.
 Use other tools like the npm and yarn package managers, the Sass CSS pre-processor, and browser
DevTools, e.g. Chrome DevTools.
 Understand how to use HTTP, JSON, and GraphQL APIs to fetch data using axios or other tools.
 Front-end work also requires some design skill to make layouts look better.
Back-end Technologies
It refers to the server-side development of a web application or website, with a primary focus on
how the website works. It is responsible for managing the database through queries and APIs issued
by client-side commands. A web application mainly consists of three parts: the front end, the back
end, and the database. The back-end portion is built using the libraries, frameworks, and languages
discussed below:
 PHP: PHP is a server-side scripting language designed specifically for web development.
Since PHP code is executed on the server, it is called a server-side scripting language.
 C++: It is a general-purpose programming language, widely used nowadays for
competitive programming. It is also used as a back-end language.
 Java: Java is one of the most popular and widely used programming languages and platforms.
It is highly scalable, and Java components are easily available.
 Python: Python is a programming language that lets you work quickly and integrate systems
more efficiently.
 Node.js: Node.js is an open-source, cross-platform runtime environment for executing
JavaScript code outside of a browser. Keep in mind that Node.js is neither a framework nor a
programming language, although it is often mistaken for one or the other. Node.js is commonly
used for building back-end services such as APIs for web apps or mobile apps. It is used in
production by large companies such as PayPal, Uber, Netflix, Walmart, and so on.
 Back End Frameworks: The list of back end frameworks are: Express, Django, Rails,
Laravel, Spring etc.
 Other back-end programming/scripting languages are C#, Ruby, Go, etc. (REST, often listed alongside them, is an API style rather than a language.)
Other Important Points
 Structure data in an efficient way.
 Handle the request-response cycle of APIs for storing and retrieving data.
 Data security is important.
Note: JavaScript is essential in every stack, as it is the dominant technology on the Web.
Database: A database is a collection of inter-related data that supports efficient retrieval,
insertion, and deletion of data and organizes it in the form of tables, views, schemas, reports, etc.
 Oracle: An Oracle database is a collection of data that is treated as a unit. The purpose of
this database is to store and retrieve information related to queries. It is a database server
used to manage information.
 MongoDB: MongoDB, the most popular NoSQL database, is an open source document-
oriented database. The term ‘NoSQL’ means ‘non-relational’. It means that MongoDB isn’t
based on the table-like relational database structure but provides an altogether different
mechanism for storage and retrieval of data.
 SQL: Structured Query Language is a standard Database language which is used to create,
maintain and retrieve the relational database.

Understanding the Basic Web Development Framework


To get you in the right mind-set to understand the benefits of utilizing Node.js, MongoDB, and
AngularJS as your web framework, this section provides an overview of the basic components of
most websites. If you are already familiar with the full web framework, then this section will be old
hat, but if you only understand just the server side or client side of the web framework, then this
section will give you a more complete picture.
The main components of any web framework are the user, browser, webserver, and backend
services. Although websites vary greatly in terms of appearance and behavior, all have these basic
components in one form or another.
This section is not intended to be in-depth, comprehensive, or technically exact but rather a very
high-level perspective of the parts involved in a functional website. The components are described
in a top-down manner, from user down to backend services. Then the next section discusses the
Node.js-to-AngularJS stack from the bottom up, so you can get a picture of where each of the pieces
fits and why. Figure 1.1 provides a basic diagram to help you visualize the components in a
website/web application, which are discussed in the following sections.
Figure 1.1 Basic diagram of the components of a basic website/web application.
Users
Users are a fundamental part of every website; they are, after all, the reason websites exist in the
first place. User expectations define the requirements for developing a good website. User
expectations have changed a lot over the years. In the past, users accepted the slow, cumbersome
experience of the “world-wide-wait,” but not today. They expect websites to behave much more
quickly, like applications installed on their computers and mobile devices.
The user role in a web framework is to sit on the visual output and interaction input of webpages.
That is, users view the results of the web framework processing and then provide interactions using
mouse clicks, keyboard input, and swipes and taps.
The Browser
The browser plays three roles in the web framework:

 Provide communication to and from the webserver


 Interpret the data from the server and render it into the view that the user actually sees
 Handle user interaction through the keyboard, mouse, touchscreen, or other input
device and take the appropriate action
Browser-to-Webserver Communication
Browser-to-webserver communication consists of a series of requests, using the HTTP and HTTPS
protocols. Hypertext Transfer Protocol (HTTP) is used to define communication between the
browser and the webserver. HTTP defines what types of requests can be made as well as the format
of those requests and the HTTP response.
HTTPS adds an additional security layer, SSL/TLS, to ensure secure connections by requiring the
webserver to provide a certificate to the browser. The user can then determine whether to accept the
certificate before allowing the connection.
There are three main types of requests that a browser will make to a webserver:

 GET: The GET request is typically used to retrieve data from the server, such as .html
files, images, or JSON data.
 POST: POST requests are used when sending data to the server, such as adding an item
to a shopping cart or submitting a web form.
 AJAX: Asynchronous JavaScript and XML (AJAX) is actually just a GET or POST
request that is done directly by JavaScript running in the browser. Despite the name, an
AJAX request can receive XML, JSON, or raw data in the response.
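To make this concrete, the following is a minimal sketch of an AJAX-style request issued from JavaScript running in the browser; the endpoint /api/items is a hypothetical URL used only for illustration:

// Classic AJAX: a GET request made directly from browser JavaScript
const xhr = new XMLHttpRequest();
xhr.open('GET', '/api/items');
xhr.onload = () => {
  if (xhr.status === 200) {
    // The response body here happens to be JSON, but it could be XML or raw text
    const items = JSON.parse(xhr.responseText);
    console.log(items);
  }
};
xhr.send();
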
Rendering the Browser View
The screen that the user actually views and interacts with is often made up of several different
pieces of data retrieved from the webserver. The browser reads data from the initial URL and then
renders the HTML document to build a Document Object Model (DOM). The DOM is a tree
structure object with the HTML document as the root. The structure of the tree basically matches
the structure of the HTML document. For example, document will have html as a child,
and html will have head and body as children, and body may have div, p, or other elements as
children, like this:

document

+ html

+ head

+ body

+ div

+p

The browser interprets each DOM element and renders it to the user’s screen to build the webpage
view.
The browser often gets various types of data from multiple webserver requests to build a webpage.
The following are the most common types of data the browser uses to render the final user view as
well as define the webpage behavior:

 HTML files: These provide the fundamental structure of the DOM.


 CSS files: These define how each of the elements on the page is to be styled, in terms
of font, color, borders, and spacing.
 Client-side scripts: These are typically JavaScript files. They can provide added
functionality to a webpage, manipulate the DOM to change the look of the webpage,
and provide any necessary logic required to display the page and provide functionality.
 Media files: Image, video, and sound files are rendered as part of the webpage.
 Data: The webserver can provide data such as XML, JSON, or raw text as a response
to an AJAX request. Rather than send a request back to the server to rebuild the
webpage, new data can be retrieved via AJAX and inserted into the webpage via
JavaScript.
 HTTP headers: HTTP defines a set of headers that the browser can use, as well as
client-side scripts to define the behavior of the webpage. For example, cookies are
contained in the HTTP headers. The HTTP headers also define the type of data in the
request as well as the type of data expected to be returned to the browser.
User Interaction
The user interacts with the browser via mice, keyboards, and touchscreens. A browser has an
elaborate event system that captures user input events and then takes the appropriate actions.
Actions vary from displaying a popup menu to loading a new document from the server to
executing client-side JavaScript.
Webservers
A webserver’s main focus is handling requests from browsers. As described earlier, a browser may
request a document, post data, or perform an AJAX request to get data. The webserver uses HTTP
headers as well as a URL to determine what action to take. This is where things get very different,
depending on the webserver, configuration, and technologies used.
Most out-of-the-box webservers such as Apache and IIS are made to serve static files such
as .html, .css, and media files. To handle POST requests that modify server data and AJAX requests
to interact with backend services, webservers need to be extended with server-side scripts.
A server-side script is really anything that a webserver can execute in order to perform the task the
browser is requesting. These scripts can be written in PHP, Python, C, C++, C#, Perl, Java, ... the
list goes on and on. Webservers such as Apache and IIS provide mechanisms to include server-side
scripts and then wire them up to specific URL locations requested by the browser. This is where
having a solid webserver framework can make a big difference. It often takes quite a bit of
configuration to enable various scripting languages and wire up the server-side scripts so that the
webserver can route the appropriate requests to the appropriate scripts.
Server-side scripts either generate a response directly by executing their code or connect with other
backend servers such as databases to obtain the necessary information and then use that information
to build and send the appropriate responses.
Backend Services
Backend services are services that run behind a webserver and provide data that is used to build
responses to the browser. The most common type of backend service is a database that stores
information. When a request comes in from the browser that requires information from the database
or other backend service, the server-side script connects to the database, retrieves the information,
formats it, and then sends it back to the browser. On the other hand, when data comes in from a web
request that needs to be stored in the database, the server-side script connects to the database and
updates the data.

Understanding the Node.js-to-AngularJS Stack Components


With the basic structure of the web framework fresh in your mind, it is time to discuss the Node.js-
to-AngularJS stack. The most common—and I believe the best—version of this stack is the Node.js-
to-AngularJS stack comprised of MongoDB, Express, AngularJS, and Node.js.
In the Node.js-to-AngularJS stack, Node.js provides the fundamental platform for development. The
backend services and server-side scripts are all written in Node.js. MongoDB provides the data store
for the website but is accessed via a MongoDB driver Node.js module. The webserver is defined by
Express, which is also a Node.js module.
The view in the browser is defined and controlled using the AngularJS framework. AngularJS is an
MVC framework in which the model is made up of JSON or JavaScript objects, the view is
HTML/CSS, and the controller is AngularJS JavaScript code.
Figure 1.2 provides a very basic diagram of how the Node.js to AngularJS stack fits into the basic
website/web application model. The following sections describe each of these technologies and why
they were chosen as part of the Node.js to AngularJS stack. Later chapters in the book will cover
each of the technologies in much more detail.
Figure 1.2 Basic diagram showing where Node.js, Express, MongoDB, and AngularJS fit in the web
paradigm.
Node.js
Node.js is a development framework based on Google's V8 JavaScript engine, which it uses to
execute JavaScript code on the server.
You can write most—or maybe even all—of your server-side code in Node.js, including the
webserver and the server-side scripts and any supporting web application functionality. The fact that
the webserver and the supporting web application scripts are running together in the same server-
side application allows for much tighter integration between the webserver and the scripts. Also, the
webserver can run directly on the Node.js platform as a Node.js module, which means it’s much
easier than using, say, Apache for wiring up new services or server-side scripts.
The following are just a few reasons Node.js is a great framework:

 JavaScript end-to-end: One of the biggest advantages of Node.js is that it allows you
to write both server- and client-side scripts in JavaScript. There have always been
difficulties in deciding whether to put logic in client-side scripts or server-side scripts.
With Node.js you can take JavaScript written on the client and easily adapt it for the
server and vice versa. An added plus is that client developers and server developers are
speaking the same language.
 Event-driven scalability: Node.js applies a unique logic to handling web requests.
Rather than having multiple threads waiting to process web requests, with Node.js they
are processed on the same thread, using a basic event model. This allows Node.js
webservers to scale in ways that traditional webservers can’t.
 Extensibility: Node.js has a great following and very active development community.
People are providing new modules to extend Node.js functionality all the time. Also, it
is very simple to install and include new modules in Node.js; you can extend a Node.js
project to include new functionality in minutes.
 Fast implementation: Setting up Node.js and developing in it are super easy. In only a
few minutes you can install Node.js and have a working webserver.
MongoDB
MongoDB is an agile and very scalable NoSQL database. The name Mongo comes from the word
“humongous,” emphasizing the scalability and performance MongoDB provides. It is based on the
NoSQL document store model, which means data is stored in the database as basically JSON
objects rather than as the traditional columns and rows of a relational database.
MongoDB provides great website backend storage for high-traffic websites that need to store data
such as user comments, blogs, or other items because it is quickly scalable and easy to implement.
This book covers using the MongoDB driver library to access MongoDB from Node.js.
Node.js supports a variety of database access drivers, so the data store can easily be MySQL or
some other database. However, the following are some of the reasons that MongoDB really fits in
the Node.js stack well:
 Document orientation: Because MongoDB is document oriented, data is stored in the
database in a format that is very close to what you deal with in both server-side and
client-side scripts. This eliminates the need to transfer data from rows to objects and
back.
 High performance: MongoDB is one of the highest-performing databases available.
Especially today, with more and more people interacting with websites, it is important
to have a backend that can support heavy traffic.
 High availability: MongoDB’s replication model makes it very easy to maintain
scalability while keeping high performance.
 High scalability: MongoDB’s structure makes it easy to scale horizontally by sharding
the data across multiple servers.
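As a brief illustration, here is a minimal sketch of accessing MongoDB from Node.js with the official MongoDB driver; it assumes the mongodb package is installed (npm install mongodb) and a local MongoDB instance, and the database and collection names are purely illustrative:

const { MongoClient } = require('mongodb');

async function main() {
  const client = new MongoClient('mongodb://localhost:27017');
  await client.connect();
  const comments = client.db('mysite').collection('comments');

  // Documents are stored as JSON-like objects, so no row-to-object mapping is needed
  await comments.insertOne({ user: 'john', text: 'Nice article!', created: new Date() });
  const found = await comments.find({ user: 'john' }).toArray();
  console.log(found);

  await client.close();
}

main().catch(console.error);
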
Express
The Express module acts as the webserver in the Node.js-to-AngularJS stack. Because it runs in
Node.js, it is easy to configure, implement, and control. The Express module extends Node.js to
provide several key components for handling web requests. It allows you to implement a running
webserver in Node.js with only a few lines of code.
For example, the Express module provides the ability to easily set up destination routes (URLs) for
users to connect to. It also provides great functionality in terms of working with HTTP request and
response objects, including things like cookies and HTTP headers.
The following is a partial list of the valuable features of Express:

 Route management: Express makes it easy to define routes (URL endpoints) that tie
directly to the Node.js script functionality on the server.
 Error handling: Express provides built-in error handling for “document not found”
and other errors.
 Easy integration: An Express server can easily be implemented behind an existing
reverse proxy system, such as Nginx or Varnish. This allows you to easily integrate it
into your existing secured system.
 Cookies: Express provides easy cookie management.
 Session and cache management: Express also enables session management and cache
management.
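As a small sketch of the route management and cookie handling mentioned above (assuming Express is installed; the route and cookie names are illustrative):

const express = require('express');
const app = express();

// A route with a URL parameter
app.get('/users/:id', (req, res) => {
  res.cookie('lastUser', req.params.id); // set a cookie on the response
  res.send(`User profile for ${req.params.id}`);
});

// Fallback for any unmatched route
app.use((req, res) => res.status(404).send('Not Found'));

app.listen(3000);
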
AngularJS
AngularJS is a client-side framework developed by Google. It provides all the functionality needed
to handle user input in the browser, manipulate data on the client side, and control how elements are
displayed in the browser view. It is written in JavaScript, with a reduced jQuery library. The theory
behind AngularJS is to provide a framework that makes it easy to implement web applications using
the MVC framework.
Other JavaScript frameworks could be used with the Node.js platform, such as Backbone, Ember,
and Meteor. However, AngularJS has the best design, feature set, and trajectory at this writing. Here
are some of the benefits AngularJS provides:

 Data binding: AngularJS has a very clean method for binding data to HTML elements,
using its powerful scope mechanism.
 Extensibility: The AngularJS architecture allows you to easily extend almost every
aspect of the language to provide your own custom implementations.
 Clean: AngularJS forces you to write clean, logical code.
 Reusable code: The combination of extensibility and clean code makes it very easy to
write reusable code in AngularJS. In fact, the language often forces you to do so when
creating custom services.
 Support: Google is investing a lot into this project, which gives it an advantage over
similar initiatives that have failed.
 Compatibility: AngularJS is based on JavaScript and has a close relationship with
jQuery. This makes it easier to begin integrating AngularJS into your environment and
reuse pieces of your existing code within the structure of the AngularJS framework.
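As a taste of the data binding described above, here is a minimal sketch of two-way binding in AngularJS 1.x; it assumes the AngularJS script is loaded from a CDN, and the bound property name is illustrative:

<!DOCTYPE html>
<html ng-app>
<head>
  <script src="https://ajax.googleapis.com/ajax/libs/angularjs/1.8.2/angular.min.js"></script>
</head>
<body>
  <!-- ng-model binds the input value to the scope property "name";
       the {{ }} expression re-renders automatically as the user types -->
  <input type="text" ng-model="name" placeholder="Enter a name">
  <p>Hello, {{ name }}!</p>
</body>
</html>
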

JAVASCRIPT FUNDAMENTALS:

JavaScript is a powerful language with a rich set of features that enable developers to create
dynamic and interactive web applications. Understanding its fundamentals is essential for anyone
looking to work in web development.
JavaScript programs can be inserted almost anywhere into an HTML document using
the <script> tag.

For instance:

<!DOCTYPE HTML>
<html>
<body>
<p>Before the script...</p>
<script>
alert( 'Hello, world!' );
</script>
<p>...After the script.</p>
</body>
</html>

1. Syntax and Basics

 Variables: Used to store data values.

var name = "John";
let age = 30;
const isStudent = true;

 Data Types: Common data types include strings, numbers, booleans, null, undefined,
objects, and arrays.

let message = "Hello, World!"; // String
let count = 42; // Number
let isActive = true; // Boolean
let student = null; // Null
let course; // Undefined

2. Operators

 Arithmetic Operators: +, -, *, /, %
let sum = 10 + 5; // 15
let product = 10 * 5; // 50

 Comparison Operators: ==, ===, !=, !==, <, >, <=, >=

let isEqual = (5 == '5'); // true
let isStrictEqual = (5 === '5'); // false

 Logical Operators: &&, ||, !

let result = (true && false); // false
let isNotTrue = !true; // false

3. Control Structures

 Conditionals: if, else if, else, switch

if (age > 18) {
console.log("Adult");
} else {
console.log("Minor");
}

 Loops: for, while, do...while

for (let i = 0; i < 5; i++) {
console.log(i);
}

4. Functions

 Function Declaration

function greet(name) {
return "Hello, " + name;
}

 Function Expression
const greet = function(name) {
return "Hello, " + name;
};

 Arrow Functions

const greet = (name) => "Hello, " + name;

5. Objects and Arrays

 Objects: Collections of key-value pairs.

let person = {
firstName: "John",
lastName: "Doe",
age: 25
};
console.log(person.firstName); // John

 Arrays: Ordered lists of values.

let numbers = [1, 2, 3, 4, 5];
console.log(numbers[0]); // 1

6. DOM Manipulation

 JavaScript can interact with and manipulate the HTML DOM (Document Object Model).

document.getElementById("myElement").innerHTML = "New Content";

7. Events

 JavaScript can respond to user actions such as clicks, keypresses, etc.

document.getElementById("myButton").addEventListener("click", function() {
alert("Button clicked!");
});
8. Asynchronous JavaScript

 Callbacks

function fetchData(callback) {
setTimeout(() => {
callback("Data received");
}, 2000);
}
fetchData((data) => {
console.log(data);
});

 Promises

let promise = new Promise((resolve, reject) => {
let success = true;
if (success) {
resolve("Promise resolved");
} else {
reject("Promise rejected");
}
});
promise.then((message) => {
console.log(message);
}).catch((message) => {
console.log(message);
});

 Async/Await

async function fetchData() {
let response = await fetch('https://api.example.com/data');
let data = await response.json();
console.log(data);
}
fetchData();

9. ES6 and Beyond

 New features introduced in ECMAScript 2015 (ES6) and later versions.


o let and const for block-scoped variables.
o Arrow functions.
o Template literals.
o Destructuring assignment.
o Default parameters.
o Classes.
o Modules (import/export).
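
A short sketch illustrating a few of these features together (names are illustrative):

// Arrow function, default parameter, and template literal
const greet = (name = 'World') => `Hello, ${name}!`;

// Destructuring assignment
const person = { firstName: 'John', lastName: 'Doe', age: 25 };
const { firstName, age } = person;
console.log(greet(firstName), age);

// Class syntax
class Student {
  constructor(name) {
    this.name = name;
  }
  describe() {
    return `${this.name} is a student`;
  }
}
console.log(new Student('Jane').describe());

// import/export (modules) requires an ES module context,
// e.g. a .mjs file or "type": "module" in package.json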

Understanding Node.js

Node.js is a runtime environment that allows you to run JavaScript code on the server side. It uses
the V8 JavaScript engine (the same one used by Google Chrome) to execute code, and it provides a
non-blocking, event-driven architecture for building scalable network applications.

Installing Node.js

1. Download Node.js:
o Visit the official Node.js website.
o Download the installer for your operating system (Windows, macOS, Linux).
2. Install Node.js:
o Run the installer and follow the instructions.
o Verify the installation by opening a terminal or command prompt and typing:

node -v
npm -v

This should display the installed versions of Node.js and npm (Node Package
Manager).

Working with Node Packages

Node.js uses npm to manage packages (libraries and tools). With npm, you can install, update, and
remove packages easily.

1. Initialize a Project:
o Create a new directory for your project and navigate into it.

mkdir my-node-app
cd my-node-app

o Initialize a new Node.js project:

npm init -y

This creates a package.json file in your project directory, which keeps track of your
project's dependencies and configuration.
2. Installing Packages:
o Install a package (e.g., Express for web server functionality):

npm install express

o Installed packages are listed in the package.json file under "dependencies".


3. Using Packages:
o Require the installed package in your JavaScript files:

const express = require('express');

Creating a Node.js Application

1. Set Up the Application:


o Create an app.js file in your project directory.
o Require the necessary modules:

const express = require('express');
const app = express();

2. Define Routes:
o Set up a simple route to handle requests:

app.get('/', (req, res) => {
res.send('Hello, World!');
});

3. Start the Server:


o Listen for incoming requests:

const PORT = 3000;
app.listen(PORT, () => {
console.log(`Server is running on port ${PORT}`);
});

4. Run the Application:


o Start the server by running the app.js file:

node app.js

o Open a web browser and navigate to http://localhost:3000 to see the "Hello, World!"
message.

Understanding the Node.js Event Model

Node.js uses an event-driven model, where events trigger callback functions. This model is
implemented using an event loop, which continuously checks for new events and executes their
corresponding callbacks.
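
A small sketch showing that queued callbacks run only after the current synchronous code finishes (the relative order of setTimeout(0) and setImmediate can vary when run from the main module):

console.log('start');

setTimeout(() => console.log('setTimeout callback'), 0);   // added to the timer queue
setImmediate(() => console.log('setImmediate callback'));  // added to the check queue
process.nextTick(() => console.log('nextTick callback'));  // runs before the event loop continues

console.log('end');
// Typical output: start, end, nextTick callback, then the timer/immediate callbacks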

Adding Work to the Event Queue

You can use functions like setTimeout, setInterval, and process.nextTick to add tasks to the event
queue.

 setTimeout:

setTimeout(() => {
console.log('This will run after 1 second');
}, 1000);

 setInterval:

setInterval(() => {
console.log('This will run every 2 seconds');
}, 2000);

 process.nextTick:

process.nextTick(() => {
console.log('This will run at the end of the current operation');
});

Implementing Callbacks

Callbacks are functions passed as arguments to other functions and are invoked after the completion
of certain tasks.

 Example with a Timer:

function greet(name, callback) {
console.log(`Hello, ${name}!`);
callback();
}

function afterGreet() {
console.log('This function runs after the greet function');
}

greet('John', afterGreet);

 Example with Asynchronous Code:

const fs = require('fs');

fs.readFile('example.txt', 'utf8', (err, data) => {


if (err) {
console.error(err);
return;
}
console.log(data);
});

In this example, fs.readFile reads a file asynchronously, and the callback function is executed after
the file reading is complete.
UNIT-II

Node.js
Working with JSON, Using the Buffer Module to Buffer Data, Using the Stream Module
to Stream Data, Accessing the File System from Node.js- Opening, Closing, Writing,
Reading Files and other File System Tasks. Implementing HTTP Services in Node.js-
Processing URLs, Processing Query Strings and Form Parameters, Understanding
Request, Response, and Server Objects, Implementing HTTP Clients and Servers in
Node.js, Implementing HTTPS Servers and Clients. Using Additional Node.js Modules-
Using the OS Module, Using the Util Module, Using the DNS Module, Using the crypto
Module.

NODEJS WORKING WITH JSON

In Node.js, working with JSON (JavaScript Object Notation) is quite simple because JSON is
natively supported. Here are a few basic operations you can perform when working with JSON data
in Node.js:

1. Parsing JSON from a String

You can use JSON.parse() to convert a JSON string into a JavaScript object.

const jsonString = '{"name": "Sridhar", "age": 30, "city": "Warangal"}';


const obj = JSON.parse(jsonString);
console.log(obj.name);
console.log(obj.age); // 30

2. Converting JavaScript Object to JSON

To convert a JavaScript object into a JSON string, use JSON.stringify().

const obj = { name: "Nagarajy", age: 25, city: "London" };


const jsonString = JSON.stringify(obj);
console.log(jsonString); // {"name":"Nagaraju","age":25,"city":"London"}

3. Working with JSON Files

Node.js provides a built-in module called fs (File System) to read from and write to files. You can
use this to read and write JSON files.

Reading a JSON file

const fs = require('fs');

// Read the JSON file asynchronously


fs.readFile('data.json', 'utf8', (err, data) => {
if (err) {
console.log('Error reading file:', err);
return;
}
const jsonData = JSON.parse(data);
console.log(jsonData);
});
Writing to a JSON file
const fs = require('fs');
const data = {
name: "Bob",
age: 40,
city: "Warangal"
};
// Convert JavaScript object to JSON string
const jsonData = JSON.stringify(data);

// Write JSON data to file


fs.writeFile('output.json', jsonData, (err) => {
if (err) {
console.log('Error writing file:', err);
} else {
console.log('File has been saved!');
}
});

4. Example: Using JSON with an Express Server

If you’re using Express.js, you can easily handle JSON data in request and response objects.

Handling JSON request body

Ensure that the Express app is set up to handle JSON requests.

const express = require('express');


const app = express();

// Middleware to parse JSON request body


app.use(express.json());
app.post('/user', (req, res) => {
const user = req.body; // Assuming request body is JSON
console.log(user);
res.send({ message: 'User received', user });
});
app.listen(3000, () => {
console.log('Server is running on port 3000');
});

In this case, when you send a POST request with JSON data to /user, it will be parsed and logged.

5. Error Handling with JSON


Sometimes, when working with JSON, parsing errors can occur, especially if the JSON data is
malformed. Here's how you can handle such errors:

const jsonString = '{"name": "John", "age": 30'; // Incorrect JSON


try {
const obj = JSON.parse(jsonString);
console.log(obj);
} catch (error) {
console.log('Error parsing JSON:', error.message);
}

This will catch and log the error if the JSON is malformed.

USING BUFFER MODULE TO BUFFER DATA

In Node.js, the Buffer module is used to handle raw binary data directly. Buffers are especially
useful when working with binary streams, files, or any other kind of raw data that doesn’t naturally
fit into JavaScript's typical string encoding.

1. Creating Buffers

You can create a buffer in a few ways, such as from a string, an array, or allocating a new buffer
with a specific size.

Example 1: Creating a buffer from a string

const buffer = Buffer.from('Hello, world!', 'utf-8');


console.log(buffer); // <Buffer 48 65 6c 6c 6f 2c 20 77 6f 72 6c 64 21>

In this example, the string 'Hello, world!' is converted to a buffer using the UTF-8 encoding.

Example 2: Creating a buffer from an array

const buffer = Buffer.from([1, 2, 3, 4, 5]);


console.log(buffer); // <Buffer 01 02 03 04 05>

This creates a buffer from an array of numbers, where each number represents a byte in the buffer.

Example 3: Allocating a buffer of a specific size

const buffer = Buffer.alloc(10); // Allocates a buffer of 10 bytes


console.log(buffer); // <Buffer 00 00 00 00 00 00 00 00 00 00>

This creates a buffer of 10 bytes, initialized with 0x00.

2. Reading Data from a Buffer

You can access the individual bytes of a buffer using its indices, just like an array.

Example:
const buffer = Buffer.from('Hello, world!');
console.log(buffer[0]); // 72 (ASCII code for 'H')
console.log(buffer[1]); // 101 (ASCII code for 'e')
3. Writing Data to a Buffer

You can modify data inside the buffer, as buffers are mutable.

Example:
const buffer = Buffer.alloc(10);

buffer[0] = 72; // ASCII for 'H'


buffer[1] = 101; // ASCII for 'e'
buffer[2] = 108; // ASCII for 'l'
buffer[3] = 108; // ASCII for 'l'
buffer[4] = 111; // ASCII for 'o'
console.log(buffer.toString('utf8', 0, 5)); // "Hello" (read only the 5 bytes that were written)

4. Buffer Slicing

Buffers can be sliced into smaller buffers without copying the data.

Example:

const buffer = Buffer.from('Hello, world!');


const slicedBuffer = buffer.slice(0, 5);
console.log(slicedBuffer.toString()); // "Hello"

5. Concatenating Buffers

You can concatenate multiple buffers into a single buffer using Buffer.concat().

Example:

const buffer1 = Buffer.from('Hello');


const buffer2 = Buffer.from(' World');
const buffer3 = Buffer.concat([buffer1, buffer2]);

console.log(buffer3.toString()); // "Hello World"

6. Buffer Methods

Here are a few commonly used buffer methods:

 buffer.toString([encoding], [start], [end]): Converts the buffer to a string (default encoding is


UTF-8).
 buffer.copy(targetBuffer, targetStart, sourceStart, sourceEnd): Copies data from one buffer to
another.
 buffer.fill(value, [start], [end]): Fills the buffer with a specific value.
 buffer.includes(value, [byteOffset], [encoding]): Checks if a buffer contains a specific value.

Example of copy():
const buffer1 = Buffer.from('Hello, ');
const buffer2 = Buffer.from('world!');
const combined = Buffer.alloc(buffer1.length + buffer2.length);
buffer1.copy(combined, 0); // Copy "Hello, " to the start of combined
buffer2.copy(combined, buffer1.length); // Copy "world!" immediately after it

console.log(combined.toString()); // "Hello, world!"

7. Working with Binary Data (Examples)

Buffers are commonly used in scenarios involving file I/O, streams, or binary protocols. For
example, when reading files in binary mode, fs.readFile() returns a buffer.

Example of reading a file as a buffer:

const fs = require('fs');

fs.readFile('example.txt', (err, data) => {


if (err) throw err;
console.log(data); // This is a buffer with raw binary data of the file
});
USING STREAM MODULE TO STREAM DATA

The stream module in Node.js provides a powerful and flexible way to handle streaming data.
Streams allow you to process data piece-by-piece, which is particularly useful when working with
large amounts of data that may not fit into memory all at once, such as reading and writing files or
handling HTTP requests and responses.

Types of Streams in Node.js

1. Readable Streams: Streams that you can read from. Example: fs.createReadStream(), HTTP
requests.
2. Writable Streams: Streams that you can write to. Example: fs.createWriteStream(), HTTP
responses.
3. Duplex Streams: Streams that are both readable and writable. Example: a network socket.
4. Transform Streams: A special type of duplex stream that can modify the data as it is being
read or written. Example: zlib.createGzip() for compression.

1. Reading Data with Readable Streams

Readable streams allow you to read data in chunks, which can be processed as the data is received
(avoiding the need to load the entire file into memory).

Example: Using fs.createReadStream()

const fs = require('fs');

// Create a readable stream


const readableStream = fs.createReadStream('example.txt', 'utf8');

// Read the data chunk by chunk


readableStream.on('data', (chunk) => {
console.log('Received chunk:', chunk);
});

// Handle stream end


readableStream.on('end', () => {
console.log('End of stream reached');
});

// Handle errors
readableStream.on('error', (err) => {
console.error('Error reading stream:', err);
});

2. Writing Data with Writable Streams

Writable streams allow you to write data to a destination. Example: writing to a file, sending HTTP
responses.

Example: Using fs.createWriteStream()

const fs = require('fs');
// Create a writable stream
const writableStream = fs.createWriteStream('output.txt');

// Write data to the stream


writableStream.write('Hello, world!\n');
writableStream.write('Writing more data...\n');

// End the stream


writableStream.end(() => {
console.log('Finished writing to file');
});

// Handle stream errors


writableStream.on('error', (err) => {
console.error('Error writing stream:', err);
});

3. Piping Streams Together

One of the most common use cases for streams is piping data from one stream to another. You can
use the .pipe() method to send data from a readable stream to a writable stream. This is particularly
useful for tasks like file copying, or reading from a network socket and sending the data to a file.

Example: Copying a file using streams

const fs = require('fs');

// Create a readable stream (from input file)


const readableStream = fs.createReadStream('input.txt');
// Create a writable stream (to output file)
const writableStream = fs.createWriteStream('output.txt');

// Pipe the readable stream to the writable stream


readableStream.pipe(writableStream);

writableStream.on('finish', () => {
console.log('File has been copied!');
});

4. Transform Streams

A transform stream is a type of duplex stream where data is transformed as it is read and written.
For example, you could use it to compress data, convert it to a different format, or filter content.

Example: Using zlib to compress a file

const fs = require('fs');
const zlib = require('zlib');

// Create a readable stream (from input file)


const readableStream = fs.createReadStream('input.txt');

// Create a writable stream (to output compressed file)


const writableStream = fs.createWriteStream('output.txt.gz');

// Create a transform stream to gzip the data


const gzipStream = zlib.createGzip();

// Pipe the data through the gzip stream


readableStream.pipe(gzipStream).pipe(writableStream);

writableStream.on('finish', () => {
console.log('File has been compressed!');
});

5. Handling HTTP Requests and Responses with Streams

Streams are often used in web servers, especially when handling large HTTP requests and
responses.

Example: Using Express.js to stream a file

const express = require('express');


const fs = require('fs');
const app = express();

// Serve a large file using streams


app.get('/download', (req, res) => {
const fileStream = fs.createReadStream('large-file.txt');
res.setHeader('Content-Type', 'text/plain');
res.setHeader('Content-Disposition', 'attachment; filename="large-file.txt"');
fileStream.pipe(res);
});

app.listen(3000, () => {
console.log('Server is running on port 3000');
});

6. Flow Control with Streams

Streams in Node.js also support flow control, meaning you can control when data is read from a
stream. The default behavior is flowing mode, where data is read automatically as it becomes
available.

You can switch to paused mode if you want to control when data is read.

Example: Paused mode

const fs = require('fs');
// Create a readable stream in paused mode
const readableStream = fs.createReadStream('example.txt', { highWaterMark: 16 });

// Manually read chunks when ready


readableStream.on('readable', () => {
let chunk;
while (null !== (chunk = readableStream.read())) {
console.log('Read chunk:', chunk.toString());
}
});

7. Error Handling

Proper error handling is crucial when working with streams. You should listen for the 'error' event
on both readable and writable streams.

Example: Error handling in streams

const fs = require('fs');

// Create a readable stream


const readableStream = fs.createReadStream('nonexistent-file.txt');

// Handle errors in the readable stream


readableStream.on('error', (err) => {
console.error('Stream error:', err.message);
});

Summary of Key Points


 Readable Streams: Used for reading data piece-by-piece (fs.createReadStream(), HTTP
requests).
 Writable Streams: Used for writing data (fs.createWriteStream(), HTTP responses).
 Duplex Streams: Both readable and writable (e.g., network sockets).
 Transform Streams: Modify data as it’s read and written (e.g., compression with
zlib.createGzip()).
 Piping: Use .pipe() to pipe data from a readable stream to a writable stream.
 Flow Control: You can manage data flow by listening for the 'readable' event and calling read() for finer control.

Streams are powerful for handling large or continuous data and are a fundamental part of Node.js
for I/O-bound tasks.
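
To round out the transform stream idea, here is a minimal sketch of a custom Transform that upper-cases text as it flows through (run it as, for example, echo "hello" | node uppercase.js; the file name is illustrative):

const { Transform } = require('stream');

// A transform stream that upper-cases each chunk passing through it
const upperCase = new Transform({
  transform(chunk, encoding, callback) {
    // First argument is an (optional) error, second is the transformed data
    callback(null, chunk.toString().toUpperCase());
  }
});

// Pipe stdin through the transform to stdout
process.stdin.pipe(upperCase).pipe(process.stdout);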

ACCESSING THE FILE SYSTEM FROM NODE.JS - OPENING, CLOSING, WRITING,


READING FILES AND OTHER FILE SYSTEM TASKS

Node.js provides a built-in fs (file system) module that allows you to perform various tasks related
to files and directories, such as opening, closing, reading, writing, and manipulating files. The fs
module offers both synchronous and asynchronous methods to handle these tasks, which makes it
highly flexible for different use cases.

1. Importing the fs Module

To interact with the file system, you first need to require the fs module:

const fs = require('fs');

2. Opening a File

Asynchronous File Opening:

The fs.open() method opens a file asynchronously, providing you with a file descriptor that you can
use for reading or writing.

const fs = require('fs');
fs.open('example.txt', 'r', (err, fd) => {
if (err) {
console.error('Error opening file:', err);
return;
}
console.log('File opened successfully with file descriptor:', fd);
});

In the above example:

 'r' specifies that the file will be opened for reading.


 The callback provides the file descriptor fd for further operations (like reading or writing).

Synchronous File Opening:

If you prefer to use the synchronous version, you can use fs.openSync():
const fs = require('fs');
try {
const fd = fs.openSync('example.txt', 'r');
console.log('File opened successfully with file descriptor:', fd);
} catch (err) {
console.error('Error opening file:', err);
}

3. Closing a File

Asynchronous File Closing:

You can close an opened file using fs.close(), providing the file descriptor that you got from
fs.open().

const fs = require('fs');
fs.open('example.txt', 'r', (err, fd) => {
if (err) {
console.error('Error opening file:', err);
return;
}

fs.close(fd, (err) => {


if (err) {
console.error('Error closing file:', err);
} else {
console.log('File closed successfully');
}
});
});

Synchronous File Closing:

For synchronous closing, use fs.closeSync():

const fs = require('fs');
try {
const fd = fs.openSync('example.txt', 'r');
fs.closeSync(fd);
console.log('File closed successfully');
} catch (err) {
console.error('Error closing file:', err);
}

4. Reading a File

Asynchronous File Reading:

The fs.read() method reads data from an opened file. You must first open the file using fs.open(),
and then you can read from it.
fs.open('example.txt', 'r', (err, fd) => {
if (err) {
console.error('Error opening file:', err);
return;
}

const buffer = Buffer.alloc(1024); // Allocate buffer of 1024 bytes


fs.read(fd, buffer, 0, buffer.length, 0, (err, bytesRead, buffer) => {
if (err) {
console.error('Error reading file:', err);
} else {
console.log('Bytes read:', bytesRead);
console.log('Content:', buffer.toString('utf8', 0, bytesRead));
}

fs.close(fd, (err) => {


if (err) console.error('Error closing file:', err);
});
});
});

Synchronous File Reading:

For synchronous reading, you can use fs.readFileSync():

try {
const data = fs.readFileSync('example.txt', 'utf8');
console.log('File content:', data);
} catch (err) {
console.error('Error reading file:', err);
}

5. Writing to a File

Asynchronous File Writing:

The fs.write() method allows you to write data to a file. You must open the file first.

fs.open('example.txt', 'w', (err, fd) => {


if (err) {
console.error('Error opening file:', err);
return;
}

const data = 'Hello, world!';


fs.write(fd, data, (err, written, string) => {
if (err) {
console.error('Error writing to file:', err);
} else {
console.log(`${written} bytes written to file`);
}

fs.close(fd, (err) => {


if (err) console.error('Error closing file:', err);
});
});
});

Synchronous File Writing:

To write to a file synchronously, use fs.writeFileSync():

try {
fs.writeFileSync('example.txt', 'Hello, world!');
console.log('Data written to file');
} catch (err) {
console.error('Error writing to file:', err);
}

6. Appending Data to a File

Asynchronous File Appending:

To append data to a file (instead of overwriting it), use fs.appendFile():

fs.appendFile('example.txt', 'Appended data.\n', (err) => {


if (err) {
console.error('Error appending data to file:', err);
} else {
console.log('Data appended successfully');
}
});

Synchronous File Appending:

To append data synchronously, use fs.appendFileSync():

try {
fs.appendFileSync('example.txt', 'Appended data.\n');
console.log('Data appended successfully');
} catch (err) {
console.error('Error appending data to file:', err);
}

7. Renaming a File

You can rename a file using fs.rename():


fs.rename('oldName.txt', 'newName.txt', (err) => {
if (err) {
console.error('Error renaming file:', err);
} else {
console.log('File renamed successfully');
}
});

8. Deleting a File

To delete a file, use fs.unlink():

fs.unlink('example.txt', (err) => {


if (err) {
console.error('Error deleting file:', err);
} else {
console.log('File deleted successfully');
}
});

9. Checking if a File Exists

To check if a file exists, use fs.existsSync() (synchronous):

if (fs.existsSync('example.txt')) {
console.log('File exists');
} else {
console.log('File does not exist');
}

For an asynchronous check, you can use fs.access():

fs.access('example.txt', fs.constants.F_OK, (err) => {


if (err) {
console.error('File does not exist');
} else {
console.log('File exists');
}
});

10. Reading Directory Contents

To list the contents of a directory, use fs.readdir():

fs.readdir('.', (err, files) => {


if (err) {
console.error('Error reading directory:', err);
} else {
console.log('Directory contents:', files);
}
});

11. Creating and Removing Directories

 Creating a directory: fs.mkdir() (asynchronous) or fs.mkdirSync() (synchronous):

fs.mkdir('newDir', (err) => {


if (err) {
console.error('Error creating directory:', err);
} else {
console.log('Directory created successfully');
}
});

 Removing a directory: fs.rmdir() (asynchronous) or fs.rmdirSync() (synchronous):

fs.rmdir('newDir', (err) => {


if (err) {
console.error('Error removing directory:', err);
} else {
console.log('Directory removed successfully');
}
});

Summary

The fs module provides both asynchronous and synchronous methods for performing file system
operations. It is highly versatile, allowing you to:

 Open, close, read, and write files


 Append data to files
 Rename and delete files
 Create and remove directories
 Check file existence and read directory contents

For most I/O tasks, the asynchronous methods are preferred, as they avoid blocking the event loop,
making your application more efficient and responsive.
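
In recent Node.js versions the fs module also exposes a promise-based API (fs/promises), which pairs naturally with async/await and avoids nested callbacks. A minimal sketch:

const fsp = require('fs/promises');

async function main() {
  await fsp.writeFile('example.txt', 'Hello, world!\n');   // create/overwrite the file
  await fsp.appendFile('example.txt', 'Appended data.\n'); // append to it
  const content = await fsp.readFile('example.txt', 'utf8');
  console.log(content);
}

main().catch((err) => console.error('File operation failed:', err));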

IMPLEMENTING HTTP SERVICES IN NODE.JS-PROCESSING URLS

In Node.js, you can create HTTP services and handle requests using the built-in http module. The
http module allows you to process URLs, handle HTTP methods (GET, POST, PUT, DELETE,
etc.), and send responses to the client.

Setting Up a Basic HTTP Server

To start, let's create a basic HTTP server using Node.js:

const http = require('http');


// Create an HTTP server
const server = http.createServer((req, res) => {
res.statusCode = 200;
res.setHeader('Content-Type', 'text/plain');
res.end('Hello, World!\n');
});

// Listen on port 3000


server.listen(3000, () => {
console.log('Server is running at http://localhost:3000/');
});

This basic server sends a "Hello, World!" message when you visit http://localhost:3000/ in your
browser.

1. Processing URLs in HTTP Requests

When a client makes an HTTP request, it includes a URL that contains important information such
as the path, query parameters, and sometimes fragments. In Node.js, we can process these
components and use them to perform different actions.

Example: Extracting Path and Query Parameters

To handle URLs and extract query parameters, we can use the url module, which provides utility
methods to parse and format URLs.

Here’s an example of how you can process URLs in an HTTP server:

const http = require('http');


const url = require('url');

// Create an HTTP server


const server = http.createServer((req, res) => {
// Parse the incoming request URL
const parsedUrl = url.parse(req.url, true); // 'true' to parse query parameters

const pathname = parsedUrl.pathname;


const query = parsedUrl.query; // Contains query parameters

// Log the path and query parameters to the console


console.log(`Path: ${pathname}`);
console.log(`Query Parameters:`, query);

// Send a response back


res.statusCode = 200;
res.setHeader('Content-Type', 'application/json');

// Example response with query parameters


res.end(JSON.stringify({
message: 'Request received',
path: pathname,
query: query,
}));
});

// Listen on port 3000


server.listen(3000, () => {
console.log('Server is running at http://localhost:3000/');
});

Example request:

 Visit http://localhost:3000/products?category=electronics&price=1000

The server will log the following:

Path: /products
Query Parameters: { category: 'electronics', price: '1000' }

And respond with:

{
"message": "Request received",
"path": "/products",
"query": {
"category": "electronics",
"price": "1000"
}
}

2. Handling Different HTTP Methods

The req (request) object contains the HTTP method (GET, POST, PUT, DELETE, etc.) that was
used in the request. We can check the HTTP method and route the request accordingly.

Example: Handling Different Methods

const http = require('http');


const url = require('url');

// Create an HTTP server


const server = http.createServer((req, res) => {
const parsedUrl = url.parse(req.url, true);
const pathname = parsedUrl.pathname;

// Handle different HTTP methods


if (req.method === 'GET') {
if (pathname === '/hello') {
res.statusCode = 200;
res.setHeader('Content-Type', 'text/plain');
res.end('Hello, GET request received!');
} else {
res.statusCode = 404;
res.end('Not Found');
}
} else if (req.method === 'POST' && pathname === '/hello') {
let body = '';
req.on('data', chunk => {
body += chunk; // Collect the data chunks
});
req.on('end', () => {
res.statusCode = 200;
res.setHeader('Content-Type', 'text/plain');
res.end(`Received POST data: ${body}`);
});
} else {
res.statusCode = 405; // Method Not Allowed
res.end('Method Not Allowed');
}
});

// Listen on port 3000


server.listen(3000, () => {
console.log('Server is running at http://localhost:3000/');
});

Example requests:

 GET request to http://localhost:3000/hello:


o Response: Hello, GET request received!
 POST request to http://localhost:3000/hello with body name=John:
o Response: Received POST data: name=John

3. Dynamic Routing with Parameters

You can use dynamic route matching by processing parts of the URL path. You can extract variables
from the path using regular expressions or by splitting the URL path.

Example: Handling Dynamic Routes

const http = require('http');


const url = require('url');

// Create an HTTP server


const server = http.createServer((req, res) => {
const parsedUrl = url.parse(req.url, true);
const pathname = parsedUrl.pathname;

// Extract dynamic parameters from the URL path


const parts = pathname.split('/').filter(Boolean); // Split and remove empty strings

if (parts[0] === 'user' && parts[1]) {


const userId = parts[1]; // Extract user ID from the URL path
res.statusCode = 200;
res.setHeader('Content-Type', 'application/json');
res.end(JSON.stringify({
message: `User details for user ID ${userId}`,
userId: userId,
}));
} else {
res.statusCode = 404;
res.end('Not Found');
}
});

// Listen on port 3000


server.listen(3000, () => {
console.log('Server is running at http://localhost:3000/');
});

Example request:

 Visit http://localhost:3000/user/12345

The server will respond with:

{
"message": "User details for user ID 12345",
"userId": "12345"
}

4. Query Parameters in Dynamic Routes

You can combine both dynamic path parameters and query parameters to create more sophisticated
URLs.

Example: Using Query Parameters with Dynamic Paths

const http = require('http');


const url = require('url');

// Create an HTTP server


const server = http.createServer((req, res) => {
const parsedUrl = url.parse(req.url, true); // Parse query parameters
const pathname = parsedUrl.pathname;

if (pathname === '/products') {


const category = parsedUrl.query.category || 'all';
const price = parsedUrl.query.price || 'any';
res.statusCode = 200;
res.setHeader('Content-Type', 'application/json');
res.end(JSON.stringify({
message: `Displaying products in category ${category} with price ${price}`,
category: category,
price: price,
}));
} else {
res.statusCode = 404;
res.end('Not Found');
}
});

// Listen on port 3000


server.listen(3000, () => {
console.log('Server is running at http://localhost:3000/');
});

Example request:

 Visit http://localhost:3000/products?category=electronics&price=1000

The server will respond with:

{
"message": "Displaying products in category electronics with price 1000",
"category": "electronics",
"price": "1000"
}

5. Handling 404 and Default Routes

To handle requests for routes that are not defined, you can implement a default route that returns a
404 response.

const http = require('http');

// Create an HTTP server


const server = http.createServer((req, res) => {
if (req.url === '/hello') {
res.statusCode = 200;
res.setHeader('Content-Type', 'text/plain');
res.end('Hello, World!');
} else {
// Handle unknown routes with 404
res.statusCode = 404;
res.setHeader('Content-Type', 'text/plain');
res.end('404 Not Found');
}
});

// Listen on port 3000


server.listen(3000, () => {
console.log('Server is running at http://localhost:3000/');
});

Conclusion

In Node.js, handling HTTP requests and processing URLs is a core part of building HTTP services.
Using the built-in http and url modules, you can:

 Handle different HTTP methods (GET, POST, etc.)


 Extract and process path and query parameters from the URL
 Implement dynamic routing
 Handle 404 errors and default routes

These techniques enable you to build flexible and efficient web services and APIs in Node.js.

PROCESSING QUERY STRINGS AND FORM PARAMETERS

In Node.js, processing query strings (from URLs) and form parameters (from POST requests) is a
common task, especially when building web applications or APIs. We can handle both of these types
of data with the help of built-in modules like url, querystring, and handling request bodies directly.

Let’s dive into how we can process query strings (GET requests) and form parameters (POST
requests) in Node.js.

1. Processing Query Strings (GET Requests)

Query strings are the part of the URL that comes after the ? character and are used to send data to
the server. They are typically used in GET requests.

Example URL:

http://localhost:3000/search?category=electronics&price=1000

In the above URL:

 category=electronics
 price=1000

These are query parameters that we can extract from the request.

Processing Query Strings with url.parse()

Node.js provides the url module to parse the request URL and extract query parameters.
Here’s how we can extract query parameters:

const http = require('http');


const url = require('url');

// Create HTTP server


const server = http.createServer((req, res) => {
// Parse the incoming request URL
const parsedUrl = url.parse(req.url, true); // 'true' to parse query strings
const query = parsedUrl.query; // Contains query parameters

// Access individual query parameters


const category = query.category || 'all';
const price = query.price || 'any';

// Respond with the query parameters


res.statusCode = 200;
res.setHeader('Content-Type', 'application/json');
res.end(JSON.stringify({
message: 'Search results',
category: category,
price: price,
}));
});

// Listen on port 3000


server.listen(3000, () => {
console.log('Server running at http://localhost:3000/');
});

Example Request:

 URL: http://localhost:3000/search?category=electronics&price=1000

Response:

{
"message": "Search results",
"category": "electronics",
"price": "1000"
}

Query Strings with Default Values

In the example above, the code checks if the query parameters are provided. If they are missing, it
uses default values ('all' for category and 'any' for price).
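As an alternative to the legacy url.parse() API, recent Node.js versions expose the WHATWG URL class globally. The sketch below shows the same query-string handling with URL and searchParams; it assumes a Node.js version (10 or later) where URL is available as a global:

const http = require('http');

const server = http.createServer((req, res) => {
  // req.url contains only the path, so a base URL is needed to build a WHATWG URL object
  const parsedUrl = new URL(req.url, `http://${req.headers.host}`);

  // searchParams.get() returns null when a parameter is missing
  const category = parsedUrl.searchParams.get('category') || 'all';
  const price = parsedUrl.searchParams.get('price') || 'any';

  res.statusCode = 200;
  res.setHeader('Content-Type', 'application/json');
  res.end(JSON.stringify({ message: 'Search results', category: category, price: price }));
});

server.listen(3000, () => {
  console.log('Server running at http://localhost:3000/');
});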

2. Processing Form Parameters (POST Requests)


Form parameters are typically sent in POST requests, and they contain data submitted from forms
(e.g., login forms, registration forms). When dealing with form parameters, the data is usually sent
in the request body.

Using the querystring Module for Form Parameters

Node.js does not parse the body of a POST request by default. To process form parameters, you
need to read the body and parse it.

You can use the querystring module (or URLSearchParams in newer Node.js versions) to parse form
data, but we’ll first need to collect the data from the request body.

Here’s how you can handle form parameters in a POST request:

const http = require('http');


const querystring = require('querystring');

// Create an HTTP server


const server = http.createServer((req, res) => {
// Handle only POST requests to /submit
if (req.method === 'POST' && req.url === '/submit') {
let body = '';

// Collect the data in chunks as they come in


req.on('data', chunk => {
body += chunk;
});

// Once all the data is received


req.on('end', () => {
// Parse the form data (assuming it's URL-encoded)
const parsedData = querystring.parse(body);

// Access individual form fields


const username = parsedData.username || 'Guest';
const password = parsedData.password || 'No password provided';

// Respond with the form data


res.statusCode = 200;
res.setHeader('Content-Type', 'application/json');
res.end(JSON.stringify({
message: 'Form submitted successfully',
username: username,
password: password,
}));
});
} else {
// Handle unsupported requests
res.statusCode = 404;
res.end('Not Found');
}
});

// Listen on port 3000


server.listen(3000, () => {
console.log('Server running at http://localhost:3000/');
});

Example Request (POST):

 URL: http://localhost:3000/submit
 Form Data (submitted via a form):
o username=alice
o password=secret123

To simulate sending a POST request from a form, you can use curl or an HTML form.

Example with curl:

curl -X POST http://localhost:3000/submit -d "username=alice&password=secret123"

Response:

{
"message": "Form submitted successfully",
"username": "alice",
"password": "secret123"
}
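If you want to test the same endpoint from Node.js instead of curl, a small client sketch using the http and querystring modules can submit the URL-encoded form data (the field values below are only illustrative):

const http = require('http');
const querystring = require('querystring');

// URL-encode the form fields
const formData = querystring.stringify({ username: 'alice', password: 'secret123' });

const options = {
  hostname: 'localhost',
  port: 3000,
  path: '/submit',
  method: 'POST',
  headers: {
    'Content-Type': 'application/x-www-form-urlencoded',
    'Content-Length': Buffer.byteLength(formData),
  },
};

const req = http.request(options, (res) => {
  let body = '';
  res.on('data', (chunk) => { body += chunk; });
  res.on('end', () => console.log('Server replied:', body));
});

req.on('error', (err) => console.error('Request failed:', err.message));
req.write(formData);
req.end();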

3. Handling JSON Data in POST Requests

In modern web applications, it's common to send JSON data in POST requests, especially when
interacting with APIs. To handle JSON payloads, you need to parse the request body as JSON.

Example: Processing JSON Data (POST)

const http = require('http');

// Create an HTTP server


const server = http.createServer((req, res) => {
// Handle only POST requests to /json
if (req.method === 'POST' && req.url === '/json') {
let body = '';
// Collect data chunks as they come in
req.on('data', chunk => {
body += chunk;
});

// Once all data is collected, parse the JSON


req.on('end', () => {
try {
const parsedData = JSON.parse(body); // Parse JSON data

// Access individual fields from the JSON object


const name = parsedData.name || 'Anonymous';
const age = parsedData.age || 'Not provided';

// Respond with the JSON data


res.statusCode = 200;
res.setHeader('Content-Type', 'application/json');
res.end(JSON.stringify({
message: 'JSON data received',
name: name,
age: age,
}));
} catch (err) {
res.statusCode = 400; // Bad Request
res.end('Invalid JSON');
}
});
} else {
// Handle unsupported requests
res.statusCode = 404;
res.end('Not Found');
}
});

// Listen on port 3000


server.listen(3000, () => {
console.log('Server running at http://localhost:3000/');
});

Example Request (POST with JSON):

To send JSON data via curl, use the following command:

curl -X POST http://localhost:3000/json -H "Content-Type: application/json" -d '{"name":"John","age":30}'

Response:
{
"message": "JSON data received",
"name": "John",
"age": 30
}

4. URL Encoding and Decoding

Sometimes, form data or query parameters can contain special characters that need to be encoded or
decoded properly. This can be done using encodeURIComponent and decodeURIComponent.

 Encoding: encodeURIComponent('Hello World!') results in "Hello%20World!" (the space becomes %20; characters such as & or ? would also be percent-encoded)
 Decoding: decodeURIComponent('Hello%20World!') results in "Hello World!"

You can use these functions when constructing or processing URLs to handle special characters
properly.
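A short sketch of how these functions are typically used when building and reading query strings (the values are only illustrative):

// Building a query string safely with encodeURIComponent
const searchTerm = 'laptops & tablets';
const queryString = 'q=' + encodeURIComponent(searchTerm) + '&price=' + encodeURIComponent('1000');

console.log(queryString); // q=laptops%20%26%20tablets&price=1000

// Decoding a value back on the receiving side
console.log(decodeURIComponent('laptops%20%26%20tablets')); // laptops & tablets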

Conclusion

In Node.js, you can process query strings (GET request parameters) and form parameters (POST
request body) using built-in modules like url, querystring, and http. Here are the main techniques we
covered:

 Query Strings (GET Requests): Extracting data from the URL using url.parse().
 Form Parameters (POST Requests): Parsing form data from the request body using
querystring.parse() for URL-encoded data.
 JSON Data (POST Requests): Handling JSON payloads by parsing the body as JSON with
JSON.parse().
 URL Encoding/Decoding: Using encodeURIComponent and decodeURIComponent to
handle special characters.

These are common tasks when building web services and APIs in Node.js, and these approaches can
be easily extended to handle more complex use cases.

UNDERSTANDING REQUEST, RESPONSE AND SERVER OBJECTS

In Node.js, the http module provides the core functionality to create HTTP servers and process
HTTP requests and responses. The http module's server is built around the request (req) and
response (res) objects, which play crucial roles in handling incoming requests and sending out
responses.

1. The request Object (req)

The request object represents the incoming HTTP request made by a client (e.g., a browser, mobile
app, or any HTTP client). It contains important information about the request such as headers, URL,
HTTP method (GET, POST, etc.), query parameters, request body, and more.
Key Properties of the req Object:

 req.method: The HTTP method used for the request (e.g., 'GET', 'POST', 'PUT', 'DELETE').
 req.url: The full URL of the request, including the path and query string (e.g., /home?
search=query).
 req.headers: An object containing the request headers, such as User-Agent, Content-Type,
Authorization, etc.
 req.body: The body of the request, typically sent with POST and PUT requests. The core http module does not populate this property automatically; the body must be read from the request stream and parsed (e.g., JSON or URL-encoded data), or a framework such as Express with body-parsing middleware must be used.
 req.query: The query string parameters from the URL (e.g., ?name=John&age=30). With the core http module these are obtained by parsing req.url (for example with url.parse()); frameworks like Express populate req.query automatically.
 req.params: In frameworks with route patterns such as /user/:id, req.params stores the dynamic path parameters (req.params.id holds the value of id). The core http module has no req.params; path segments must be extracted manually.

Example: Inspecting the Request Object

const http = require('http');

const server = http.createServer((req, res) => {


// Log request method, URL, and headers
console.log(`Method: ${req.method}`);
console.log(`URL: ${req.url}`);
console.log(`Headers:`, req.headers);

res.statusCode = 200;
res.setHeader('Content-Type', 'text/plain');
res.end('Request details logged.');
});

server.listen(3000, () => {
console.log('Server is running at http://localhost:3000/');
});

2. The response Object (res)

The response object represents the HTTP response that the server sends back to the client. It
contains methods and properties for setting the status code, headers, and body of the response.

Key Methods of the res Object:

 res.statusCode: Set the HTTP status code for the response (e.g., 200 for success, 404 for not
found).
 res.setHeader(name, value): Set a specific HTTP header in the response. For example, you
can use res.setHeader('Content-Type', 'application/json') to set the response type to JSON.
 res.end([data]): End the response and optionally send data (e.g., a string or a buffer). If no
data is provided, the response is ended with no body.
 res.write(data): Used to send data in chunks (e.g., for streaming large files). You can call
res.write() multiple times, and finally call res.end() to finish the response.
 res.json(obj) (in frameworks like Express): Sends a JSON response to the client. This is not
available in the core http module but can be implemented by calling
res.end(JSON.stringify(obj)).

Example: Responding with Data

const http = require('http');

const server = http.createServer((req, res) => {


// Set status code and headers
res.statusCode = 200;
res.setHeader('Content-Type', 'application/json');

// Send a JSON response


const responseData = {
message: 'Request successful',
method: req.method,
url: req.url,
};

res.end(JSON.stringify(responseData)); // Send the response body


});

server.listen(3000, () => {
console.log('Server is running at http://localhost:3000/');
});
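The res.write() method listed above can also be used to send the body in pieces before res.end() completes the response. A minimal sketch:

const http = require('http');

const server = http.createServer((req, res) => {
  res.statusCode = 200;
  res.setHeader('Content-Type', 'text/plain');

  // Send the body in chunks; res.end() sends the last chunk and closes the response
  res.write('First chunk of the response\n');
  res.write('Second chunk of the response\n');
  res.end('Final chunk - response complete\n');
});

server.listen(3000, () => {
  console.log('Server is running at http://localhost:3000/');
});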

3. The server Object

The server object represents the HTTP server itself. It's an instance of the http.Server class, which is
created using http.createServer().

Creating and Starting the Server

const http = require('http');

// Create an HTTP server


const server = http.createServer((req, res) => {
// Handle requests and send responses here
res.statusCode = 200;
res.setHeader('Content-Type', 'text/plain');
res.end('Hello, World!');
});

// Start the server and listen on a port


server.listen(3000, () => {
console.log('Server is running at http://localhost:3000/');
});
The server.listen() method binds the server to a specific port and makes it listen for incoming
requests.

 server.listen(port, hostname, callback): This method listens on the specified port (e.g.,
3000) and hostname (optional, usually 'localhost'). The callback function is executed once
the server starts.
 server.close(): This method is used to stop the server from accepting new connections.

Example: Creating a Simple Server

const http = require('http');

const server = http.createServer((req, res) => {


res.statusCode = 200;
res.setHeader('Content-Type', 'text/plain');
res.end('Server is working!');
});

server.listen(3000, () => {
console.log('Server is running at http://localhost:3000/');
});
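The server.close() method mentioned above stops the server from accepting new connections. A minimal sketch (the 10-second timer is only for demonstration):

const http = require('http');

const server = http.createServer((req, res) => {
  res.statusCode = 200;
  res.setHeader('Content-Type', 'text/plain');
  res.end('Server is working!');
});

server.listen(3000, () => {
  console.log('Server is running at http://localhost:3000/');
});

// Stop accepting new connections after 10 seconds
setTimeout(() => {
  server.close(() => {
    console.log('Server has been closed.');
  });
}, 10000);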

4. Workflow with req, res, and server Objects

When a request is made to your server, it follows a flow like this:

1. Server receives the request: The server created with http.createServer() listens for
incoming requests. When a request comes in, the callback function you provided is invoked,
and the req (request) and res (response) objects are passed to it.
2. Request processing: Inside the callback function, you inspect the req object to process the
request, including the HTTP method, headers, URL, and body. You can also access
parameters from the query string or URL path (e.g., req.query, req.params).
3. Response generation: After processing the request, you use the res object to build the
response. You can set status codes (e.g., res.statusCode = 200), set headers (e.g.,
res.setHeader('Content-Type', 'application/json')), and send the response body (e.g.,
res.end('Hello, World!')).
4. Server sends the response: Once the response is fully constructed, the res.end() method
sends the response back to the client.

Example with Full Flow

const http = require('http');


const url = require('url');

const server = http.createServer((req, res) => {


const parsedUrl = url.parse(req.url, true); // Parse URL and query parameters
const pathname = parsedUrl.pathname;
if (req.method === 'GET' && pathname === '/greet') {
const name = parsedUrl.query.name || 'Guest'; // Get the query parameter 'name'

res.statusCode = 200;
res.setHeader('Content-Type', 'text/plain');
res.end(`Hello, ${name}!`); // Respond with a greeting message
} else {
res.statusCode = 404;
res.setHeader('Content-Type', 'text/plain');
res.end('Not Found'); // Handle unknown routes
}
});

server.listen(3000, () => {
console.log('Server running at http://localhost:3000/');
});

Example Request:

 URL: http://localhost:3000/greet?name=John

Response:

Hello, John!

5. Summary of Key Objects

 req (Request Object): Represents the incoming HTTP request. It contains details about the
request, such as the URL, HTTP method, headers, query parameters, and body data.
 res (Response Object): Represents the outgoing HTTP response. You use it to set the status
code, headers, and body of the response.
 server (Server Object): Represents the HTTP server that listens for incoming requests and
manages responses. It's created using http.createServer() and listens on a specified port.

Together, these objects form the core functionality for handling HTTP requests and responses in
Node.js.

IMPLEMENTING HTTP CLIENTS AND SERVERS IN NODE.JS

In Node.js, you can implement both HTTP servers (to handle incoming requests) and HTTP
clients (to make requests to other servers). Node.js provides built-in modules such as http and https
to implement both.

1. Implementing an HTTP Server in Node.js

An HTTP server listens for incoming HTTP requests from clients and sends responses. The http
module allows you to create a simple server by using the http.createServer() method.

Here is a basic example of creating an HTTP server that responds with "Hello, World!" for every
request.
Example: Creating an HTTP Server

const http = require('http');

// Create an HTTP server


const server = http.createServer((req, res) => {
// Set status code and headers for the response
res.statusCode = 200; // HTTP 200 OK
res.setHeader('Content-Type', 'text/plain'); // Set content type as plain text

// Write the response body


res.end('Hello, World!');
});

// Make the server listen on port 3000


server.listen(3000, () => {
console.log('Server is running at http://localhost:3000/');
});

Explanation:

 http.createServer(callback): Creates an HTTP server and the callback is invoked whenever a request is received.
 req (Request): Contains information about the incoming request (e.g., URL, headers, HTTP
method).
 res (Response): Used to send the response back to the client (e.g., setting status code,
headers, and body).
 server.listen(3000): Tells the server to start listening on port 3000.

2. Implementing an HTTP Client in Node.js

An HTTP client in Node.js makes requests to other servers (e.g., sending a GET request to an API
or making a POST request). You can use the http module or the https module, depending on whether
you're connecting to a secure or non-secure server.

Example: Creating an HTTP Client to Make a GET Request

In this example, we will create an HTTP client that makes a GET request to a server.

const http = require('http');

// Define the options for the HTTP GET request


const options = {
hostname: 'jsonplaceholder.typicode.com', // The server URL
path: '/todos/1', // Path of the resource we want to fetch
method: 'GET', // HTTP method
};

// Make the HTTP request


const req = http.request(options, (res) => {
let data = '';

// Collect the response data


res.on('data', (chunk) => {
data += chunk;
});

// Once the response is complete, process the data


res.on('end', () => {
console.log('Response:', data); // Log the response body
});
});

// Handle request errors


req.on('error', (err) => {
console.error('Error:', err.message);
});

// End the request (i.e., actually send it)


req.end();

Explanation:

 http.request(options, callback): Makes an HTTP request to the server. The options object
specifies the hostname, path, and method (GET in this case).
 res.on('data', callback): Listens for chunks of data that the server sends back in response.
 res.on('end', callback): Fires when the entire response is received.
 req.end(): Sends the request to the server.

Example: Making a POST Request with HTTP Client

You can also use the HTTP client to send data using the POST method.

const http = require('http');

// Data to be sent in the POST request


const postData = JSON.stringify({
title: 'foo',
body: 'bar',
userId: 1,
});

// Define the options for the HTTP POST request


const options = {
hostname: 'jsonplaceholder.typicode.com',
path: '/posts',
method: 'POST',
headers: {
'Content-Type': 'application/json', // Set content type to JSON
'Content-Length': Buffer.byteLength(postData), // Set the length of the content
},
};

// Make the HTTP request


const req = http.request(options, (res) => {
let data = '';

// Collect the response data


res.on('data', (chunk) => {
data += chunk;
});

// Once the response is complete, process the data


res.on('end', () => {
console.log('Response:', data); // Log the response body
});
});

// Handle request errors


req.on('error', (err) => {
console.error('Error:', err.message);
});

// Send the POST data


req.write(postData);
req.end();

Explanation:

 method: 'POST': The HTTP method used to send data to the server.
 headers: The Content-Type header tells the server that the body of the request contains
JSON data. The Content-Length header tells the server the length of the data being sent.
 req.write(postData): Writes the data to be sent in the body of the request.
 req.end(): Ends the request, sending the data.

3. Handling Errors in HTTP Client

When making requests with the HTTP client, it is important to handle errors. Errors can occur due
to network issues, timeouts, or server unavailability.

Example: Handling Request Errors

const http = require('http');

const options = {
hostname: 'jsonplaceholder.typicode.com',
path: '/nonexistent', // An unknown path still yields a 404 response, not an 'error' event
method: 'GET',
};

const req = http.request(options, (res) => {


let data = '';

res.on('data', (chunk) => {


data += chunk;
});

res.on('end', () => {
console.log('Response:', data);
});
});

req.on('error', (err) => {


console.error('Request failed:', err.message); // Log error message
});

req.end();

Note: an HTTP error status such as 404 is still a successful request at the network level, so the response callback above runs normally. The 'error' event fires only for network-level failures, for example a failed DNS lookup, a refused connection, or a timeout.

4. Using https for Secure Connections

If you are making requests to a secure server (e.g., https:// URLs), you should use the https module
instead of the http module. The usage is the same, but it works for HTTPS connections.

Example: HTTPS Client

const https = require('https');

// Define the options for the HTTPS GET request


const options = {
hostname: 'jsonplaceholder.typicode.com',
path: '/todos/1',
method: 'GET',
};

// Make the HTTPS request


const req = https.request(options, (res) => {
let data = '';

res.on('data', (chunk) => {


data += chunk;
});

res.on('end', () => {
console.log('Response:', data);
});
});
req.on('error', (err) => {
console.error('Error:', err.message);
});

req.end();

5. Using External Libraries for HTTP Requests

For more advanced HTTP client functionality (e.g., handling cookies, retries, redirects), you might
want to use external libraries like axios, node-fetch, or request (deprecated).

Example: Using axios (External HTTP Client Library)

const axios = require('axios');

axios.get('https://jsonplaceholder.typicode.com/todos/1')
.then(response => {
console.log('Response:', response.data); // Log the response data
})
.catch(error => {
console.error('Error:', error.message);
});
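In recent Node.js versions (18 and later) a global fetch() API is also available, which avoids the extra dependency. A minimal sketch under that assumption:

// Requires Node.js 18+ where fetch() is available globally
async function getTodo() {
  try {
    const response = await fetch('https://jsonplaceholder.typicode.com/todos/1');
    const data = await response.json();
    console.log('Response:', data);
  } catch (error) {
    console.error('Error:', error.message);
  }
}

getTodo();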

Conclusion

In Node.js, you can build both HTTP clients and servers using the built-in http and https modules:

 HTTP Server: Use http.createServer() to create a server that listens for incoming requests
and sends responses.
 HTTP Client: Use http.request() or https.request() to send requests to other servers and
handle their responses.
 POST Requests: Send data in the body of the request using req.write() for POST, PUT, or
PATCH methods.

For more advanced scenarios, you may want to explore external libraries like axios, which
simplifies making HTTP requests and handling responses.

IMPLEMENTING HTTPS SERVERS AND CLIENTS

In Node.js, you can implement both HTTPS servers (to handle secure incoming requests) and
HTTPS clients (to make secure requests to other servers). This is accomplished using the https
module, which provides the ability to create secure servers and clients that use SSL/TLS encryption.

1. Implementing an HTTPS Server in Node.js

To implement an HTTPS server in Node.js, you need an SSL/TLS certificate. These certificates are
used to encrypt the communication between the server and the client.

Prerequisites:
1. An SSL certificate (.crt) and a private key (.key). You can generate these using tools like
openssl, or use a service to obtain one.
2. These files will be used to configure the HTTPS server.

Example: Creating an HTTPS Server

const https = require('https');


const fs = require('fs');

// Read the SSL certificate and private key


const options = {
key: fs.readFileSync('path/to/private-key.key'), // Path to private key
cert: fs.readFileSync('path/to/certificate.crt'), // Path to SSL certificate
ca: fs.readFileSync('path/to/ca-certificate.crt'), // (Optional) Path to CA certificate
};

// Create the HTTPS server


const server = https.createServer(options, (req, res) => {
// Set the status code and content type
res.statusCode = 200;
res.setHeader('Content-Type', 'text/plain');

// Write the response body


res.end('Hello, Secure World!');
});

// Make the server listen on port 3000


server.listen(3000, () => {
console.log('HTTPS server is running at https://localhost:3000/');
});

Explanation:

 SSL/TLS Certificates: The key, cert, and optionally ca are used to configure the HTTPS
server. These files contain the server’s private key, certificate, and optionally the CA
certificate, respectively.
o key: The server's private key.
o cert: The server's public certificate.
o ca: The certificate authority's certificate (if needed).
 https.createServer(options, callback): Creates an HTTPS server with the provided
SSL/TLS options.
 Server listens on HTTPS port (3000): The server is configured to listen on port 3000 for
secure HTTPS connections.

2. Implementing an HTTPS Client in Node.js

An HTTPS client can be used to make secure requests to remote HTTPS servers. The https module
in Node.js is used to make these requests.
Example: Creating an HTTPS Client to Make a GET Request

const https = require('https');

// Define the options for the HTTPS GET request


const options = {
hostname: 'jsonplaceholder.typicode.com', // Server URL
path: '/todos/1', // Path of the resource we want to fetch
method: 'GET', // HTTP method
};

// Make the HTTPS request


const req = https.request(options, (res) => {
let data = '';

// Collect the response data


res.on('data', (chunk) => {
data += chunk;
});

// Once the response is complete, process the data


res.on('end', () => {
console.log('Response:', data); // Log the response body
});
});

// Handle request errors


req.on('error', (err) => {
console.error('Error:', err.message);
});

// End the request (i.e., actually send it)


req.end();

Explanation:

 https.request(options, callback): Sends an HTTPS request to the specified server. The options object contains the server details, such as the hostname, path, and HTTP method.
 req.end(): Sends the request.
 res.on('data', callback): Collects chunks of data from the server’s response.
 res.on('end', callback): Fires when the entire response has been received.
 req.on('error', callback): Catches any errors that occur during the request (e.g., connection
failure).

3. Handling SSL/TLS Options in HTTPS Client

In some cases, you might need to configure the HTTPS client to accept certain SSL/TLS certificates
(e.g., when dealing with self-signed certificates). You can pass SSL options in the https.request()
function.
Example: Using HTTPS Client with Custom SSL/TLS Certificates

const https = require('https');


const fs = require('fs');

// Define the options for the HTTPS request with custom SSL certificate
const options = {
hostname: 'example.com',
path: '/secure-endpoint',
method: 'GET',
cert: fs.readFileSync('path/to/certificate.crt'), // Client's SSL certificate
key: fs.readFileSync('path/to/private-key.key'), // Client's private key
ca: fs.readFileSync('path/to/ca-certificate.crt'), // Certificate authority's certificate
rejectUnauthorized: false, // Set to false if you want to allow self-signed certificates
};

// Make the HTTPS request with custom certificates


const req = https.request(options, (res) => {
let data = '';

// Collect the response data


res.on('data', (chunk) => {
data += chunk;
});

res.on('end', () => {
console.log('Response:', data); // Log the response body
});
});

// Handle request errors


req.on('error', (err) => {
console.error('Error:', err.message);
});

// End the request


req.end();

Explanation:

 cert: The client's certificate is sent as part of the request.


 key: The client's private key is used to authenticate the client to the server.
 ca: The certificate authority's certificate is used to verify the server’s certificate.
 rejectUnauthorized: false: This option tells the client not to reject connections to servers
with invalid certificates (e.g., self-signed certificates).

4. Self-Signed Certificates
If you are using self-signed certificates for local development or testing, you may need to add an
additional option to the client (rejectUnauthorized: false) to allow the connection, as self-signed
certificates are not trusted by default.

Example: Using Self-Signed Certificates (HTTPS Client)

const https = require('https');


const fs = require('fs');

// Define the options for the HTTPS client with self-signed certificate
const options = {
hostname: 'localhost',
port: 3000,
path: '/',
method: 'GET',
ca: fs.readFileSync('path/to/self-signed-ca.crt'), // Path to self-signed CA certificate
rejectUnauthorized: false, // Allow self-signed certificates
};

// Make the HTTPS request


const req = https.request(options, (res) => {
let data = '';

res.on('data', (chunk) => {


data += chunk;
});

res.on('end', () => {
console.log('Response:', data);
});
});

// Handle request errors


req.on('error', (err) => {
console.error('Error:', err.message);
});

req.end();

5. Using HTTPS Server with Node.js (For Local Development)

If you want to test an HTTPS server locally with a self-signed certificate, you can generate one
using OpenSSL and then configure your HTTPS server accordingly.

Example: Creating a Self-Signed Certificate Using OpenSSL

To create a self-signed certificate for local testing, you can run the following OpenSSL command:
openssl req -x509 -newkey rsa:4096 -keyout private-key.key -out certificate.crt -days 365

This will generate:

 private-key.key: Your private key.


 certificate.crt: Your self-signed certificate.

Example: Using a Self-Signed Certificate with HTTPS Server

const https = require('https');


const fs = require('fs');

// Read the self-signed certificate and private key


const options = {
key: fs.readFileSync('private-key.key'),
cert: fs.readFileSync('certificate.crt'),
};

// Create the HTTPS server


const server = https.createServer(options, (req, res) => {
res.statusCode = 200;
res.setHeader('Content-Type', 'text/plain');
res.end('Hello, Secure World with Self-Signed Certificate!');
});

// Listen on port 3000


server.listen(3000, () => {
console.log('HTTPS server is running at https://localhost:3000/');
});

Conclusion

 HTTPS Server: Use the https module with an SSL/TLS certificate and private key to create
a secure server.
 HTTPS Client: Use the https module to make secure requests to HTTPS servers, passing
SSL/TLS certificates if needed.
 Self-Signed Certificates: If you're using self-signed certificates for local development,
configure both the client and server to allow them (e.g., rejectUnauthorized: false for the
client).

For more advanced use cases, you can also explore features like mutual TLS (client-side
authentication) and configure additional SSL options to enhance security.

USING ADDITIONAL NODE.JS MODULES-USING THE OS MODULE

The os module in Node.js provides a number of operating system-related utility methods that can be
helpful in retrieving information about the system, including CPU, memory, and network-related
data. It's a built-in module, so you don’t need to install anything extra to use it.
Here are some of the commonly used methods in the os module and how you can use them:

1. Getting Information About the System

1.1. os.platform()

This method returns a string identifying the operating system platform. For example, it could return
'darwin' for macOS, 'linux' for Linux, or 'win32' for Windows.

const os = require('os');
console.log('Platform:', os.platform());

1.2. os.arch()

This method returns a string identifying the architecture of the operating system. It could return
'x64', 'arm', 'ia32', etc.

console.log('Architecture:', os.arch());

1.3. os.hostname()

Returns the hostname of the operating system.

console.log('Hostname:', os.hostname());

1.4. os.type()

This method returns a string identifying the operating system name (e.g., 'Linux', 'Darwin', or
'Windows_NT').

console.log('OS Type:', os.type());

1.5. os.release()

Returns the operating system's release version.

console.log('OS Release:', os.release());

1.6. os.uptime()

Returns the system uptime in seconds (how long the system has been running).

console.log('System Uptime:', os.uptime(), 'seconds');

2. Getting Information About System Memory

2.1. os.totalmem()

Returns the total amount of system memory (in bytes).

console.log('Total Memory:', os.totalmem(), 'bytes');


2.2. os.freemem()

Returns the amount of free system memory (in bytes).

console.log('Free Memory:', os.freemem(), 'bytes');
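Because both values are reported in bytes, it is common to convert them to gigabytes for readability; a small sketch:

const os = require('os');

// Convert bytes to gigabytes (1 GB = 1024^3 bytes)
const toGB = (bytes) => (bytes / 1024 ** 3).toFixed(2);

console.log(`Total Memory: ${toGB(os.totalmem())} GB`);
console.log(`Free Memory: ${toGB(os.freemem())} GB`);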

2.3. os.cpus()

Returns an array of objects containing information about each logical CPU core. Each object
contains details like model, speed, and times spent in user, system, and idle modes.

console.log('CPU Info:', os.cpus());
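A common use of os.cpus() is simply counting the logical cores, for example when sizing a pool of worker processes:

const os = require('os');

// The array returned by os.cpus() has one entry per logical core
console.log('Number of logical CPU cores:', os.cpus().length);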

3. Getting Network Interfaces Information

3.1. os.networkInterfaces()

Returns an object containing network interfaces that are available on the system. This is useful for
retrieving IP addresses and other network-related data.

console.log('Network Interfaces:', os.networkInterfaces());

Example output might look like this:

{
"eth0": [
{
"address": "192.168.1.10",
"netmask": "255.255.255.0",
"family": "IPv4",
"mac": "00:1a:2b:3c:4d:5e",
"internal": false,
"cidr": "192.168.1.10/24"
}
],
"lo0": [
{
"address": "127.0.0.1",
"netmask": "255.0.0.0",
"family": "IPv4",
"mac": "00:00:00:00:00:00",
"internal": true,
"cidr": "127.0.0.1/8"
}
]
}

4. Getting User Information

4.1. os.userInfo()
This method returns an object with the current user's information (e.g., username, home directory,
shell).

console.log('User Info:', os.userInfo());

Example output:

{
"username": "your-username",
"uid": 501,
"gid": 20,
"shell": "/bin/bash",
"homedir": "/Users/your-username"
}

5. Getting System Temporary Directory

5.1. os.tmpdir()

Returns the operating system’s default directory for temporary files.

console.log('Temp Directory:', os.tmpdir());

6. Getting System Endianness

6.1. os.endianness()

Returns the endianness of the system’s CPU. It can return 'BE' for big-endian or 'LE' for little-endian.

console.log('Endianness:', os.endianness());

7. Getting System Load Average

7.1. os.loadavg()

Returns an array with three load averages for the last 1, 5, and 15 minutes. This is a measure of system load (roughly, how many processes are waiting for CPU time). On Windows, os.loadavg() always returns [0, 0, 0], because load average is a Unix-specific concept.

console.log('Load Average:', os.loadavg());

Example output:

[ 0.72, 0.58, 0.44 ]

8. Getting System Constants

8.1. os.constants

This provides system constants such as signals, file system flags, and error codes.
Example usage:

console.log('Signal for SIGHUP:', os.constants.signals.SIGHUP);

Example of Using os Module Together

Here’s an example that combines some of the above methods to print a summary of system
information:

const os = require('os');

console.log('Platform:', os.platform());
console.log('Architecture:', os.arch());
console.log('Hostname:', os.hostname());
console.log('OS Type:', os.type());
console.log('OS Release:', os.release());
console.log('Uptime:', os.uptime(), 'seconds');
console.log('Total Memory:', os.totalmem(), 'bytes');
console.log('Free Memory:', os.freemem(), 'bytes');
console.log('CPUs:', os.cpus());
console.log('Network Interfaces:', os.networkInterfaces());
console.log('User Info:', os.userInfo());
console.log('Temporary Directory:', os.tmpdir());
console.log('Endianness:', os.endianness());
console.log('Load Average:', os.loadavg());

Conclusion

The os module is a valuable tool in Node.js for interacting with and retrieving system-related
information. It's commonly used when you need to gather data on the environment your application
is running on, such as CPU details, memory usage, and networking information.

USING THE UTIL MODULE

The util module in Node.js provides a set of utility functions that help in working with objects,
functions, and other built-in JavaScript types. It can be quite useful for tasks such as inspecting
objects, formatting strings, and using inheritance, among other things.

Here’s an overview of the commonly used functions in the util module:

1. util.format()

The util.format() function is similar to printf in other languages like C. It allows you to format
strings with placeholders, which are then replaced by the provided arguments.

Example:
const util = require('util');

const formattedString = util.format('Hello, %s!', 'World');


console.log(formattedString); // Output: Hello, World!

You can use multiple placeholders, like %s for strings, %d for integers, %j for JSON, and more.

Example with multiple placeholders:

const name = 'Alice';


const age = 30;

const formattedString = util.format('%s is %d years old', name, age);


console.log(formattedString); // Output: Alice is 30 years old

2. util.inspect()

The util.inspect() function is used to convert an object to a string representation. It is particularly useful when you want to inspect an object with a deep structure or with circular references.

Example:

const util = require('util');

const obj = {
name: 'John',
age: 25,
hobbies: ['reading', 'coding'],
};

console.log(util.inspect(obj, { showHidden: false, depth: null, colors: true }));

Options for util.inspect():

 showHidden: If true, it will include non-enumerable properties.


 depth: How many levels of nested objects should be inspected. null means unlimited depth.
 colors: If true, it adds color to the output (useful in the terminal).

3. util.promisify()

The util.promisify() function converts callback-based functions into ones that return a Promise. This
is especially helpful when you want to use async/await syntax with legacy callback-based APIs.

Example:

Suppose you have a function that uses callbacks (like fs.readFile()):

const fs = require('fs');
const util = require('util');
// Convert fs.readFile to return a promise
const readFilePromise = util.promisify(fs.readFile);

async function readFile() {


try {
const data = await readFilePromise('example.txt', 'utf8');
console.log(data);
} catch (error) {
console.error('Error reading file:', error);
}
}

readFile();

This makes fs.readFile() return a promise instead of using a callback, allowing you to use
async/await with it.

4. util.callbackify()

util.callbackify() is the opposite of promisify(). It converts a function that returns a Promise into a
function that uses a callback. This can be useful when you're working with APIs that expect
callback-based functions.

Example:

const util = require('util');

async function getUserData(userId) {


// Simulating an asynchronous operation with Promise
return new Promise((resolve, reject) => {
if (userId === 1) {
resolve({ userId, name: 'John' });
} else {
reject(new Error('User not found'));
}
});
}

// Convert the Promise-based function into a callback-based one


const callbackifiedGetUserData = util.callbackify(getUserData);

// Using the callbackified version


callbackifiedGetUserData(1, (err, data) => {
if (err) {
console.error(err);
} else {
console.log(data); // Output: { userId: 1, name: 'John' }
}
});
5. util.inherits()

The util.inherits() function is used for setting up inheritance in Node.js. It allows one constructor function to inherit from another. In modern code, ES2015 class syntax with extends is generally preferred, but util.inherits() is still found in older codebases.

Example:

const util = require('util');

// Base class (constructor function)


function Animal(name) {
this.name = name;
}

Animal.prototype.speak = function() {
console.log(`${this.name} makes a sound`);
};

// Derived class (constructor function)


function Dog(name) {
Animal.call(this, name); // Call the base class constructor
}

// Set up inheritance
util.inherits(Dog, Animal);

// Now Dog instances inherit from Animal


const dog = new Dog('Rex');
dog.speak(); // Output: Rex makes a sound

Here, Dog inherits from Animal. The Dog class can now access methods from Animal.

6. util.deprecate()

The util.deprecate() function allows you to mark a function as deprecated, and it will show a
warning when the function is called.

Example:

const util = require('util');

// A deprecated function
function oldFunction() {
console.log('This function is deprecated');
}

// Wrapping the function with deprecation warning


const deprecatedFunction = util.deprecate(oldFunction, 'oldFunction is deprecated and will be removed in the future');
// Calling the deprecated function
deprecatedFunction();

This will print a warning when the function is called, alerting the developer that the function is
deprecated.

7. util.isArray()

This method checks if the given value is an array. It returns true if the value is an array, otherwise false. Note that util.isArray() is deprecated in modern Node.js versions; Array.isArray() is the recommended replacement.

const util = require('util');

console.log(util.isArray([1, 2, 3])); // Output: true


console.log(util.isArray({})); // Output: false

8. util.isError()

This method checks if the value is an instance of Error. It returns true if the value is an error, otherwise false. Like util.isArray(), this method is deprecated in modern Node.js versions; an instanceof Error check is the usual replacement.

const util = require('util');

const err = new Error('Something went wrong');


console.log(util.isError(err)); // Output: true
console.log(util.isError('Error message')); // Output: false

Example: Using Multiple util Methods Together

Here's a quick example that demonstrates multiple util methods together:

const util = require('util');

// Create an object
const obj = {
name: 'Alice',
age: 30,
greet: function() { return 'Hello, ' + this.name; },
};

// Format a string using util.format


const formattedString = util.format('Name: %s, Age: %d', obj.name, obj.age);
console.log(formattedString);

// Inspect an object using util.inspect


console.log(util.inspect(obj, { showHidden: true, depth: 2, colors: true }));

// Deprecate a function
const deprecatedFunc = util.deprecate(function() {
console.log('This is deprecated');
}, 'This function is deprecated!');

// Call the deprecated function


deprecatedFunc();

// Promisify a callback-based function


const fs = require('fs');
const readFilePromise = util.promisify(fs.readFile);
readFilePromise('example.txt', 'utf8')
.then(data => console.log(data))
.catch(err => console.error(err));

Conclusion

The util module in Node.js offers a variety of utility functions that simplify working with
asynchronous code, object inspection, formatting, inheritance, and more. It’s a powerful tool for
developers working with Node.js, especially when dealing with legacy callback-based APIs,
debugging, or working with complex data structures.

USING DNS MODULE

The dns module in Node.js provides an API for performing DNS (Domain Name System) lookups
and resolving domain names. This module is essential when you need to interact with DNS records,
perform reverse lookups, or resolve hostnames to IP addresses.

Here’s an overview of how you can use the dns module in Node.js:

1. Getting Started with the dns Module

To use the dns module, you simply need to require it:

const dns = require('dns');

2. Resolving Hostnames to IP Addresses

The dns module provides functions that allow you to resolve domain names into their respective IP
addresses. There are both callback-based and Promise-based versions of these methods.

2.1. dns.lookup()

The dns.lookup() method resolves a hostname (like 'www.google.com') to its corresponding IP


address using the system's DNS resolver.

Example:

const dns = require('dns');


// Resolving a hostname to an IP address
dns.lookup('www.google.com', (err, address, family) => {
if (err) {
console.error('Error:', err);
return;
}
console.log('Address:', address);
console.log('Family:', family); // IPV4 or IPV6
});

Example Output:

Address: 172.217.6.36
Family: 4

 address: The resolved IP address.


 family: Indicates whether the IP is IPv4 (4) or IPv6 (6).

2.2. dns.promises.lookup()

This is the Promise-based version of dns.lookup(). It returns a promise that resolves with the address
and family.

Example:

const dns = require('dns').promises;

async function resolveHostname() {


try {
const { address, family } = await dns.lookup('www.google.com');
console.log('Address:', address);
console.log('Family:', family);
} catch (err) {
console.error('Error:', err);
}
}

resolveHostname();

3. Resolving DNS Records

You can use the dns module to resolve various types of DNS records, such as A records, MX
records, and more.

3.1. dns.resolve()

The dns.resolve() method resolves a domain name into different types of records (like A, AAAA,
MX, etc.).

Example (A records):
const dns = require('dns');

// Resolving A records (IPv4 addresses) for a domain


dns.resolve('www.google.com', 'A', (err, addresses) => {
if (err) {
console.error('Error:', err);
return;
}
console.log('A Records:', addresses);
});

Example Output:

A Records: [ '172.217.6.36', '172.217.6.68' ]

3.2. dns.resolveMx()

The resolveMx() method retrieves the mail exchange (MX) records for a domain.

Example:

const dns = require('dns');

dns.resolveMx('gmail.com', (err, addresses) => {


if (err) {
console.error('Error:', err);
return;
}
console.log('MX Records:', addresses);
});

Example Output:

MX Records: [
{ exchange: 'alt1.gmail-smtp-in.l.google.com', priority: 5 },
{ exchange: 'alt2.gmail-smtp-in.l.google.com', priority: 5 },
{ exchange: 'gmail-smtp-in.l.google.com', priority: 10 },
{ exchange: 'alt3.gmail-smtp-in.l.google.com', priority: 5 },
{ exchange: 'alt4.gmail-smtp-in.l.google.com', priority: 5 }
]

3.3. dns.resolveTxt()

The resolveTxt() method retrieves the TXT records for a domain. TXT records are often used for
various domain-related configurations like SPF (Sender Policy Framework) records.

Example:

const dns = require('dns');


dns.resolveTxt('google.com', (err, records) => {
if (err) {
console.error('Error:', err);
return;
}
console.log('TXT Records:', records);
});

Example Output:

TXT Records: [ [ 'v=spf1 include:_spf.google.com ~all' ] ]

4. Reverse DNS Lookup

You can use the dns.reverse() method to get the hostnames for a given IP address. This is useful
when you want to perform a reverse DNS lookup to find out the domain associated with a given IP.

Example:

const dns = require('dns');

dns.reverse('8.8.8.8', (err, hostnames) => {


if (err) {
console.error('Error:', err);
return;
}
console.log('Hostnames:', hostnames);
});

Example Output:

Hostnames: [ 'dns.google' ]

5. dns.resolveSrv()

This method resolves a domain name to its SRV (Service) records, which are used to define the
location of servers for specified services.

Example:

const dns = require('dns');

dns.resolveSrv('_sip._tcp.google.com', (err, addresses) => {


if (err) {
console.error('Error:', err);
return;
}
console.log('SRV Records:', addresses);
});
Example Output:

SRV Records: [
{ priority: 10, weight: 60, port: 5060, name: 'sipserver.google.com' }
]

6. dns.resolveNs()

The resolveNs() method retrieves the nameserver (NS) records for a domain.

Example:

const dns = require('dns');

dns.resolveNs('google.com', (err, addresses) => {


if (err) {
console.error('Error:', err);
return;
}
console.log('NS Records:', addresses);
});

Example Output:

NS Records: [ 'ns1.google.com', 'ns2.google.com', 'ns3.google.com', 'ns4.google.com' ]

7. dns.resolveCname()

The resolveCname() method resolves a CNAME (Canonical Name) record, which maps an alias
domain name to the canonical domain name.

Example:

const dns = require('dns');

dns.resolveCname('www.google.com', (err, addresses) => {


if (err) {
console.error('Error:', err);
return;
}
console.log('CNAME Records:', addresses);
});

Example Output:

CNAME Records: [ 'www.l.google.com' ]

8. dns.lookupService()
The dns.lookupService() method performs a reverse lookup to get the service for a specific IP
address and port.

Example:

const dns = require('dns');

dns.lookupService('8.8.8.8', 53, (err, hostname, service) => {


if (err) {
console.error('Error:', err);
return;
}
console.log('Hostname:', hostname);
console.log('Service:', service);
});

Example Output:

Hostname: dns.google
Service: domain

9. Using Promises with DNS Methods

Node.js DNS methods also support promise-based versions for cleaner asynchronous code using
async/await.

Here’s an example using dns.promises for resolving MX records:

const dns = require('dns').promises;

async function resolveMX() {


try {
const mxRecords = await dns.resolveMx('gmail.com');
console.log('MX Records:', mxRecords);
} catch (err) {
console.error('Error:', err);
}
}

resolveMX();

Conclusion

The dns module in Node.js provides a rich set of methods to interact with DNS and perform domain
name resolution, reverse lookups, and retrieve various DNS record types. It supports both callback
and promise-based patterns, making it flexible for handling asynchronous operations.

This module is especially useful when building networked applications, setting up custom DNS
queries, or diagnosing DNS issues.
USING THE CRYPTO MODULE

The crypto module in Node.js provides a set of cryptographic functionalities that allow you to
perform tasks such as hashing, encryption, decryption, HMACs (Hash-based Message
Authentication Codes), and more. It's essential when building secure applications that need to
handle sensitive data.

1. Getting Started with the crypto Module

To use the crypto module in Node.js, you can simply require it like this:

const crypto = require('crypto');

2. Creating Hashes

Hashing is the process of converting data into a fixed-length string, typically used for storing
passwords or verifying data integrity. The crypto module supports several hashing algorithms like
SHA-256, SHA-512, MD5, etc.

Example: Creating a Hash

const crypto = require('crypto');

// Create a SHA-256 hash of a string


const hash = crypto.createHash('sha256');
hash.update('Hello, World!');
const result = hash.digest('hex'); // 'hex' gives a hexadecimal output

console.log(result); // Logs the 64-character hexadecimal SHA-256 digest of 'Hello, World!'

3. Creating HMAC (Hash-based Message Authentication Code)

HMAC is used to verify both the data integrity and authenticity of a message. It uses a
cryptographic key along with a hash function.

Example: Creating an HMAC

const crypto = require('crypto');

const secret = 'mysecretkey';


const message = 'This is a secret message';

const hmac = crypto.createHmac('sha256', secret);


hmac.update(message);
const hmacResult = hmac.digest('hex'); // Returns the HMAC in hexadecimal format

console.log(hmacResult); // Example output: 'e5ba9283f16ab28504ed9e99f7b2800a54a11802fc1a4609d922f2e3b6a6bba1'
4. Encrypting and Decrypting Data

The crypto module can be used to perform symmetric encryption (using the same key for both
encryption and decryption) and asymmetric encryption (using a pair of public and private keys).

4.1. Symmetric Encryption (AES)

AES (Advanced Encryption Standard) is a symmetric encryption algorithm. Both the encryption and
decryption use the same key.

Example: Encrypting with AES


const crypto = require('crypto');

// Encrypt data using AES-256-CBC


const algorithm = 'aes-256-cbc';
const password = 'password123'; // You should use a strong password/key
const iv = crypto.randomBytes(16); // Generate a random initialization vector
const key = crypto.scryptSync(password, 'salt', 32); // Generate a key from the password

const cipher = crypto.createCipheriv(algorithm, key, iv);


let encrypted = cipher.update('Hello, World!', 'utf8', 'hex');
encrypted += cipher.final('hex');

console.log('Encrypted:', encrypted); // Example output: '2c6f91599156d92605e027f2b279232f'

Example: Decrypting with AES


const crypto = require('crypto');

// Decrypt data using AES-256-CBC


const algorithm = 'aes-256-cbc';
const password = 'password123';
// 'iv' must be the exact Buffer that was generated during encryption
// (in practice the IV is stored or transmitted alongside the ciphertext)
const key = crypto.scryptSync(password, 'salt', 32);

const decipher = crypto.createDecipheriv(algorithm, key, iv);


// 'encrypted' is the hex ciphertext produced in the encryption example above
let decrypted = decipher.update(encrypted, 'hex', 'utf8');
decrypted += decipher.final('utf8');

console.log('Decrypted:', decrypted); // Output: 'Hello, World!'

4.2. Asymmetric Encryption (RSA)

RSA is a type of asymmetric encryption, where a public key is used for encryption, and a private
key is used for decryption.

Example: Encrypting and Decrypting Data with RSA

1. Generate Keys (this step should ideally be done in advance)

openssl genpkey -algorithm RSA -out private.pem -pkeyopt rsa_keygen_bits:2048


openssl rsa -pubout -in private.pem -out public.pem

2. Encrypting with the Public Key

const crypto = require('crypto');


const fs = require('fs');

// Read the public key


const publicKey = fs.readFileSync('public.pem', 'utf8');

// Encrypt the data using the public key


const data = 'Hello, this is a secret message!';
const encryptedData = crypto.publicEncrypt(publicKey, Buffer.from(data));

console.log('Encrypted data:', encryptedData.toString('base64')); // Example output: 'MIIBIjANBgkqh...'

3. Decrypting with the Private Key

const crypto = require('crypto');


const fs = require('fs');

// Read the private key


const privateKey = fs.readFileSync('private.pem', 'utf8');

// Decrypt the data using the private key


const decryptedData = crypto.privateDecrypt(privateKey, Buffer.from(encryptedData, 'base64'));

console.log('Decrypted data:', decryptedData.toString('utf8')); // Output: 'Hello, this is a secret message!'

5. Generating Random Bytes

The crypto module can be used to generate random data for various purposes like generating keys,
salts, or nonces.

Example: Generating Random Bytes

const crypto = require('crypto');

// Generate 16 random bytes


const randomBytes = crypto.randomBytes(16);
console.log(randomBytes.toString('hex')); // Example output: 'f1b35d2a8a98483a92a30297dfc98b39'

You can also use the crypto.randomInt() method to generate a random integer.

const crypto = require('crypto');

// Generate a random integer between 0 and 100


const randomInt = crypto.randomInt(0, 100);
console.log(randomInt); // Example output: 57

6. Key Derivation with PBKDF2

PBKDF2 (Password-Based Key Derivation Function 2) is used to securely derive a key from a
password, making it more resistant to attacks like brute-forcing.

Example: Using PBKDF2 for Key Derivation

const crypto = require('crypto');

// Derive a key from a password using PBKDF2


const password = 'supersecretpassword';
const salt = crypto.randomBytes(16); // Random salt
const iterations = 100000; // Number of iterations
const keyLength = 32; // Length of the generated key
const algorithm = 'sha256'; // Hash algorithm

crypto.pbkdf2(password, salt, iterations, keyLength, algorithm, (err, derivedKey) => {


if (err) throw err;
console.log('Derived key:', derivedKey.toString('hex'));
});
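Because crypto.pbkdf2() is callback-based, it can also be combined with util.promisify() (covered earlier) so the derivation can be awaited; a minimal sketch with the same parameters:

const crypto = require('crypto');
const util = require('util');

const pbkdf2 = util.promisify(crypto.pbkdf2);

async function deriveKey(password) {
  const salt = crypto.randomBytes(16); // Random salt
  // 100000 iterations, 32-byte key, SHA-256 digest
  const derivedKey = await pbkdf2(password, salt, 100000, 32, 'sha256');
  return { salt: salt.toString('hex'), key: derivedKey.toString('hex') };
}

deriveKey('supersecretpassword')
  .then(result => console.log('Derived key:', result))
  .catch(err => console.error(err));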

7. Signatures and Verification

The crypto module allows you to sign data using a private key and verify it using a public key,
which is essential for ensuring data authenticity and integrity.

Example: Signing Data

const crypto = require('crypto');


const fs = require('fs');

// Read the private key


const privateKey = fs.readFileSync('private.pem', 'utf8');

// Sign the data with the private key


const data = 'This is a message to be signed.';
const sign = crypto.createSign('SHA256');
sign.update(data);
const signature = sign.sign(privateKey, 'base64');

console.log('Signature:', signature);

Example: Verifying a Signature

const crypto = require('crypto');


const fs = require('fs');
// Read the public key
const publicKey = fs.readFileSync('public.pem', 'utf8');

// Verify the signature using the public key


const verify = crypto.createVerify('SHA256');
// 'data' and 'signature' are the values produced in the signing example above
verify.update(data);
const isVerified = verify.verify(publicKey, signature, 'base64');

console.log('Signature verified:', isVerified); // Output: true or false

8. Stream-Based Encryption

For larger data (such as encrypting/decrypting files), you can use streams to perform encryption in
chunks without loading the entire file into memory.

Example: Encrypting a File Using Streams

const fs = require('fs');
const crypto = require('crypto');

const algorithm = 'aes-256-cbc';


const password = 'strongpassword';
const iv = crypto.randomBytes(16);
const key = crypto.scryptSync(password, 'salt', 32);

const input = fs.createReadStream('input.txt');


const output = fs.createWriteStream('encrypted.txt');

const cipher = crypto.createCipheriv(algorithm, key, iv);

input.pipe(cipher).pipe(output);

Example: Decrypting a File Using Streams

const fs = require('fs');
const crypto = require('crypto');

const algorithm = 'aes-256-cbc';


const password = 'strongpassword';
// The IV must be the exact 16-byte value used during encryption; here it is assumed
// to have been saved as a hex string (savedIvHex) alongside the encrypted file
const iv = Buffer.from(savedIvHex, 'hex');
const key = crypto.scryptSync(password, 'salt', 32);

const input = fs.createReadStream('encrypted.txt');


const output = fs.createWriteStream('decrypted.txt');

const decipher = crypto.createDecipheriv(algorithm, key, iv);

input.pipe(decipher).pipe(output);

Conclusion
The crypto module in Node.js provides a wide range of cryptographic functions for hashing,
encryption, signing, key derivation, and more. It's essential for building secure applications,
handling sensitive data like passwords, and implementing encryption/decryption mechanisms.
