Robotics and its Applications
UNIT I
Introduction: Introduction - brief history - components of robotics - classification -
workspace - work-envelope - motion of robotic arm - end-effectors and their types -
service robots and their applications - Artificial Intelligence in Robotics.
Introduction to Robotics
Robotics is an interdisciplinary branch of engineering and science that deals with the
design, construction, operation, and use of robots. Robots are programmable
machines capable of carrying out complex tasks autonomously or semi-
autonomously.
Brief History of Robotics
The concept of robotics dates back to ancient civilizations, where myths and
mechanical automata were imagined. The word "robot" was first introduced in 1921
in the play R.U.R. (Rossum's Universal Robots) by Karel Čapek. Modern robotics
began taking shape in the 20th century with industrial robots such as Unimate,
developed in the 1950s and first put to work on an assembly line in 1961. Since then,
robotics has evolved rapidly, integrating electronics, computing, and artificial intelligence.
Year/Period | Event/Development | Description
Ancient Times | Automata and Mechanical Devices | Early mechanical devices and statues that mimicked human or animal actions (e.g., Greek automatons).
1495 | Leonardo da Vinci's Mechanical Knight | Designed a humanoid robot capable of basic movements.
1921 | Term "Robot" Coined | The play R.U.R. (Rossum's Universal Robots) by Karel Čapek introduced the word "robot."
1948 | First Programmable Robot Concept | Claude Shannon wrote a paper on chess-playing machines.
1954 | First Industrial Robot Developed | George Devol invented "Unimate," the first programmable robotic arm.
1961 | Unimate Deployed in Industry | Used for die casting and spot welding at General Motors.
1970s | Rise of Industrial Robotics | Robots began widespread use in automotive and electronics manufacturing.
1980s | Introduction of Microprocessors in Robots | Enabled more complex control and intelligence.
1990s | Mobile Robots and AI Integration | Robots with sensors and AI for autonomous navigation introduced.
2000s | Service Robots and Humanoids | Robots like ASIMO and Roomba introduced for domestic and service tasks.
Present | AI, Machine Learning, and Collaborative Robots | Robots work alongside humans with advanced sensing and decision-making capabilities.
Laws of Robotics (Isaac Asimov)
1. First Law:
A robot may not injure a human being or, through inaction, allow a human
being to come to harm.
2. Second Law:
A robot must obey the orders given it by human beings except where such
orders would conflict with the First Law.
3. Third Law:
A robot must protect its own existence as long as such protection does not
conflict with the First or Second Law.
4. Zeroth Law:
A robot may not harm humanity, or, by inaction, allow humanity to come to
harm.
Purpose of the Laws
Ensure robots behave safely and ethically around humans.
Prevent robots from causing harm, either intentionally or accidentally.
Maintain human control over robots.
Components of Robotics
A typical robotic system consists of several key components:
1. Manipulator (Robotic Arm): The mechanical structure resembling a human
arm with joints and links.
2. Actuators: Motors or hydraulic systems that drive the movement.
3. Sensors: Devices to detect changes in the environment or the robot itself (e.g.,
position sensors, vision sensors).
4. Controller: The brain of the robot, which processes input data and sends
commands to actuators.
5. End-Effector: The tool or device at the end of the robotic arm used to interact
with the environment (e.g., grippers, welding torches).
6. Power Supply: Source of energy for robot operation.
Classification of Robots
Robots can be classified based on several criteria:
By Application: Industrial robots, service robots, medical robots, military
robots, etc.
By Degree of Freedom: Number of independent movements a robot can
perform.
By Control Type: Autonomous robots, remote-controlled robots.
By Structure: Articulated robots, SCARA robots, Cartesian robots, cylindrical
robots, spherical robots.
By Mobility: Fixed robots and mobile robots.
Workspace and Work-Envelope
Workspace: The total volume or area within which the robot can operate and
perform its tasks.
Work-Envelope: The three-dimensional region that the robot's end-effector can
reach; its boundary defines the spatial limits of the robot's movement.
Motion of Robotic Arm
Robotic arms move through combinations of rotational and linear motions:
Rotational Motion: Movement around an axis (joints that rotate).
Linear Motion: Straight-line movement along an axis.
The motion is controlled by joints such as revolute (rotational) joints or prismatic
(sliding) joints.
The number of degrees of freedom (DOF) determines the complexity of
motion.
End-Effectors and Their Types
End-effectors are the devices attached to the robot's arm to perform specific tasks.
Types include:
Grippers: Mechanical, vacuum, magnetic, or adhesive grippers used for
picking and placing objects.
Tools: Welding torches, spray guns, cutters.
Sensors: For inspection or feedback.
Specialized End-Effectors: Designed for surgery, bomb disposal, or
agriculture.
Service Robots and Their Applications
Service robots perform useful tasks for humans beyond industrial applications.
Examples include:
Domestic robots: Vacuum cleaners, lawn mowers.
Medical robots: Surgical assistants, rehabilitation robots.
Security robots: Surveillance, bomb disposal.
Entertainment robots: Toys, companion robots.
Agricultural robots: Harvesting, planting.
Artificial Intelligence in Robotics
AI integration has revolutionized robotics by enabling:
Autonomous decision-making: Robots can perceive, analyze, and act without
human intervention.
Machine Learning: Robots learn from experience to improve performance.
Computer Vision: Allows robots to interpret visual data.
Natural Language Processing: Enables communication with humans.
Path Planning and Navigation: Robots can plan routes and avoid obstacles
dynamically.
UNIT II
Actuators and sensors: Types of actuators - stepper, DC, servo, and brushless motors -
model of a DC servo motor - types of transmissions - purpose of sensors - internal and
external sensors - common sensors - encoders, tachometers - strain gauge based
force/torque sensors - proximity and distance measuring sensors
Types of Actuators
Actuators are devices that convert energy (usually electrical, hydraulic, or
pneumatic) into mechanical motion, enabling the robot to perform tasks.
1. Stepper Motors
Operate by moving in discrete steps, allowing precise control over position and
speed.
Commonly used in applications requiring accurate positioning without
feedback.
Advantages: Simple control, high precision.
Limitations: Limited speed, less torque compared to other motors.
2. DC Motors
Use direct current to generate rotational motion.
Simple to control speed by varying voltage.
Widely used in robotics for continuous rotation.
Advantages: Good speed control, high efficiency.
Limitations: Requires brushes that wear over time.
3. Servo Motors (DC Servo Motor)
DC motor combined with feedback sensors (usually encoders) for precise
control of angular position, speed, and acceleration.
Used in applications needing precise motion control like robotic arms.
Model of a DC Servo Motor: Consists of the motor itself, a feedback device
(encoder or potentiometer), and a control circuit. The controller adjusts the motor
input based on the error between the desired and actual position.
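The closed-loop behaviour described above can be illustrated with a minimal proportional-control sketch in Python (the gain, time step, and motor constant are assumed example values, not parameters of any particular servo):

# Minimal sketch of DC servo position control: the controller drives the motor
# with a voltage proportional to the error between the desired and measured angle.
def servo_step(desired_angle, measured_angle, kp=2.0):
    error = desired_angle - measured_angle      # position error from the encoder
    return kp * error                           # control voltage for the motor driver

# Highly simplified motor model: angular velocity proportional to applied voltage.
angle, dt, k_motor = 0.0, 0.01, 1.5
for _ in range(500):
    voltage = servo_step(desired_angle=1.0, measured_angle=angle)
    angle += k_motor * voltage * dt             # integrate motion over one time step
print(round(angle, 3))                          # converges towards the 1.0 rad target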
4. Brushless DC Motors (BLDC)
Use electronic commutation instead of brushes, resulting in higher efficiency
and longer life.
Common in drones, robotics, and electric vehicles.
Advantages: Low maintenance, high speed, and torque.
Types of Transmissions
Transmissions transfer power from actuators to robot joints.
Common types include gears, belts, chains, and lead screws.
Purpose: To convert motor output into desired motion type (e.g., rotational to
linear), change speed/torque, or improve precision.
Purpose of Sensors in Robotics
Sensors provide feedback about the robot’s environment and internal states, enabling
it to make decisions and adjust actions.
Internal Sensors: Monitor the robot’s own condition (position, speed, torque).
External Sensors: Sense the surrounding environment (distance to objects,
temperature, light).
Common Sensors Used in Robotics
1. Encoders
Measure rotational position or speed of shafts.
Types: Incremental and absolute encoders.
Provide feedback for motion control in servo motors and robotic joints.
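As a small illustration, an incremental encoder's raw count is converted to a shaft angle using its resolution (the counts-per-revolution and count values below are assumed examples):

import math

COUNTS_PER_REV = 1024                    # assumed encoder resolution
counts = 256                             # raw count read from the encoder
angle_rad = 2 * math.pi * counts / COUNTS_PER_REV
print(angle_rad)                         # about 1.571 rad, i.e. a quarter turn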
2. Tachometers
Measure angular velocity (speed) of a rotating shaft.
Often used with DC motors to provide speed feedback.
3. Strain Gauge-Based Force/Torque Sensors
Measure mechanical strain caused by applied forces or torques.
Important for delicate tasks requiring force control (e.g., assembly, gripping).
4. Proximity Sensors
Detect the presence or absence of objects near the robot without contact.
Types: Inductive (metal objects), capacitive (any material), ultrasonic (distance
by sound waves).
5. Distance Measuring Sensors
Provide quantitative measurement of distance to objects.
Technologies include ultrasonic sensors, laser range finders (LIDAR), and
infrared sensors.
Used in navigation, obstacle avoidance, and mapping.
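For an ultrasonic sensor, for example, distance follows from the echo's round-trip time; a tiny sketch with assumed values:

SPEED_OF_SOUND = 343.0                       # m/s in air at about 20 degrees C

def ultrasonic_distance(echo_time_s):
    # The pulse travels to the object and back, so halve the round-trip distance.
    return SPEED_OF_SOUND * echo_time_s / 2.0

print(ultrasonic_distance(0.01))             # 1.715 m for a 10 ms echo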
UNIT III
Kinematics of robots: Representation of joints and frames - frame transformation,
homogeneous matrix - D-H matrix - Forward and inverse kinematics: two-link planar
(RR) and spherical robot (RRP). Mobile robot kinematics: Differential wheel mobile
robot.
Kinematics of Robots
Kinematics deals with the motion of robots without considering the forces causing
the motion. It involves the study of positions, velocities, and accelerations of the
robot's parts.
Representation of Joints and Frames
Joints connect links and provide relative motion. Types include:
Revolute (R): Rotational motion around an axis.
Prismatic (P): Linear sliding motion along an axis.
Frames: Coordinate systems attached to robot links/joints to describe positions
and orientations in space.
Frame Transformation and Homogeneous Matrix
Frame Transformation: Converting coordinates of points from one frame to
another.
Uses rotation matrices (to represent orientation changes) and translation
vectors (to represent position changes).
Homogeneous Transformation Matrix (4x4):
Combines rotation and translation in a single matrix to easily perform
coordinate transformations in 3D space.
General form:
T = | R d |
    | 0 1 |
Where R is a 3x3 rotation matrix, d is a 3x1 translation vector, and the bottom row
is [0 0 0 1].
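A minimal numeric sketch of this matrix, built here from a rotation about the z-axis and a translation (the angle and offsets are arbitrary example values):

import numpy as np

def homogeneous(theta, d):
    # 4x4 homogeneous transform: rotation about z by theta, then translation d.
    c, s = np.cos(theta), np.sin(theta)
    T = np.eye(4)
    T[:3, :3] = np.array([[c, -s, 0],
                          [s,  c, 0],
                          [0,  0, 1]])          # R: 3x3 rotation
    T[:3, 3] = d                                # d: 3x1 translation
    return T

T = homogeneous(np.pi / 2, [1.0, 0.0, 0.5])
p = np.array([1.0, 0.0, 0.0, 1.0])              # a point in homogeneous coordinates
print(T @ p)                                    # the point after rotation and translation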
Denavit-Hartenberg (D-H) Matrix
A standardized method to represent robot geometry using 4 parameters per
link:
θ: Joint angle (rotation around z-axis)
d: Offset along z-axis
a: Link length along x-axis
α: Twist angle between z-axes
Each link’s transformation from one frame to the next is expressed as a D-H
matrix.
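A sketch of one link's D-H transform using the standard convention; the parameter values in the example chain are invented for illustration:

import numpy as np

def dh_matrix(theta, d, a, alpha):
    # Transform from frame i-1 to frame i built from the four D-H parameters.
    ct, st = np.cos(theta), np.sin(theta)
    ca, sa = np.cos(alpha), np.sin(alpha)
    return np.array([
        [ct, -st * ca,  st * sa, a * ct],
        [st,  ct * ca, -ct * sa, a * st],
        [0.0,      sa,       ca,      d],
        [0.0,     0.0,      0.0,    1.0],
    ])

# Chaining the link transforms gives the base-to-end-effector transform.
T = dh_matrix(np.pi / 4, 0.0, 0.3, 0.0) @ dh_matrix(np.pi / 6, 0.0, 0.2, 0.0)
print(T[:3, 3])          # end-effector position of this planar two-link example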
Forward Kinematics
Calculates the position and orientation of the robot’s end-effector based on
given joint parameters.
Example: Two-link planar robot (RR robot)
Two revolute joints connected in series.
Position (x, y) of the end-effector is:
x = l1 cos θ1 + l2 cos(θ1 + θ2)
y = l1 sin θ1 + l2 sin(θ1 + θ2)
Where l1, l2 are link lengths and θ1, θ2 are joint angles.
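These position equations translate directly into a short sketch (the link lengths are assumed example values):

import math

def fk_rr(theta1, theta2, l1=1.0, l2=0.8):
    # Forward kinematics of the two-link planar (RR) arm.
    x = l1 * math.cos(theta1) + l2 * math.cos(theta1 + theta2)
    y = l1 * math.sin(theta1) + l2 * math.sin(theta1 + theta2)
    return x, y

print(fk_rr(math.radians(30), math.radians(45)))   # end-effector (x, y)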
Example: Spherical robot (RRP robot)
Two revolute and one prismatic joint.
The first two joints control orientation; the prismatic joint controls radial
extension.
Inverse Kinematics
Given a desired position and orientation of the end-effector, calculate the
required joint parameters.
For the two-link planar robot, inverse kinematics involves solving the above
equations for θ1 and θ2 (a closed-form sketch follows after this list).
For the spherical robot, find joint angles and prismatic extension needed to
reach the target.
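For the two-link planar arm, a closed-form solution can be sketched as follows (link lengths match the assumed values used in the forward-kinematics sketch; only one of the two elbow configurations is returned):

import math

def ik_rr(x, y, l1=1.0, l2=0.8):
    # Solve the forward-kinematics equations for theta1 and theta2 given a target (x, y).
    c2 = (x**2 + y**2 - l1**2 - l2**2) / (2 * l1 * l2)
    if abs(c2) > 1.0:
        raise ValueError("target is outside the work envelope")
    theta2 = math.atan2(math.sqrt(1 - c2**2), c2)       # one of the two elbow solutions
    theta1 = math.atan2(y, x) - math.atan2(l2 * math.sin(theta2),
                                           l1 + l2 * math.cos(theta2))
    return theta1, theta2

print(ik_rr(1.2, 0.9))                                  # joint angles reaching (1.2, 0.9)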
Mobile Robot Kinematics: Differential Wheel Mobile Robot
A common mobile robot uses two independently driven wheels mounted on a
common axis.
Its movement is described by:
dx/dt = v cos θ, dy/dt = v sin θ, dθ/dt = ω
Where:
x,y: Robot position coordinates
θ: Orientation angle
v: Linear velocity (average of two wheel velocities)
ω: Angular velocity (difference of wheel velocities divided by the axle
length)
The differential drive allows the robot to move forward/backward and rotate on
the spot by varying wheel speeds.
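A minimal sketch that integrates this model forward in time (wheel speeds, axle length, and time step are assumed example values):

import math

def diff_drive_step(x, y, theta, v_left, v_right, axle_len, dt):
    # One Euler-integration step of the differential-drive kinematic model.
    v = (v_right + v_left) / 2.0              # linear velocity
    omega = (v_right - v_left) / axle_len     # angular velocity
    x += v * math.cos(theta) * dt
    y += v * math.sin(theta) * dt
    theta += omega * dt
    return x, y, theta

pose = (0.0, 0.0, 0.0)
for _ in range(100):                          # simulate 1 s at 10 ms steps
    pose = diff_drive_step(*pose, v_left=0.4, v_right=0.5, axle_len=0.3, dt=0.01)
print(pose)                                   # the robot curves gently to the left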
UNIT IV
Path Planning: Introduction - path planning overview - road map path planning - cell
decomposition path planning - potential field path planning - obstacle avoidance - case
studies
Vision system: Robotic vision systems - image representation - object recognition and
categorization - depth measurement - image data compression - visual inspection -
software considerations
Introduction
Path planning is the process by which a robot determines a feasible route from a
starting point to a goal location while avoiding obstacles. It is a critical component in
autonomous navigation and manipulation.
Overview of Path Planning
The objective is to generate a collision-free path.
Takes into account the robot’s kinematics, workspace, and dynamic constraints.
Balances path length, safety, and computational efficiency.
Road Map Path Planning
Creates a graph representing the free space using nodes and edges.
Nodes: Collision-free configurations.
Edges: Feasible paths between nodes.
Common algorithms: Probabilistic Roadmap (PRM).
After building the roadmap, shortest path algorithms like Dijkstra or A* are
used to find the route.
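A plain Dijkstra search over a tiny hand-made roadmap illustrates this last step (node names and edge costs are invented for the example):

import heapq

def dijkstra(graph, start, goal):
    # graph: {node: [(neighbor, edge_cost), ...]}
    queue, visited = [(0.0, start, [start])], set()
    while queue:
        cost, node, path = heapq.heappop(queue)
        if node == goal:
            return cost, path
        if node in visited:
            continue
        visited.add(node)
        for neighbor, weight in graph.get(node, []):
            if neighbor not in visited:
                heapq.heappush(queue, (cost + weight, neighbor, path + [neighbor]))
    return float("inf"), []

roadmap = {"start": [("A", 1.0), ("B", 2.5)],
           "A": [("goal", 3.0)],
           "B": [("goal", 1.0)]}
print(dijkstra(roadmap, "start", "goal"))    # (3.5, ['start', 'B', 'goal'])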
Cell Decomposition Path Planning
Divides the workspace into non-overlapping cells (usually rectangular or
triangular).
Each cell is classified as free or occupied.
The free cells form a connectivity graph.
The path is planned by moving through adjacent free cells from start to goal.
Useful for simple, structured environments.
Potential Field Path Planning
Treats the robot as a particle under the influence of artificial potential fields.
The goal exerts an attractive force pulling the robot toward it.
Obstacles exert repulsive forces pushing the robot away.
The robot moves by following the resultant force vector.
Advantages: Simple and fast.
Drawbacks: Can get stuck in local minima.
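The idea can be sketched as a single update step that sums the attractive and repulsive contributions (gains, influence radius, step size, and the obstacle position are assumed example values):

import math

def potential_field_step(pos, goal, obstacles, k_att=1.0, k_rep=0.5, radius=1.0):
    # Attractive force toward the goal.
    fx = k_att * (goal[0] - pos[0])
    fy = k_att * (goal[1] - pos[1])
    # Repulsive forces from obstacles inside the influence radius.
    for ox, oy in obstacles:
        dx, dy = pos[0] - ox, pos[1] - oy
        dist = math.hypot(dx, dy)
        if 0 < dist < radius:
            push = k_rep * (1.0 / dist - 1.0 / radius) / dist**2
            fx += push * dx / dist
            fy += push * dy / dist
    norm = math.hypot(fx, fy)
    if norm < 1e-9:                           # no net force: a local minimum
        return pos
    step = 0.05                               # move a small step along the resultant
    return (pos[0] + step * fx / norm, pos[1] + step * fy / norm)

pos = (0.0, 0.0)
for _ in range(200):
    pos = potential_field_step(pos, goal=(3.0, 3.0), obstacles=[(1.5, 1.4)])
print(pos)                                    # ends near the goal unless trapped earlier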
Obstacle Avoidance
Critical to all path planning methods.
Uses sensor feedback to detect and avoid obstacles in real-time.
Techniques include:
Reactive methods (e.g., potential fields).
Planning methods that explicitly model obstacles in the environment.
Ensures safety and collision-free motion.
Case Studies
1. Autonomous Mobile Robots
Use PRM or cell decomposition for indoor navigation.
Combine with obstacle avoidance for dynamic environments.
2. Industrial Robot Arms
Use inverse kinematics and path planning to avoid collisions with
objects or humans.
Paths are planned in joint space or Cartesian space.
3. Self-Driving Cars
Use advanced path planning algorithms integrating sensors, maps, and
AI.
Plan smooth and safe trajectories accounting for other vehicles and road
rules.
Vision Systems in Robotics
Robotic vision systems enable robots to perceive their environment through cameras
and image processing, enhancing autonomy and precision in tasks such as object
recognition, navigation, and inspection.
Image Representation
Images are captured as arrays of pixels.
Types of image representation:
Grayscale: Each pixel represents intensity (0-255).
Color images: Represented by multiple channels (e.g., RGB).
Binary images: Pixels are either black or white, used in segmentation.
Images can also be represented in different formats: bitmap, vector, etc.
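In code, these representations are simply arrays; a small NumPy sketch (image sizes and the threshold are illustrative, and random data stands in for a real camera frame):

import numpy as np

gray = np.random.randint(0, 256, (480, 640), dtype=np.uint8)       # grayscale: H x W
color = np.random.randint(0, 256, (480, 640, 3), dtype=np.uint8)   # RGB: H x W x 3 channels
binary = (gray > 128).astype(np.uint8)                             # binary: 0 or 1 per pixel
print(gray.shape, color.shape, binary.max())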
Object Recognition and Categorization
The process of identifying and classifying objects within images.
Techniques include:
Feature extraction: Edges, corners, textures.
Template matching: Comparing parts of the image to stored templates.
Machine learning approaches: Using classifiers like SVM, CNNs
(deep learning).
Categorization groups objects into classes (e.g., distinguishing between
different types of tools or parts).
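As one concrete example, template matching with OpenCV slides a stored template over the image and reports the best match (the file names are placeholders and the similarity threshold is an assumed value):

import cv2

scene = cv2.imread("scene.png", cv2.IMREAD_GRAYSCALE)        # placeholder image file
template = cv2.imread("template.png", cv2.IMREAD_GRAYSCALE)  # placeholder template file

result = cv2.matchTemplate(scene, template, cv2.TM_CCOEFF_NORMED)
_, max_val, _, max_loc = cv2.minMaxLoc(result)               # best match score and location
if max_val > 0.8:                                            # assumed similarity threshold
    print("object found at", max_loc, "score", round(max_val, 2))
else:
    print("object not found")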
Depth Measurement
Estimating the distance of objects from the camera.
Methods include:
Stereo vision: Uses two cameras spaced apart to triangulate depth.
Structured light: Projects a known pattern and analyzes distortion.
Time-of-flight sensors: Measure the time light takes to reflect back.
Laser range finders (LIDAR): Provide precise distance mapping.
Depth data is crucial for 3D object recognition and robot navigation.
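For stereo vision in particular, depth follows from the disparity between the two images, Z = f * B / d; a tiny sketch with assumed camera values:

focal_px = 700.0        # focal length in pixels (assumed)
baseline_m = 0.12       # distance between the two cameras in metres (assumed)
disparity_px = 35.0     # pixel shift of the object between left and right images

depth_m = focal_px * baseline_m / disparity_px
print(depth_m)          # 2.4 m to the object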
Image Data Compression
Reduces the size of image data for storage and faster processing.
Types:
Lossless compression: No data loss (e.g., PNG).
Lossy compression: Some data discarded for higher compression (e.g.,
JPEG).
Important for real-time vision systems where bandwidth and storage are
limited.
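A small OpenCV sketch comparing the two approaches on the same frame (the frame here is synthetic and the JPEG quality setting is an assumed value):

import os
import cv2
import numpy as np

image = np.random.randint(0, 256, (480, 640, 3), dtype=np.uint8)   # stand-in camera frame
cv2.imwrite("frame.png", image)                                    # lossless PNG
cv2.imwrite("frame.jpg", image, [cv2.IMWRITE_JPEG_QUALITY, 80])    # lossy JPEG
print(os.path.getsize("frame.png"), os.path.getsize("frame.jpg"))  # compare file sizes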
Visual Inspection
Uses robotic vision to automate quality control by checking defects, surface
finish, and dimensions.
Applications:
Detecting cracks, missing parts, or misalignments.
Measuring object dimensions.
Improves accuracy and speed compared to manual inspection.
Software Considerations
Vision system software involves:
Image acquisition: Capturing images from cameras.
Preprocessing: Noise reduction, filtering, contrast enhancement.
Segmentation: Dividing the image into meaningful parts.
Feature extraction and classification: For recognition.
Real-time processing: Essential for dynamic environments.
Popular vision libraries and tools: OpenCV, HALCON, MATLAB Image
Processing Toolbox.
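A minimal OpenCV pipeline tying these stages together (the file name is a placeholder and the blur/threshold settings are assumed defaults rather than tuned values):

import cv2

image = cv2.imread("part.png")                           # image acquisition (placeholder file)
gray = cv2.cvtColor(image, cv2.COLOR_BGR2GRAY)           # preprocessing: convert to grayscale
blur = cv2.GaussianBlur(gray, (5, 5), 0)                 # preprocessing: noise reduction
_, mask = cv2.threshold(blur, 0, 255,
                        cv2.THRESH_BINARY + cv2.THRESH_OTSU)       # segmentation
contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL,
                               cv2.CHAIN_APPROX_SIMPLE)            # feature extraction
for c in contours:
    x, y, w, h = cv2.boundingRect(c)                     # a simple feature: bounding box
    print("region at", (x, y), "size", (w, h))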
UNIT V
Application: Aerial robots - collision avoidance robots for agriculture - mining
exploration - underwater, civilian, and military applications - nuclear applications -
space applications - industrial robots - artificial intelligence in robots - application of
robots in material handling - continuous arc welding - spot welding - spray painting -
assembly operation - cleaning - etc.
Aerial Robots (Drones)
Used for surveillance, mapping, agriculture (crop monitoring), and delivery.
Equipped with cameras and sensors for inspection and data collection.
Applications: Disaster management, environmental monitoring, and military
reconnaissance.
Collision Avoidance Robots for Agriculture
Autonomous tractors and harvesters that navigate fields avoiding obstacles.
Use sensors and AI for precision farming: planting, watering, and harvesting.
Benefits: Increased efficiency, reduced labor costs, and improved crop yield.
Mining Exploration Robots
Operate in hazardous, inaccessible areas underground.
Perform mapping, drilling, and sample collection.
Improve safety by reducing human presence in dangerous environments.
Underwater Robots
Used for marine exploration, pipeline inspection, and underwater repairs.
Types include remotely operated vehicles (ROVs) and autonomous underwater
vehicles (AUVs).
Applications in oceanography, defense, and oil industry.
Civilian Applications
Domestic robots: Vacuum cleaners, lawn mowers, window cleaners.
Healthcare: Surgical robots, rehabilitation devices.
Service robots in hospitality and customer service.
Military Applications
Surveillance and reconnaissance robots.
Bomb disposal robots.
Autonomous vehicles and drones.
Provide tactical advantages and reduce human risk.
Nuclear Applications
Robots perform inspection and maintenance in radioactive environments.
Handle hazardous materials safely.
Used in nuclear power plants for cleaning, repair, and decommissioning tasks.
Space Applications
Robotic arms on space stations (e.g., Canadarm).
Planetary rovers exploring Mars and the Moon.
Satellites with robotic components for repair and refueling.
Enable exploration and maintenance in space environments.
Industrial Robots
Widely used in manufacturing for repetitive and precise tasks.
Artificial Intelligence in Robots
Enhances adaptability, decision-making, and autonomy.
Enables robots to learn from the environment and optimize tasks.
Used in complex applications like self-driving cars and smart manufacturing.
Applications in Material Handling
Robots handle loading, unloading, palletizing, and packaging.
Improve speed and reduce workplace injuries.
Continuous Arc Welding
Robots perform high-quality, consistent welds on assembly lines.
Used in automotive and heavy machinery industries.
Spot Welding
Common in car manufacturing for joining sheet metal.
Robots provide speed, accuracy, and repeatability.
Spray Painting
Robots ensure uniform paint application with minimal waste.
Used in automotive and consumer product industries.
Assembly Operation
Robots assemble parts with precision.
In electronics, automotive, and appliance manufacturing.
Cleaning
Robots clean floors, windows, and industrial equipment.
Used in domestic, commercial, and hazardous environments.