
SMART AND ENERGY-EFFICIENT HOME AUTOMATION

Project A
Submitted in partial fulfillment of the requirements
for the degree of
BACHELOR OF ENGINEERING

By

DivyaJyoti Rajdev    (085019)
Aman Chadha          (085007)
Shruti Nirantar      (085053)
Deeptha Narayan      (D095069)

UNDER THE GUIDANCE OF
Prof. K. Y. Rajput

DEPARTMENT OF ELECTRONICS AND TELECOMMUNICATION ENGINEERING
THADOMAL SHAHANI ENGINEERING COLLEGE
UNIVERSITY OF MUMBAI
2011

Approval Sheet

Project entitled: Smart and Energy-Efficient Home Automation

Submitted by:

DivyaJyoti Rajdev    (085019)
Aman Chadha          (085007)
Shruti Nirantar      (085053)
Deeptha Narayan      (D095069)

In partial fulfillment of the degree of B.E. Electronics & Telecommunication
Engineering is approved.

______________________
Guide
Prof. K. Y. Rajput

______________________                  ______________________
Head of the Department                   Principal
Dr. Uday Pandit Khot                     Dr. G. T. Thampi

Date:

CONTENTS

ABSTRACT
1. INTRODUCTION
   1.1 BLOCK DIAGRAM OF THE PROPOSED SYSTEM
   1.2 SOFTWARE SIDE IMPLEMENTATION
   1.3 HARDWARE SIDE IMPLEMENTATION
2. REVIEW OF LITERATURE
   2.1 EXISTING SYSTEMS AND THEIR LIMITATIONS
   2.2 ASPECTS OF IMAGE PROCESSING
   2.3 IMAGE PROCESSING IN MATLAB
3. PROBLEM FORMULATION
4. IDEA OF PROPOSED SOLUTION
   4.1 SMART, ENERGY EFFICIENT AND AUTOMATION CONSIDERATIONS
   4.2 COMMONLY USED TECHNIQUES FOR GESTURE RECOGNITION
   4.3 SIGNAL FLOW DIAGRAM
   4.4 PROPOSED ALGORITHM
REFERENCES

ABSTRACT
The project demonstrates the benefits of a smart house and the areas in which smart
living systems can be used. Details of the technical substructure and application of
the designed gesture-recognition-based home automation system are described. This
report presents an extension of an existing vision-based gesture recognition system,
using an algorithm that combines classical and novel approaches to gesture
recognition. Under future work, several improvements and support structures may be
implemented to increase the capabilities and functionality of the system, including
position-independent recognition, rejection of unknown gestures, and continuous
online recognition of spontaneous gestures. The gesture matches, once obtained, are
used to control technologies deployed in an Intelligent Room. These may be either
power- or utility-related devices, and centralizing their control gives rise to a truly
dynamic home automation system: rather than pull people into the virtual world of
the computer, we are trying to pull the computer out into the real world of people.
The highest-level application we aspire to achieve is a system that provides occupants
of the room with specialized services for command and control of ambient conditions.

1. INTRODUCTION
Technology has advanced by leaps and bounds over the last decade, giving rise to the
computerization of our everyday living environment. As time progresses, the
development of IT technology adds further luxury to common devices that merely
served utilitarian purposes in the past. Commonplace devices like the washing
machine now come with a "sixth sense" that can inform the end user of which stage of
operation the appliance is in. Further, as networking of home appliances is realized,
they become even more intelligent. For such appliances, frequently used in everyday
life, intuitive operation is desirable for the user. This is where a non-contact interface
based on people's natural actions comes into play. Gestures, which we use frequently
and intuitively in our everyday communication, are one such man-machine interface.
Our project proposes to demonstrate a functional automation system based on gesture
recognition using the following:
- Video content analysis and multi-frame processing
- Image enrollment and skin color registration
- A combination of classic and novel approaches to image processing
- Portable computational devices
- Communication with a handheld hardware mainframe module

The vision behind the project was to put forth a practical, commercially feasible and
easily implementable home automation system that puts the entire control in the hands
of the end user, literally. The system can be better understood by judging the software
and hardware modules on their individual merit and their performance in conjunction
with each other.
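As an illustration of the skin color registration idea listed above, the following minimal
MATLAB sketch segments candidate skin pixels in a single video frame. The file name
and the Cb/Cr thresholds are illustrative assumptions taken from commonly quoted
ranges; in the actual system the thresholds would come from the user's enrolled samples.

% Minimal skin-color segmentation sketch; 'hand_frame.jpg' and the Cb/Cr
% thresholds are illustrative assumptions, not the project's enrolled values.
frame = imread('hand_frame.jpg');            % one frame from the video stream
ycbcr = rgb2ycbcr(frame);                    % separate luma from chroma
cb = ycbcr(:,:,2);
cr = ycbcr(:,:,3);
skinMask = (cb >= 77 & cb <= 127) & (cr >= 133 & cr <= 173);
skinMask = bwareaopen(skinMask, 50);         % discard small noise blobs
imshow(skinMask);                            % white pixels mark candidate skin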

1.1 BLOCK DIAGRAM OF THE PROPOSED SYSTEM

[Figure: block diagram of the proposed system. The end user's gesture is sampled and
passed to the gesture recognition device; on a match, the mainframe/switch triggers the
respective component (light, fan, lock); on no match, the gesture is sampled again.]

The above figure shows the basic flow of control. It starts with image samples of the
end user. These images may also be individual frames obtained by splitting a video
stream. This data is then sent to the software module, where the matching takes place.
The control signal generated by the decision device is converted to a suitable data
packet format and sent to the hardware module. Here, depending on the data packet
received, high-power appliances like refrigerators or low-power appliances like CFLs
may be switched on or off. Further, depending on the desired complexity of the system,
additional states like "dim the light" or "slow the fan" may be added along with the
plain ON and OFF commands. The hardware and software blocks are complementary
in function: the signals are generated on the software module and executed on the
hardware module.
1.2 SOFTWARE SIDE IMPLEMENTATION
[Figure: software side flow. The end user's gesture sample is enrolled in the training
database; image and motion vector matching feeds the decision device, whose output
is passed to the hardware interface.]

The software side module breaks down the entire complicated recognition process
into simple steps. The first of these is to enroll the user samples into the training
database so that for every test gesture there exists a reference within the system. All
test gestures are matched against these training samples at a later stage, and the output
of this matching is used to drive the decision device. The actual process of image
matching starts with what is known as feature extraction; the feature vector may take
the form of frequency domain components of the image, a difference image formed by
subtracting two consecutive frames from each other, the motion trajectory of the
entire gesture, and so on. Which method is used for feature extraction depends on the
technique employed for image processing. Popular techniques and their corresponding
software tools are discussed subsequently in the report.
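A minimal MATLAB sketch of one such feature extraction and matching step is given
below, using a difference image and its low-frequency DCT coefficients. The file
names, the 8x8 block of retained coefficients, the training matrix layout and the
acceptance threshold are all assumptions made for illustration.

% Feature extraction: difference image + low-frequency DCT block (illustrative).
f1 = im2double(rgb2gray(imread('frame1.jpg')));
f2 = im2double(rgb2gray(imread('frame2.jpg')));
diffImg = imabsdiff(f1, f2);                   % motion between consecutive frames
coeffs  = dct2(diffImg);                       % frequency domain representation
feature = reshape(coeffs(1:8, 1:8), 1, []);    % keep the low-frequency 8x8 block

% Matching against enrolled samples; 'trainingSet' (N x 64) and 'gestureLabels'
% are assumed to have been saved during enrolment.
load('trainingSet.mat', 'trainingSet', 'gestureLabels');
dists = sqrt(sum((trainingSet - repmat(feature, size(trainingSet,1), 1)).^2, 2));
[bestDist, bestIdx] = min(dists);
if bestDist < 0.5                              % acceptance threshold (assumed)
    decision = gestureLabels(bestIdx);         % drives the decision device
else
    decision = 0;                              % no match: sample again
end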
1.3 HARDWARE SIDE IMPLEMENTATION
[Figure: hardware side flow. The data packet from the software interface passes
through a level shifter and a MUX (together, the logic unit) to the different relays,
which are connected to the actual components.]

The hardware module accepts the control signal or data packet from the software
module through an interface and drives the actual components to action. The data
packet passes through a level shifter to make the output of the computational device
compatible with the on-board hardware, i.e. the controllers and drivers. It is then fed
to the multiplexer, which selects the relay for which the data packet was intended. The
bias of the relay switches the actual device circuit ON or OFF. The level shifter and
multiplexer together are known as the logic unit of the hardware, as it is here that the
operation is actually carried out based on logic levels. Once the above basic module is
realized successfully, it can be ported to a more convenient wearable unit.
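On the software side, handing the data packet over to this logic unit can be as simple
as writing one byte to a serial link. The sketch below assumes a serial connection on
COM3 at 9600 baud and an arbitrary code value; the actual port, baud rate and code
assignments would depend on the hardware build.

% Sketch of the software-to-hardware interface over a serial link (port name,
% baud rate and the code value are assumptions).
code = uint8(3);                       % e.g. data packet for "switch on Light 3"
s = serial('COM3', 'BaudRate', 9600);  % link to the level shifter / MUX board
fopen(s);
fwrite(s, code);                       % one-byte packet selects the target relay
fclose(s);
delete(s);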

2. REVIEW OF LITERATURE
2.1 EXISTING SYSTEMS AND THEIR LIMITATIONS
In order to successfully implement the software module, the techniques currently in
use for image processing must be taken into account. This is because video content
analysis and gesture recognition, when broken down to the very basics, are nothing
but image processing. Even background-dependent movement detection employs a
simple difference-vector principle to find the relative displacement between two
frames and calculate the motion trajectory.
Popular image processing and recognition methods include the following:
1. Discrete Cosine Transform: DCT is a well-known signal analysis tool used in
compression due to its compact representation power. DCT is a very useful
tool for signal representation both in terms of information packing and in
terms of computational complexity due to its data independent nature. DCT
helps separate the image into parts (or spectral sub-bands) of differing
importance (with respect to the image's visual quality). DCT is conceptually
similar to Discrete Fourier Transform (DFT), in the way that it transforms a
signal or an image from the spatial domain to the frequency domain.
2. Discrete Wavelet Transform: DWT is a transform which provides a time-frequency
representation. Often a particular spectral component occurring at a given instant is
of special interest, and in such cases it is very useful to know the time intervals in
which these spectral components occur. For example, in EEGs, the latency of an
event-related potential is of particular interest. DWT provides the time and frequency
information simultaneously, hence giving a time-frequency representation of the
signal. In numerical analysis and functional analysis, a DWT is any wavelet transform
for which the wavelets are discretely sampled. In DWT, an image can be analyzed by
passing it through an analysis filter bank followed by a decimation operation. The
analysis filter bank consists of a low-pass and a high-pass filter at each decomposition
stage. When the signal passes through these filters, it splits into two bands. The
low-pass filter, which corresponds to an averaging operation, extracts the coarse
information of the signal. The high-pass filter, which corresponds to a differencing
operation, extracts the detail information of the signal.

3. Discrete Sine Transform: The discrete sine transform (DST) is a Fourier-related
transform similar to the discrete Fourier transform (DFT), but using a purely real
matrix. It is equivalent to the imaginary parts of a DFT of roughly twice the length,
operating on real data with odd symmetry (since the Fourier transform of a real and
odd function is imaginary and odd), where in some variants the input and/or output
data are shifted by half a sample.
4. Hidden Markov Model Based: In the research area of dynamic gesture recognition,
Hidden Markov Models are among the most widely used methods. The movements of
a person over a sequence of images are classified. The first approach to the
recognition of human movements based on Hidden Markov Models, described in a
paper by Volder, distinguishes between six different tennis strokes. This system
divides the image into meshes and counts the number of pixels representing the person
in each mesh. The counts are composed into a feature vector that is converted into a
discrete label by a vector quantizer, and the labels are classified using discrete HMMs.
The system described in the paper by Rochelle is capable of recognizing 40 different
connected, person-dependent gestures of American Sign Language. This system uses
colored gloves to track the hands of the user, but can also track the hands without the
help of gloves. The position and orientation of the hands are used for the HMM-based
classification.
5. Neural network based: An artificial neural network (ANN), usually called
neural network (NN), is a mathematical model or computational model that is
inspired by the structure and/or functional aspects of biological neural
networks. A neural network consists of an interconnected group of artificial
neurons, and it processes information using a connectionist approach to
computation. In most cases an ANN is an adaptive system that changes its
structure based on external or internal information that flows through the
network during the learning phase. Modern neural networks are non-linear
statistical data modeling tools. They are usually used to model complex
relationships between inputs and outputs or to find patterns in data.

6. Touchscreen based: A white screen is used as the touch plate to give an input in the
form of a symbol. The user needs to create a database of the various symbols required.
A camera placed below the white screen catches the reflected shadow of the symbol
created and interprets it as the particular action required in home automation using
microcontrollers. The following figure explains the 2-D gesture recognition concept:

Apart from those mentioned above, other techniques are also employed for gesture
recognition. However, most of them are sensitive to the conditions of the surroundings.
Thus, for accurate and reliable results, a combination of the classic and novel
approaches will be used to finally implement gesture recognition. A short MATLAB
illustration of the transform-based representations described above follows.
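The sketch below computes the two transform-domain representations from items 1
and 2 on a single greyscale frame. It assumes the Image Processing Toolbox for dct2
and the Wavelet Toolbox for dwt2, and the file name is a placeholder.

% Transform-domain representations of a gesture frame (file name is a placeholder).
g = im2double(rgb2gray(imread('gesture_frame.jpg')));

C = dct2(g);                          % 2-D DCT: energy packed into the top-left coefficients
imshow(log(abs(C) + 1), []);          % visualize coefficient magnitudes

[cA, cH, cV, cD] = dwt2(g, 'haar');   % one-level DWT (requires the Wavelet Toolbox)
% cA holds the coarse (low-pass) information; cH, cV and cD hold the detail sub-bands.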
2.2 ASPECTS OF IMAGE PROCESSING
It is convenient to subdivide different image processing algorithms into broad
subclasses. There are different algorithms for different tasks and problems, and often
we would like to distinguish the nature of the task at hand.
Image enhancement:
This refers to processing an image so that the result is more suitable for a particular
application. Examples include:
- Sharpening or de-blurring an out-of-focus image
- Highlighting edges
- Improving image contrast, or brightening an image
- Removing noise

Image restoration:
This may be considered as reversing the damage done to an image by a known cause,
for example:
- Removal of blur caused by linear motion
- Removal of optical distortions
- Removal of periodic interference

Image segmentation:
This involves subdividing an image into constituent parts, or isolating certain aspects
of an image:
- Finding lines, circles, or particular shapes in an image, such as in an aerial photograph
- Identifying cars, trees, buildings, or roads

These classes are not disjoint; a given algorithm may be used for both image
enhancement and image restoration. However, we should be able to decide what it
is that we are trying to do with our image: simply make it look better (enhancement),
or remove damage (restoration).
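By way of illustration, the MATLAB Image Processing Toolbox calls below correspond
to these task classes; the image file and the parameter values are placeholders.

% Representative calls for the task classes above (parameters are illustrative).
g  = rgb2gray(imread('room.jpg'));      % placeholder image
g1 = imadjust(g);                       % enhancement: stretch the contrast
g2 = medfilt2(g, [3 3]);                % enhancement: remove salt-and-pepper noise
g3 = wiener2(g, [5 5]);                 % restoration-style adaptive noise filtering
g4 = edge(g, 'canny');                  % segmentation: find edges and lines
bw = im2bw(g, graythresh(g));           % segmentation: isolate the foreground (Otsu threshold)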
2.3 IMAGE PROCESSING IN MATLAB
MATLAB is a data analysis and visualization tool which has been designed with
powerful support for matrices and matrix operations. In addition, MATLAB has
excellent graphics capabilities and its own powerful programming language. One of
the reasons that MATLAB has become such an important tool is the availability of
sets of MATLAB programs designed to support particular tasks. These sets of
programs are called toolboxes, and the particular toolbox of interest to us is the Image
Processing Toolbox. MATLAB supports a range of image formats including BMP,
HDF, JPEG, PCX, TIFF, XWD, etc.
When you start up MATLAB, you have a blank window called the Command
Window in which you enter commands. A command line style interface is used and
the prompt consists of two right arrows:

>>
MATLAB supports several image types like Grayscale images, RGB images, L*a*b*
space images etc.
Basic Image Commands in MATLAB
a. <variable name>=imread(filename): reads the image into a variable.
b. imshow(g): displays the matrix g as an image.
c. size(a): returns three values: the number of rows, columns, and pages of a,
   which is a three-dimensional matrix, also called a multidimensional array.
d. impixel(a,200,100): returns the red, green, and blue values of the pixel at
   column 200, row 100.
e. imfinfo('emu.tif'): returns several information fields of the image:
ans =
Filename: 'emu.tif'
FileModDate: '26-Nov-2002 14:23:01'
FileSize: 119804
Format: 'tif'
FormatVersion: []
Width: 331
Height: 384

BitDepth: 8
ColorType: 'indexed'
FormatSignature: [73 73 42 0]
ByteOrder: 'little-endian'
NewSubfileType: 0
BitsPerSample: 8
Compression: 'PackBits'
PhotometricInterpretation: 'RGB Palette'
StripOffsets: [16x1 double]
SamplesPerPixel: 1
RowsPerStrip: 24
StripByteCounts: [16x1 double]
XResolution: 72
YResolution: 72
ResolutionUnit: 'Inch'
Colormap: [256x3 double]
PlanarConfiguration: 'Chunky'
TileWidth: []
TileLength: []
TileOffsets: []
TileByteCounts: []
Orientation: 1
FillOrder: 1
GrayResponseUnit: 0.0100
MaxSampleValue: 255
MinSampleValue: 0
Thresholding: 1
Data Types and Conversions
Elements in MATLAB matrices may have a number of different numeric data types;
the most common are listed below.

Data type   Description                     Range
int8        8-bit integer                   -128 to 127
uint8       8-bit unsigned integer          0 to 255
int16       16-bit integer                  -32768 to 32767
uint16      16-bit unsigned integer         0 to 65535
double      Double precision real number    Machine specific

We can convert images from one image type to another. The table below lists
MATLAB's functions for converting between different image types. Note that the
gray2rgb function does not create a colour image, but an image all of whose pixel
colours are the same as before; this is done by simply replicating the grey values of
each pixel.

Image Type Conversions
Function    Use                     Format
ind2gray    Indexed to greyscale    y=ind2gray(x,map);
gray2ind    Greyscale to indexed    [y,map]=gray2ind(x);
rgb2gray    RGB to greyscale        y=rgb2gray(x);
gray2rgb    Greyscale to RGB        y=gray2rgb(x);
rgb2ind     RGB to indexed          [y,map]=rgb2ind(x,n);
ind2rgb     Indexed to RGB          y=ind2rgb(x,map);

It is important to make the distinction between the two functions double and
im2double: double changes the data type but does not change the numeric values;
im2double changes both the numeric data type and the values. The exception, of
course, is if the original image is of type double, in which case im2double does
nothing. Although the command double is not of much use for direct image display, it
can be very useful for image arithmetic, for example when scaling pixel values.
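A small demonstration of this distinction, using one of the sample images shipped with
the Image Processing Toolbox (any uint8 greyscale image would do), together with the
grey-value replication that a gray2rgb-style conversion amounts to:

a = imread('coins.png');     % uint8 image, values in 0..255
b = double(a);               % same numeric values, now of class double
c = im2double(a);            % type change plus rescaling to the range [0,1]
rgbLike = cat(3, a, a, a);   % grey values replicated into three identical planes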


3. PROBLEM FORMULATION
In the urban setting, high standards of living have encouraged automation to come to
the forefront and be an integral factor of any home design. At the same time,
environmental concerns have ensured that energy efficient housing models and
appliances are used. We reckon that, given the above concepts, it would make sense to
integrate them and give rise to a model smart, energy efficient home which employs
device control using a technique as intuitive as gestures.
For decades now, remote controls have enabled us to dictate the working and
immediate functions of everyday appliances like the AC, TV, etc. However, consider a
scenario where a person, say Carol, is all tucked into bed and realizes that she has
forgotten to switch off the lights and fans of the living room, or the heating system. In
such a scenario and many more, it would indeed help to have a centralized control unit
located at a convenient location from where Carol could command any and all of the
electrical or electronic devices connected to the mainframe. This is where the need for
centralized automation comes in. Also, if Carol needed to find the remote for the main
GUI that controls all the other devices, the purpose of automation would be lost. Thus,
it was concluded that gestures, being an intuitive means of expression, are the best
way to control household appliances at moderate efficiency levels.
[Figure: cartoon of a user realizing, "Forgot to switch off the living room lights again!"]

Most systems currently employed for automation have the following drawbacks:
1. The one-time cost of installing the systems is enormous
2. The maintenance costs further add to the overall expense of the devices
3. Due to such extravagant prices, these systems are affordable only to the upper class
   of society
4. Most automation systems cannot be ported or made compatible with all the
   appliances in use in a house, so only very limited levels of automation are achieved
5. Human Computer Interface (HCI) based automated homes are designed for lavish,
   spacious houses but are often found to work better for smaller spaces
The proposed project is targeted at reducing the implementation cost of a simple
central automation system. Thus, the generic idea requires minimal hardware and any
simple computational device, such as a laptop or smartphone, for its complete working
and operation. At the same time, gesture recognition under Human Computer
Interaction, especially that using Hidden Markov Models (HMMs), is a highly
sought-after and actively researched field. The report subsequently explains the
algorithm and design considerations needed to realize the above-defined target.


4. IDEA OF PROPOSED SOLUTION


4.1 SMART, ENERGY EFFICIENT AND AUTOMATION
CONSIDERATIONS
The proposed project aims to integrate advanced image processing techniques with a
plausible application to a real-world scenario. In recent years the introduction of
network-enabled devices into the home environment has proceeded at an
unprecedented rate. Moreover, there is the potential for remote or centralized control
and monitoring of such network-enabled devices. The project therefore undertakes the
task of coupling common household devices with a gesture-recognition-based decision
unit. This will provide high levels of automation at next to no additional cost.
Elements of a Smart Home
A smart home usually controls most of its devices centrally. To picture such a case,
first consider some of the electronic and electric appliances which can be installed in a
house or flat, such as lights, heating, home entertainment, motorized window blinds, a
telephone system or security devices. Now imagine that a user of our system could
control all of this functionality from her convenient spot at home, i.e. her bed, or even
do it remotely from the car. This concept of applications which facilitate the remote
control of home appliances is the essence of smart homes. Apart from the applications
mentioned above, the apparatus comprises home control and automation hardware. In
most cases this appears in the form of a small web server which is installed on the
home server available for the particular system. The user can observe and control the
installed devices via these web interfaces.

[Figure: smart home elements.]

Elements of an Energy Efficient Home
Often the brunt of today's busy lifestyle is borne by energy-consuming appliances.
This may happen if the inhabitants, in their haste, cannot switch off the lights and fans
of the entire house, or if the refrigerator is left in normal mode even though the owners
are away for the weekend. Such scenarios lead to needless energy wastage. To avoid
this, a carefully connected system of relays may be implemented to centrally trip the
power supply to parts or the whole of the house through a central gesture-controlled
server. Such an application scenario can be implemented based on profiles. For
instance, imagine an "I'm leaving now" gesture from the door which turns off all the
lights and lowers the heating. Before going out for dinner, the user could initiate a
"Coming back in 90 minutes" function which switches the heating back to comfort
level, including the preparation of the hot water necessary for a relaxing bath. Besides
the obvious comfort enhancements, this promises significant energy savings by
allowing centralized adjustment of resource-intensive appliances.
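A profile of this kind ultimately reduces to a set of command packets sent to the
hardware module. The sketch below is a minimal illustration of that idea; the profile
names, the sendPacket helper and the code values are all hypothetical.

% Hypothetical profile dispatch; 'profile' would come from the gesture decision
% device and sendPacket would forward one command byte to the hardware module.
switch profile
    case 'leaving_now'              % turn everything down on the way out
        sendPacket(uint8(10));      % all lights off
        sendPacket(uint8(20));      % heating to minimum
    case 'back_in_90_min'           % pre-heat before returning
        sendPacket(uint8(21));      % heating to comfort level
        sendPacket(uint8(30));      % start the water heater
end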
Elements of Home Automation
Home automation can be achieved with embedded computing power and memory
within dozens of pieces of domestic equipment, each of which can communicate with
the user and with other equipment. The connected web of these devices forms a
system that works as smart home automation. Namely, it enables the user to control
several home security and electrical devices through the concept of a smart life
system. This concept means that routines about the house are carried out
automatically, and that the ideal comfort conditions, probable malfunctions and danger
warnings in a living area can be managed by the system. There are many classical
definitions of home automation available in the literature, and home automation is
often described as the introduction of technology within the home to enhance the
quality of life of its occupants, through the provision of different services such as
telehealth, multimedia entertainment and energy conservation. The project is designed
to achieve this automation through gestures.

[Figure: gesture-controlled appliances.]

4.2 COMMONLY USED TECHNIQUES FOR GESTURE RECOGNITION


The actual gesture may comprise several simple or complex motion trajectories. A
combination of simple wrist/hand movements gives rise to complex gestures. The
following basic movements may be combined in any order to form a decipherable
gesture:

[Figure: basic wrist/hand movements.]

The following sample 3D gesture recognition templates give a quantitative idea of the
function of each gesture:

[Figure: 3D gesture templates for the lights: one to switch off all lights, one each to
switch on Light 1, Light 2 and Light 3, and one reserved for future programming.]
Gesture 1 is used to switch off all the lights, while Gestures 2, 3 and 4 are used to
switch on Light 1, Light 2 and Light 3. Gesture 5 is reserved for future programming
of complicated functions like dimming the light, switching it on and off in succession,
etc.
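In software, this numbering amounts to a simple lookup from the recognized gesture
index to the action it triggers; the sketch below is illustrative only, with the fifth index
left unmapped as the reserved gesture.

% Illustrative lookup from the recognized gesture index to a light command.
lightActions = {'all lights off', 'Light 1 on', 'Light 2 on', 'Light 3 on'};
g = 3;                                   % index returned by the recognizer (example)
if g >= 1 && g <= numel(lightActions)
    disp(['Executing: ' lightActions{g}]);
else
    disp('Reserved for future programming');
end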

[Figure: 3D gesture templates for the fans: one to switch off all fans, one each to
switch on Fan 1 and Fan 2, and one reserved for future programming.]

Gesture 1 is used to switch off all the fans, while Gestures 2 and 3 are used to switch
on Fan 1 and Fan 2. Gesture 4 is reserved for future programming of complicated
functions like adjusting the speed or the inclination (in the case of a table fan), etc.

[Figure: general gesture templates (Gesture 1 and Gesture 2) for auxiliary devices.]

The above figures show general gestures which may be used to control auxiliary
devices like water heaters, refrigerators, etc. Gesture 1 may be used to switch off all
the appliances, while Gesture 2 may be used to switch an appliance on.
Gestures for the Touchscreen Based Approach

[Figure: gesture templates as registered on the screen surface.]

The above diagram shows templates for the gestures as registered on the screen
surface. Here, a circle can be interpreted as switching on the fans and a square as
turning them off. Similarly, other symbols can be used to operate the other circuits in
a home.
Symbols can also be generated simply by touching fingers to the white screen: a
one-finger touch can be interpreted as switching on a bulb, a two-finger touch as
switching on a fan, and so on. Using this concept, ten symbols can be created. The
following figure illustrates the idea for five-finger and four-finger touches.
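A minimal sketch of the finger-count idea is given below: the camera image of the
touch surface is thresholded and the connected regions are counted. The file name, the
threshold value and the polarity of the shadows are assumptions.

% Count fingertip regions on the touch surface (illustrative values).
img = rgb2gray(imread('touch_surface.jpg'));   % placeholder file name
bw  = im2bw(img, 0.4);                         % threshold the fingertip shadows
bw  = bwareaopen(bw, 100);                     % discard small noise blobs
cc  = bwconncomp(bw);
nFingers = cc.NumObjects;                      % 1 finger -> bulb, 2 fingers -> fan, ...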

[Figure: actual images generated on the screen surface for four-finger and five-finger
touches.]

4.3 SIGNAL FLOW DIAGRAM

[Figure: signal flow diagram. Start; initialize the camera; input a sample; match
consecutive samples against the database; if a match is found, transmit the
corresponding database code; the code is then checked against each interfaced device
code (device A through device n) and the matching device is switched on; stop.]

4.4 PROPOSED ALGORITHM


Stage I
1. Take the first sample image from the user
2. After performing the necessary normalization and resizing, enroll the image in the
   database
3. Calculate feature vectors from the image
4. Save the matching parameters obtained above
5. Repeat the process for all samples of all gestures under test

Stage II
1. Switch on and initialize appropriately both the hardware and software modules
2. Obtain testing samples
3. Calculate feature vectors
4. Match with the parameters stored during enrolment
5. Send the result to the decision device
   a. If a match exists, transmit a unique code word indicating its recipient device
   b. If no match exists, take the next sample image and repeat the above steps
6. Transfer control to the hardware module

Stage III
1. Decode the received device code using logic circuitry
2. Check the device codes of the interfaced devices for a match
   a. If a match exists, command the device to perform its designated function
   b. If no match exists, check the next device code for a match
3. Repeat the above steps till the required device is parsed
4. Enter power saving or idle mode
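A condensed MATLAB sketch of the Stage II loop is given below. It assumes the
enrolment data from Stage I is saved in a MAT-file, the camera is reached through the
Image Acquisition Toolbox, the hardware module listens on a serial port, and
computeGestureFeature is a hypothetical helper implementing the same feature step
used at enrolment; the threshold and the number of iterations are likewise placeholders.

% Condensed Stage II sketch (file names, port, threshold and helper are assumptions).
load('gestureDB.mat', 'trainingSet', 'deviceCodes');   % Stage I output
s = serial('COM3', 'BaudRate', 9600);                  % hardware interface
fopen(s);
vid = videoinput('winvideo', 1);                       % initialize the camera

for k = 1:100                                          % test sample loop
    frame   = getsnapshot(vid);
    feature = computeGestureFeature(frame);            % hypothetical helper
    dists   = sqrt(sum((trainingSet - repmat(feature, size(trainingSet,1), 1)).^2, 2));
    [d, idx] = min(dists);
    if d < 0.5                                         % match threshold (assumed)
        fwrite(s, uint8(deviceCodes(idx)));            % unique code word to the hardware module
    end                                                % otherwise: sample again
end

fclose(s); delete(s); delete(vid);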


REFERENCES

[1] S. Eickeler, A. Kosmala, and G. Rigoll, "Hidden Markov Model Based Continuous Online
    Gesture Recognition," in Proc. IEEE International Conference on Pattern Recognition,
    vol. 2, pp. 1206-1208, Aug. 1998.

[2] K. Irie, K. Umeda, and N. Wakamura, "Construction of an Intelligent Room Based on Gesture
    Recognition," in Proc. 2004 IEEE International Conference on Intelligent Robots and Systems,
    Sendai, Japan, Oct. 2004, pp. 193-198.

[3] K. Gill, S. H. Yang, F. Yao, and X. Lu, "A ZigBee-Based Home Automation System," IEEE
    Transactions on Consumer Electronics, vol. 55, no. 2, pp. 422-430, May 2009.

[4] A. R. Al-Ali and M. Al-Rousan, "Java-based home automation system," IEEE Transactions
    on Consumer Electronics, vol. 50, no. 2, pp. 498-504, 2004.

[5] N. Sriskanthan, F. Tan, and A. Karande, "Bluetooth based home automation system,"
    Microprocessors and Microsystems, vol. 26, no. 6, pp. 281-289, 2002.

[6] H. Ardam and I. Coskun, "A remote controller for home and office appliances by telephone,"
    IEEE Transactions on Consumer Electronics, vol. 44, no. 4, pp. 1291-1297, 1998.
