All About HCI

HCI (human-computer interaction) is the study of how people interact with computers and to what extent computers are or are not developed for successful interaction with human beings. As its name implies, HCI consists of three parts: the user, the computer itself, and the ways they work together.

What is HCI, and why is it important?


Human-computer interaction (HCI) is a multidisciplinary
subject that focuses on computer design and user
experience. It brings together expertise from computer
science, cognitive psychology, behavioural science, and design
to understand and facilitate better interactions between users
and machines.

The Meteoric Rise of HCI


HCI surfaced in the 1980s with the advent of personal computing, just as
machines such as the Apple Macintosh, IBM PC 5150 and Commodore 64
started turning up in homes and offices in society-changing numbers. For
the first time, sophisticated electronic systems were available to general
consumers for uses such as word processors, games units and accounting
aids. Consequently, as computers were no longer room-sized, expensive
tools exclusively built for experts in specialized environments, the need to
create human-computer interaction that was also easy and efficient for less
experienced users became increasingly vital. From its origins, HCI would
expand to incorporate multiple disciplines, such as computer science,
cognitive science and human-factors engineering.

HCI soon became the subject of intense academic investigation. Those who
studied and worked in HCI saw it as a crucial instrument to popularize the
idea that the interaction between a computer and the user should resemble
a human-to-human, open-ended dialogue. Initially, HCI researchers
focused on improving the usability of desktop computers (i.e., practitioners
concentrated on how easy computers are to learn and use). However, with
the rise of technologies such as the Internet and the smartphone, computer
use would increasingly move away from the desktop to embrace the mobile
world. Also, HCI has steadily encompassed more fields:

“…it no longer makes sense to regard HCI as a specialty of computer science; HCI has grown to be broader, larger and much more diverse than
computer science itself. HCI expanded from its initial focus on individual
and generic user behavior to include social and organizational computing,
accessibility for the elderly, the cognitively and physically impaired, and for
all people, and for the widest possible spectrum of human experiences and
activities. It expanded from desktop office applications to include games,
learning and education, commerce, health and medical applications,
emergency planning and response, and systems to support collaboration
and community. It expanded from early graphical user interfaces to include
myriad interaction techniques and devices, multi-modal interactions, tool
support for model-based user interface specification, and a host of
emerging ubiquitous, handheld and context-aware interactions.”

— John M. Carroll, author and a founder of the field of human-computer interaction.

HCI is a broad field which overlaps with areas such as user-centered design
(UCD), user interface (UI) design and user experience (UX) design. In
many ways, HCI was the forerunner to UX design.

Despite that, some differences remain between HCI and UX design.


Practitioners of HCI tend to be more academically focused. They're
involved in scientific research and developing empirical understandings of
users. Conversely, UX designers are almost invariably industry-focused and
involved in building products or services—e.g., smartphone apps and
websites. Regardless of this divide, the practical considerations for products
that we as UX professionals concern ourselves with have direct links to the
findings of HCI specialists about users’ mindsets. With the broader span of
topics that HCI covers, UX designers have a wealth of resources to draw
from, although much research remains suited to academic audiences. Those
of us who are designers also lack the luxury of time which HCI specialists
typically enjoy. So, we must stretch beyond our industry-dictated
constraints to access these more academic findings. When you do that well,
you can leverage key insights into achieving the best designs for your users.
By “collaborating” in this way with the HCI world, designers can drive
impactful changes in the market and society.

Minimalist information emphasized supporting goal-directed activity in a
domain. Instead of topic hierarchies and structured practice, it emphasized
succinct support for self-directed action and for recognizing and recovering
from error.

Other historically fortuitous developments contributed to the establishment
of HCI. Software engineering, mired in unmanageable software complexity
in the 1970s (the “software crisis”), was starting to focus on nonfunctional
requirements, including usability and maintainability, and on empirical
software development processes that relied heavily on
iterative prototyping and empirical testing. Computer graphics and
information retrieval had emerged in the 1970s, and rapidly came to
recognize that interactive systems were the key to progressing beyond
early achievements. All these threads of development in computer science
pointed to the same conclusion: The way forward for computing entailed
understanding and better empowering users. These diverse forces of need
and opportunity converged around 1980, focusing a huge burst of human
energy, and creating a highly visible interdisciplinary project.

2.2 From cabal to community

The original and abiding technical focus of HCI was and is the concept
of usability. This concept was originally articulated somewhat naively in the
slogan "easy to learn, easy to use". The blunt simplicity of this
conceptualization gave HCI an edgy and prominent identity in computing.
It served to hold the field together, and to help it influence computer
science and technology development more broadly and effectively.
However, inside HCI the concept of usability has been re-articulated and
reconstructed almost continually, and has become increasingly rich and
intriguingly problematic. Usability now often subsumes qualities like fun, well-being, collective efficacy, aesthetic tension, enhanced creativity, flow,
support for human development, and others. A more dynamic view of
usability is one of a programmatic objective that should and will continue to
develop as our ability to reach further toward it improves.

Usability is an emergent quality that reflects the grasp and the reach of
HCI. Contemporary users want more from a system than merely “ease of
use”.

Although the original academic home for HCI was computer science, and
its original focus was on personal productivity applications, mainly text
editing and spreadsheets, the field has constantly diversified and outgrown
all boundaries. It quickly expanded to encompass visualization, information
systems, collaborative systems, the system development process, and many
areas of design. HCI is taught now in many departments/faculties that
address information technology, including psychology, design,
communication studies, cognitive science, information science, science and
technology studies, geographical sciences, management information
systems, and industrial, manufacturing, and systems engineering. HCI
research and practice draws upon and integrates all of these perspectives.

A result of this growth is that HCI is now less singularly focused with
respect to core concepts and methods, problem areas and assumptions
about infrastructures, applications, and types of users. Indeed, it no longer
makes sense to regard HCI as a specialty of computer science; HCI has
grown to be broader, larger and much more diverse than computer science
itself. HCI expanded from its initial focus on individual and generic user
behavior to include social and organizational computing, accessibility for
the elderly, the cognitively and physically impaired, and for all people, and
for the widest possible spectrum of human experiences and activities. It
expanded from desktop office applications to include games, learning and
education, commerce, health and medical applications, emergency
planning and response, and systems to support collaboration and
community. It expanded from early graphical user interfaces to include
myriad interaction techniques and devices, multi-modal interactions, tool
support for model-based user interface specification, and a host of
emerging ubiquitous, handheld and context-aware interactions.

There is no unified concept of an HCI professional. In the 1980s, the
cognitive science side of HCI was sometimes contrasted with the software
tools and user interface side of HCI. The landscape of core HCI concepts
and skills is far more differentiated and complex now. HCI academic
programs train many different types of professionals: user experience
designers, interaction designers, user interface designers, application
designers, usability engineers, user interface developers, application
developers, technical communicators/online information designers, and
more. And indeed, many of the sub-communities of HCI are themselves
quite diverse. For example, ubiquitous computing (aka ubicomp) is a subarea
of HCI, but it is also a superordinate area integrating several
distinguishable subareas, for example mobile computing, geo-spatial
information systems, in-vehicle systems, community informatics,
distributed systems, handhelds, wearable devices, ambient intelligence,
sensor networks, and specialized views of usability evaluation,
programming tools and techniques, and application infrastructures. The
relationship between ubiquitous computing and HCI is paradigmatic: HCI
is the name for a community of communities.

Two visualizations of the variety of disciplinary knowledge and skills involved in contemporary design of human-computer interactions.

Indeed, the principle that HCI is a community of communities is now a
point of definition codified, for example, in the organization of major HCI
conferences and journals. The integrating element across HCI communities
continues to be a close linkage of critical analysis of usability, broadly
understood, with development of novel technology and applications. This is
the defining identity commitment of the HCI community. It has allowed
HCI to successfully cultivate respect for the diversity of skills and concepts
that underlie innovative technology development, and to regularly
transcend disciplinary obstacles. In the early 1980s, HCI was a small and
focused specialty area. It was a cabal trying to establish what was then a
heretical view of computing. Today, HCI is a vast and multifaceted
community, bound by the evolving concept of usability, and the integrating
commitment to value human activity and experience as the primary driver
in technology.

2.3 Beyond the desktop

Given the contemporary shape of HCI, it is important to remember that its origins are personal productivity interactions bound to the desktop, such as
word processing and spreadsheets. Indeed, one of the biggest design ideas of the early 1980s was the so-called messy desk metaphor, popularized by the Apple Macintosh: Files and folders were displayed as icons that could be, and were, scattered around the display surface. The messy desktop was a
perfect incubator for the developing paradigm of graphical user interfaces.
Perhaps it wasn’t quite as easy to learn and easy to use as claimed, but
people everywhere were soon double clicking, dragging windows and icons
around their displays, and losing track of things on their desktop interfaces
just as they did on their physical desktops. It was surely a stark contrast to
the immediately prior teletype metaphor of Unix, in which all interactions
were accomplished by typing commands.

The early Macintosh desktop metaphor: Icons scattered on the desktop depict documents and functions, which can be selected and accessed (as System Disk in the example).

Even though it can definitely be argued that the desktop metaphor was
superficial, or perhaps under-exploited as a design paradigm, it captured
imaginations of designers and the public. These were new possibilities for many people in 1980; pundits speculated about how they might change office work. Indeed, the tsunami of desktop designs challenged, and sometimes threatened, the expertise and work practices of office workers. Today they
are in the cultural background. Children learn these concepts and skills
routinely.
As HCI developed, it moved beyond the desktop in three distinct senses.
First, the desktop metaphor proved to be more limited than it first seemed.
It’s fine to directly represent a couple dozen digital objects as icons, but this
approach quickly leads to clutter, and is not very useful for people with
thousands of personal files and folders. Through the mid-1990s, HCI
professionals and everyone else realized that search is a more fundamental
paradigm than browsing for finding things in a user interface. Ironically
though, when early World Wide Web pages emerged in the mid-1990s, they
not only dropped the messy desktop metaphor, but for the most part
dropped graphical interactions entirely. And still they were seen as a
breakthrough in usability (of course, the direct contrast was to Unix-style
tools like ftp and telnet). The design approach of displaying and directly
interacting with data objects as icons has not disappeared, but it is no
longer a hegemonic design concept.

The early popularity of messy desktops for personal information spaces does not scale.

The second sense in which HCI moved beyond the desktop was through the
growing influence of the Internet on computing and on society. Starting in
the mid-1980s, email emerged as one of the most important HCI
applications, but ironically, email made computers and networks into
communication channels; people were not interacting with computers, they
were interacting with other people through computers. Tools and
applications to support collaborative activity now include instant
messaging, wikis, blogs, online forums, social networking, social
bookmarking and tagging services, media spaces and other collaborative
workspaces, recommender and collaborative filtering systems, and a wide
variety of online groups and communities. New paradigms and
mechanisms for collective activity have emerged including online auctions,
reputation systems, soft sensors, and crowdsourcing. This area of HCI,
now often called social computing, is one of the most rapidly developing.

A huge and expanding variety of social network services are part of
everyday computing experiences for many people. Online communities,
such as Linux communities and GitHub, employ social computing to
produce high-quality knowledge work.

The third way that HCI moved beyond the desktop was through the
continual, and occasionally explosive diversification in the ecology of
computing devices. Before desktop applications were consolidated, new
kinds of device contexts emerged, notably laptops, which began to appear
in the early 1980s, and handhelds, which began to appear in the mid-1980s.
One frontier today is ubiquitous computing: The pervasive incorporation of
computing into human habitats — cars, home appliances, furniture,
clothing, and so forth. Desktop computing is still very important, though
the desktop habitat has been transformed by the wide use of laptops. To a
considerable extent, the desktop itself has moved off the desktop.

Computing moved off the desktop to be everywhere all the time. Computers are in phones, cars, meeting rooms, and coffee shops.

The focus of HCI has moved beyond the desktop, and it will continue to move. HCI is a technology area, and it is ineluctably driven to frontiers of
technology and application possibility. The special value and contribution
of HCI is that it will investigate, develop, and harness those new areas of
possibility not merely as technologies or designs, but as means for
enhancing human activity and experience.

2.4 The task-artifact cycle

The movement of HCI off the desktop is a large-scale example of a pattern
of technology development that is replicated throughout HCI at many levels
of analysis. HCI addresses the dynamic co-evolution of the activities people
engage in and experience, and the artifacts — such as interactive tools and
environments — that mediate those activities. HCI is about understanding
and critically evaluating the interactive technologies people use and
experience. But it is also about how those interactions evolve as people
appropriate technologies, as their expectations, concepts and skills develop,
and as they articulate new needs, new interests, and new visions and
agendas for interactive technology.

Reciprocally, HCI is about understanding contemporary human practices
and aspirations, including how those activities are embodied, elaborated,
but also perhaps limited by current infrastructures and tools. HCI is about
understanding practices and activity specifically as requirements and design possibilities for envisioning and bringing into being new technology,
new tools and environments. It is about exploring design spaces, and
realizing new systems and devices through the co-evolution of activity and
artifacts, the task-artifact cycle.
Author/Copyright holder: Courtesy of John M. Carroll. Copyright terms and
licence: CC-Att-SA-3 (Creative Commons Attribution-ShareAlike 3.0)

Figure 2.10: Human activities implicitly articulate needs, preferences and
design visions. Artifacts are designed in response, but inevitably do more
than merely respond. Through the course of their adoption
and appropriation, new designs provide new possibilities for action and
interaction. Ultimately, this activity articulates further human needs,
preferences, and design visions.

Understanding HCI as inscribed in a co-evolution of activity and
technological artifacts is useful. Most simply, it reminds us what HCI is
like, that all of the infrastructure of HCI, including its concepts, methods,
focal problems, and stirring successes will always be in flux. Moreover,
because the co-evolution of activity and artifacts is shaped by a cascade of
contingent initiatives across a diverse collection of actors, there is no reason
to expect HCI to be convergent, or predictable. This is not to say progress in
HCI is random or arbitrary, just that it is more like world history than it is
like physics. One could see this quite optimistically: Individual and
collective initiative shapes what HCI is, but not the laws of physics.

https://www.interaction-design.org/literature/book/the-encyclopedia-of-
human-computer-interaction-2nd-ed/human-computer-interaction-brief-
intro

https://www.interaction-design.org/courses/human-computer-interaction

History
Human Computer Interaction is the academic discipline that most of us
think of as UI design. It focuses on the way that human beings and
computers interact to ever increasing levels of both complexity
and simplicity.

It’s a Very New Discipline


It’s perhaps easy to see that until the mid to late 1970s this discipline wasn’t
particularly important. The few people who had access to computers were
academics or professionals with a few incredibly dedicated (and wealthy)
hobbyists thrown into the mix. Without a broad base of users, it wasn’t necessary
to focus on how those users interacted with computers – they just made do with
whatever was to hand or created what they needed themselves.

Then, with the dawn of personal computing, the floodgates opened. The masses
wanted computing and they didn’t want to go through complicated rigmarole to do
what they wanted with a computer. They weren’t prepared to build and program
their own joysticks for the games they bought, they didn’t expect to design the
mouse before they could use a word processor and so on…

Cognitive Sciences
Luckily, for the masses, there was a discipline waiting in the wings to help with the
tasks that lay ahead. Cognitive sciences (a broad and heady mix which includes
psychology, language, artificial intelligence, philosophy and even anthropology)
had been making steady progress during the 1970s and by the end of the decade
they were ready to help articulate the systems and science required to develop user
interfaces that worked for the masses.

Engineering
This is known as “cognitive engineering”, i.e. building things that work with our thoughts. And once again the engineering discipline had also come on in leaps and
bounds during the 1970s in order to support this change. In aviation, for example,
engineering had already started to simplify the user interface of complex airplanes.
It was natural for some of this work to move into the UI field for computing
devices.

Documentation
It’s also important to recognize the challenge of documenting these developments.
New systematic approaches needed to be taken in order to record developments
and to share these with other practitioners of the new discipline worldwide. There
really is, after all, no advantage in reinventing the mouse over and over again.

John Carroll, the Edward Frymoyer Chair Professor of Information Sciences and Technology at the Pennsylvania State University, says that the discipline of Human Computer Interaction was born (or perhaps “emerged” is a better word) in 1980 as all these separate disciplines began to realign around a single objective: making
computing easier for the masses.

You can read the full text of John’s book on Human Computer Interaction here on
the IxDF website. It’s completely free to read online and our members can also
download a free copy to their preferred e-book reader.


Take a deep dive into Human-Computer Interaction (HCI) with our course Human-Computer Interaction - HCI.

Interactions between computers and humans should be as intuitive as conversations between two humans—and yet many products and services fail to achieve this. So, what do you need to know to create an intuitive user experience? Human
psychology? Emotional design? Specialized design processes? The answer is, of
course, all of the above, and this course will cover them all.

Human-computer interaction (HCI) is about understanding what it means to be a
user of a computer (which is more complicated than it sounds), and therefore how
to create related products and services that work seamlessly. It’s an important skill
to master, because it gives any company the perspective and knowledge needed to
build products that work more efficiently and therefore sell better. In fact, the Bureau of Labor Statistics predicts that computer and IT occupations will grow by 12% from 2014 to 2024, faster than the average for all occupations. This goes to
show the immense demand in the market for professionals equipped with the right
computer and IT skills.

This course provides a comprehensive introduction and deep dive into HCI, so you
can create designs that provide outstanding user experiences. Whether you are a
newcomer to the subject of HCI or a professional, by the end of the course you will
have a deep understanding of what it means to be a user and how to implement
user-centered design for the best possible results.
This course is based on in-depth videos created by the amazing Alan Dix. You'll be
in great company with this renowned professor and Director of the Computational
Foundry at Swansea University, a specialist in HCI and co-author of the classic
textbook, Human-Computer Interaction.

Until the later 1970s, the only humans who interacted with computers were information
technology professionals and dedicated hobbyists. This changed
disruptively with the emergence of personal computing in the later 1970s.
Personal computing, including both personal software (productivity
applications, such as text editors and spreadsheets, and interactive
computer games) and personal computer platforms (operating systems,
programming languages, and hardware), made everyone in the world a
potential computer user, and vividly highlighted the deficiencies of
computers with respect to usability for those who wanted to use computers
as tools.

Personal computing rapidly pushed computer use into the general
population, starting in the later 1970s. However, the non-professional
computer user was often subjected to arcane commands and system
dialogs.

The challenge of personal computing became manifest at an opportune
time. The broad project of cognitive science, which incorporated cognitive
psychology, artificial intelligence, linguistics, cognitive anthropology, and
the philosophy of mind, had formed at the end of the 1970s. Part of the
programme of cognitive science was to articulate systematic and
scientifically informed applications to be known as "cognitive engineering".
Thus, at just the point when personal computing presented the practical
need for HCI, cognitive science presented people, concepts, skills, and a
vision for addressing such needs through an ambitious synthesis of science
and engineering. HCI was one of the first examples of cognitive
engineering.

The Model Human Processor was an early cognitive engineering model
intended to help developers apply principles from cognitive psychology.
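The appeal of models like the Model Human Processor is that they let designers make rough, back-of-the-envelope timing predictions before building anything. As a minimal illustrative sketch (not the full model — the nominal cycle times below are the middle-of-range values Card, Moran and Newell published, and the real values vary widely between individuals and task conditions), a simple reaction-time estimate might look like:

```python
# Sketch of a Model Human Processor (MHP) style timing estimate.
# Nominal processor cycle times (nominal mid-range values from
# Card, Moran & Newell); treat these as illustrative, not exact.
PERCEPTUAL_CYCLE_MS = 100  # tau_P: perceive a stimulus
COGNITIVE_CYCLE_MS = 70    # tau_C: one decision step
MOTOR_CYCLE_MS = 70        # tau_M: initiate a motor response

def simple_reaction_time_ms(decision_steps: int = 1) -> int:
    """Estimate time to perceive a stimulus, decide, and respond.

    decision_steps is the number of cognitive cycles the task needs;
    a simple press-when-you-see-it task needs one.
    """
    return (PERCEPTUAL_CYCLE_MS
            + decision_steps * COGNITIVE_CYCLE_MS
            + MOTOR_CYCLE_MS)

print(simple_reaction_time_ms())   # simple reaction: ~240 ms
print(simple_reaction_time_ms(2))  # one extra decision step: ~310 ms
```

Even this toy version shows the style of reasoning the MHP encouraged: interface designers could compare candidate designs by counting the perceptual, cognitive, and motor steps each one demands of the user.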

This was facilitated by analogous developments in engineering and design
areas adjacent to HCI, and in fact often overlapping HCI, notably human
factors engineering and documentation development. Human factors had
developed empirical and task-analytic techniques for evaluating human-
system interactions in domains such as aviation and manufacturing, and
was moving to address interactive system contexts in which human
operators regularly exerted greater problem-solving discretion.
Documentation development was moving beyond its traditional role of
producing systematic technical descriptions toward a cognitive approach
incorporating theories of writing, reading, and media, with empirical user
testing. Documents and other information needed to be usable also.
