The Profession of Human-Computer Interaction

What HCI researchers do, and how to become one

The philosopher St. Thomas Aquinas argues that everything has an essence or ideal at its core, as well as an existence – an appearance and manifestation in the real world. The only being whose essence and existence are identical, according to Aquinas, is God. For all other things – material things, and ideas too – discrepancies between what is essential and what is existential are unavoidable.

In trying to make sense of complex topics – like entire fields of academic and scientific research – I find it useful to distinguish between theoretical ideals and existential facts. This framing has made me a better judge of people, facts, and how I live my own life. Unsurprisingly, it has also helped me to make sense of what's going on in my profession: human-computer interaction.

HCI as a Phenomenon

"Human-computer interaction," or "HCI" for short – we hear this often, in the context of design, innovation, computer science, digital products...

The relevance of HCI to our lives is very obvious: for many of us, it's literally what we do with most of our waking hours. Right now, I'm writing this on a computer, I'm getting messages on my phone (also a computer), and I have four other computers on my desk – an iPad, a camera, another phone, and a little prototype we built with a Raspberry Pi. As you read these lines, you are interacting with a computer. Even in our sleep, we interact with computers – we have smartwatches and rings for sleep tracking, and we wake up to digital alarms.

Human-computer interaction is the defining feature of human life and culture today.

HCI as a Profession

However, when people talk about "HCI," they are probably not talking about the phenomenon of living with computers. Rather, they are referring to a profession that deals with this phenomenon. So "human-computer interaction" has two meanings: it's the name of what happens, and it's the name of a profession whose members have the job of studying and designing it.

This scientific and professional discipline called HCI is basically my job. I believe I'm incredibly fortunate to have this job, and many people seem to agree: there is a lot of interest in the profession of HCI. However, compared to its relevance and appeal, the world of HCI is challenging to make sense of and to get into. Part of the challenge is that HCI – despite having managed to subjugate a generous label like "human-computer interaction" as its name – is actually a small, niche community.* So there aren't a lot of HCI jobs. Still, this community becomes much easier to understand – and, if you desire, to enter – once you have a clear mental model of it.

* My estimate is on the order of 10,000 people.

I'll give you an example: A year ago, I was involved in a university project where I, together with two professors, had the job of hiring two HCI researchers.* For two positions, we got more than 200 applications.

* Technically, we were hiring PhD students. I call them "researchers" rather than "students" because, in my experience, it's a more truthful description of what they actually do; and also because these particular positions provided a full salary and benefits.

Most of our applicants were smart and successful: talented engineers, designers, entrepreneurs, scientists... However, among these 200 people, all impressive in their own ways, maybe 10 seemed to be fully aware of what they were getting into.

They all understood HCI as a phenomenon. Most of them understood the essential ideas in HCI as a scientific and professional field. But only 5% understood the boundaries and landscape of HCI in terms of its existential facts. Naturally, those people pulled ahead.

To help the 95% of you who wish to make your way into HCI but have not yet found it, I'd like to tell you about my mental model of it. This will be particularly relevant if you're interested in graduate studies – especially PhD programs – in HCI, for two reasons: First, when I posted on Twitter looking for questions about HCI, one of the most salient requests was to talk about how to get into HCI PhD programs (shoutout and thanks to Faria). Second, and more importantly, even though the phenomenon of computing is everywhere in our lives, "HCI" actually refers to a very particular niche within academic and scientific research.

HCI is a field of academic research

The first thing you must understand if you wish to make HCI your profession is that this isn't really a job that exists in companies, in the way you'd expect: there aren't really any jobs like "HCI designer" or "HCI manager." Perhaps the closest things to HCI in the commercial world are UX jobs, as well as roles in design, requirements analysis, or product validation. But if you actually have "HCI" in the name of your job or department, that has a few implications about the work and qualifications involved.

HCI is a field of academic and scientific research. The vast majority of HCI researchers are employed at universities. A smaller but still significant number of them are associated with large tech companies that allocate large budgets for research and innovation: Microsoft Research, Spotify, Nokia Bell Labs, IBM, Google...*

* Independent labs and individuals like Ink & Switch and Andy Matuschak exist, but their numbers are few, and they are somewhat outside the academic HCI community. That said, I do believe this space will grow exponentially and eventually integrate with the academic world.

Basically, the only way to have HCI in your actual job title is to be an HCI researcher – either at a university, within an academic career, or at a tech corporation that has the luxury of financing its own scientific research. And for the latter, you still need a PhD – as well as a record of publishing your research in peer-reviewed conferences and journals.

The essence of HCI is expansive. Go to Wikipedia, or the "brief intro" in the Interaction Design Foundation's Encyclopedia, and you'll find a very broad and inclusive definition – trying to do justice to all that HCI is and all that it could be. But the existential fact is that HCI is the name of a very particular niche in scientific and academic research.

This is the first thing you need to understand, if you're interested in HCI: it's an academic research discipline. This means that it comes with all of the concerns and constraints that apply to scientific and academic research disciplines: You must be interested in the rigor and philosophy of science and academia. You have to read thousands of research papers, and write a lot of your own. You have to engage with the academic world: a whole ecosystem of universities, academic careers, research funding, peer-reviewed conferences and journals... You will be teaching courses, giving lectures... And crucially, all of this takes up most of your time – you have very little time left to spend on creating actual HCI designs.

The vast majority of HCI researchers I know don't design anything in a hands-on fashion. They don't develop a lot of software. They don't work on 3D modeling or graphic design.

There are many exceptions to this: for example, a lot of technical and tactical work is presented at research conferences like UIST (User Interface Software and Technology), DIS (Designing Interactive Systems), and TEI (Tangible, Embedded, and Embodied Interaction). However, if you look closely, you realize that the vast majority of design and development in HCI research projects is done by graduate students. It's only at the earliest stages of this career that you get hands-on with design and engineering. As a professional HCI researcher, you will definitely collaborate with creative and technical people, but your job will be closer to that of an analyst, a critic, a teacher, a philosopher, or a project manager than to that of a designer or developer.

Many people who say they are interested in HCI are actually interested in design and development. That said, the lines between them are not so clear in the real world. Researchers, designers, and developers often work together in teams. Many people switch between these jobs at different points in their career. You can get an education in one of them and a job in another – in fact, HCI, design, and engineering graduates all find rewarding careers in each other's worlds and many others like marketing or sales. Many senior HCI researchers maintain an interest in technical work, cherishing every opportunity to build things. Last but not least, companies and even university departments may organize themselves along completely different mental models of what words like HCI, design, or product even mean.*

* In fact, observing that my colleagues in HCI and I say the same words but mean different things was one of the reasons I began Design Disciplin.

That said, in addition to being situated in the traditions of science and academia, there are two more things that define HCI.

HCI researchers are computer scientists

I'll go ahead and say it:

HCI is the branch of computer science that builds on social sciences and humanities.

Believe it or not, this is actually a somewhat controversial thing to say. But I will get to that soon, after we talk about the two parts of what I just said. First: the idea that HCI is a subset of computer science.

In practice, the existential core of any scientific discipline today is its publications. All scientific fields exist, first of all, as a collection of written documents: journal articles, conference papers, books... These are the infrastructure of scientific communities.

In HCI, most of the publications which have the highest rankings and the largest readership are produced by two organizations which define themselves in terms of computer science and engineering: ACM and IEEE.

As a philosophical aside: There's a legitimate argument that HCI extends beyond computer science. It does. But when it does, I believe, it also extends the borders of computer science, redefining what it is.*

* There's also an argument that HCI encompasses CS – lol.

If we leave the philosophy behind, what follows for practical reasons is that HCI researchers are computer scientists. In HCI, we specialize in ideas and methods that come from elsewhere (as we will soon get to). But we do science, we deal with computers, and we must do so properly: algorithms and data structures, information theory, graphics, signal processing, machine learning, electronics, software engineering... You cannot avoid or outsource dealing with these things in HCI. You must have the equivalent of a university degree in computer science to do this work.*

* Notice that I said "the equivalent." I don't have an actual computer science degree.

Not only do we have "computer," literally, in the name of what we do; many professors and other leaders who recruit HCI researchers today are in fact computer scientists and engineers. If you speak the same language as them, it carries you forward.

This is my argument that, for HCI, you need a strong foundation in computer science. I use mine on a daily basis. I certainly expect it from colleagues, though I would lower the bar for a grad student with other talents (expecting them to raise it over the course of their studies). But I've seen that many leaders who hire for HCI positions value a CS foundation above all else.

And you need one more thing: a foundation – and ideally a specialization – in social sciences and humanities.

HCI uses assumptions and methods from social sciences and humanities

The fact that HCI researchers need and use competence in social sciences and humanities is best illustrated with a bit of history.

The origins of HCI materialized in the 1980s. A book from 1983 stands out: The Psychology of Human-Computer Interaction. The authors take assumptions, methods, results, and arguments from psychology, and from these they derive answers about how to design computers, expanding computer science and psychology towards each other.

Psychology is just the beginning. Over the following three decades, practically all branches of the social sciences and humanities find their way into HCI.

If you get involved in HCI you will see that people talk about three "waves" – or intellectual movements – that define its history.* The first wave is psychology and computer science coming together. The second wave is the rest of the social sciences coming into HCI.

* Many scholars have written about the "three waves" of HCI. I chose some for the illustration above, and also included links to these at the end of the article. Unfortunately, not all are freely available at their official links, but a bit of searching will lead you to free versions on authors' websites and such.

Mobile phones, laptops, and the internet all emerged in the 1980s. All of this technology, finding its way into our lives, creates new questions and possibilities for how we live. To make sense of it, we need new ways of thinking about and doing research – social science: assumptions and methods from anthropology, sociology, and media studies, applied to the questions of HCI.

The third wave of HCI, then, is the same thing happening with humanities. As technology becomes ingrained in our lives in incredible ways we could have never imagined, it becomes necessary to bring in literature, philosophy, critical theory, arts, and of course design, in order to make sense of the place and possibilities of computers in our lives.

The practical implication is this: in addition to being a citizen of the academic world and mastering its rituals, and in addition to being a competent computer scientist; there is a third area of competence which HCI researchers need to command – social sciences and humanities. The methods that we use to do our work originate from these fields, like psychology, anthropology, critical theory, sociology, and design. We turn to our mentors and the literature they have written in these disciplines, for precious know-how. We use the criteria that evolved in the social sciences and humanities over many decades, to set the bar for the quality of our work.

Caveats

Having learned from its history, HCI today is a diverse and inclusive branch of computer science where we take knowledge and methods – or if you want to be technical about it, the epistemologies and methodologies – from schools of thought which traditionally have nothing to do with computer science, and we use them to figure out how to deal with all of this technology. It's a very fortunate place to be – very tolerant of ideas that may not initially make sense to computer scientists.

But this tolerance and inclusivity is a double-edged sword. On one hand, it allows us to consider unconventional ideas. It paves the way to innovation and invention, and creates a beautiful community where all kinds of individuals are appreciated. On the other hand, it challenges outsiders and beginners to make sense of what's going on; and challenges us to have standards that separate the wheat from the chaff.

As we might expect from a community with deep ties to social sciences and humanities, people in HCI really love to have debates about the essence of what we do. We publish articles intended solely for the purpose of "provoking discussions" and "promoting debates" about the field itself. Our culture of encouraging these discussions is so strong, it's near-impossible to get your writing through peer review if you make conclusive statements about how anything "should" be done in HCI. God forbid we actually stop debating.

Jokes aside, there are good reasons to continue debating. All science is essentially a never-ending debate – it's about getting as close as we can to truth, by admitting we will never have the real truth. But the challenges are real: it took me 10 years of wading through these debates to understand what I'm dealing with, and how to do my job.

All models are wrong, but some are useful. This is my mental model of HCI: a branch of computer science which uses methods and assumptions from social sciences and humanities. I know it's wrong – these are extremely complex worlds of knowledge we are talking about, and they keep expanding and evolving. But I know it's useful too – it leads me to jobs, funding, publications, and all the other markers of success in this profession, without undue strain. And my peers who apply the same model achieve the same.

For newcomers, it's a lot of work. It takes years of study to cultivate competences in computer science and, let's say, anthropology. And this is why most HCI schools are graduate level – master's and PhD programs. It's expected that you gain these, as well as an appreciation of academic research, in years of schooling and experience before you launch into this profession.

If you're already in an HCI school, or on the cusp of joining one, and you realize that you're not as excited about some of these topics, that's actually fine. You can find projects and collaborators that will have you focus on software development, graphic design, ethnographic research, or whatever you enjoy. But simply realizing that HCI is a branch of computer science research that brings in the social sciences and humanities allows you to make sense of things, and to make faster and easier progress.



We will continue to explore the world of HCI in future episodes. We have already had guests like Jofish Kaye and Erik Stolterman who have made very successful and very different careers in this world. We are going to host many more conversations with HCI professionals, and present concepts from this fascinating world where design, technology, and science all come together.

Make sure that you don't miss them: Subscribe to our YouTube channel, subscribe to our podcast, follow us on Twitter, follow us on Instagram, and subscribe to our newsletter.