Tent Talks Featuring Steve Portigal: Interviewing Users

Tent Talks Featuring: Steve Portigal
Steve Portigal
Author & User Researcher
Portigal Consulting
Steve Portigal is an experienced user researcher who helps organizations to build more mature user research practices.

On Monday, November 20th at 5:00pm Central, Steve Portigal joins us for a live Q&A session: “Interviewing Users.”

Session Notes

The session with Steve Portigal, discussing the second edition of his book “Interviewing Users,” delved into how the field of user research has evolved over the past decade. Steve highlighted significant shifts, including changes in societal norms, the rise of remote work due to the pandemic, and advancements in technology, particularly in user interview techniques. He also touched on ethical considerations in user research and the role of AI in shaping future dynamics. Throughout the session, Steve shared insights from his extensive experience, emphasizing the importance of context, adaptability, and the ever-changing nature of user research.

Evolution in User Research:

  • User research practices have shifted significantly, particularly in compensating participants. The trend moved from cash payments to more convenient, digital forms.
  • The rise of remote work, accelerated by the pandemic, has transformed user research methodologies, with a notable increase in remote interviews.
  • There’s a greater focus on data privacy and regulatory compliance in research, reflecting societal and legal shifts.
  • Adapting interview techniques for remote settings has become crucial, with adjustments needed for communication styles and technological limitations.

Impact of Remote User Interviews:

  • Remote interviews lack the personal connection and context-rich environment of in-person interactions, affecting the depth of insights.
  • Collaboration within research teams and post-interview synthesis have become more challenging in remote settings.
  • New norms of communication, like managing turn-taking and interpreting non-verbal cues, have emerged, necessitating adaptation by researchers.

Ethical Implications in Research:

  • The ethical landscape in user research is complex, with a growing emphasis on informed consent and transparent data practices.
  • Resources like Alba Villamil’s “Ethical Researcher’s Checklist” provide guidance on navigating these ethical considerations effectively.
  • The approach to consent has evolved, with more nuanced methods being developed to respect participants’ autonomy and privacy.

AI in User Research:

  • The role of AI in user research is evolving, with its potential impact still largely uncertain.
  • AI’s current strength lies in data summarization rather than synthesis, which remains a predominantly human-driven process.
  • As AI technology advances, its application in user research could extend to supporting creative thinking and problem-solving.

Most Profound Learning Experience:

  • Steve recounted an experience where he confronted and overcame his own age bias during an interview, highlighting the human nature of biases in research.
  • This experience underlined the importance of being aware of and challenging personal biases to gain true insights in user research.

Notable Quotes:

  • “We operate on biases, but research allows us to overcome and revisit our assumptions.”
  • “Remote research has changed our norms of communication and collaboration.”
  • “Ethical considerations are vital in user research, especially in the age of data privacy.”
  • “I had a conversation with someone that I respect the other day, and they said to me: a large language model can summarize, but it can’t synthesize, because it can only be based on what is. So summarization is a great use of that, but synthesis isn’t.”
  • “AI’s potential in user research lies more in aiding creativity than replacing human analysis.”

Session Transcript

[00:00:31] Chicago Camps: Congrats on the second edition of “Interviewing Users.” It’s been a decade since the first edition – what’s changed in the industry that compelled you to update the book and how did that impact what you added?

[00:00:43] Steve Portigal: It’s a really good question to think about what has changed, because the activity of interviewing people, that sort of human-to-human interaction, is pretty evergreen.

[00:00:54] And I thought for a long time why the book wouldn’t need to change. It’s just going to keep offering the same advice forever. But… Lots of small things changed. For example, the first edition talks about how great it is to give your participants cash when you meet them to interview them, that’s the thing.

[00:01:13] And it was a great practice, but the meaning of money and how we use money changed actually pretty quickly. I don’t think we talk about it very much, but giving somebody cash is generally an inconvenience versus giving them money in some other form that they can use. So the practices have changed.

[00:01:31] And I don’t think that’s a big deal, but it’s significant to look at what else has changed in society. We do a lot more work remotely, like this that we’re doing now; having a pandemic helped drive that change in work and just every aspect of our lives. Not every aspect, but many of them took place on the computer.

[00:01:51] So a lot of user research became remote. Some of it was before, and some of it has gone back to in person now, but while we’d been doing remote research for a long time, we didn’t have that kind of collective movement to doing lots of collaborative work online. Terms like Zoom fatigue came into common parlance.

[00:02:13] We start coming up with workarounds and solutions like that. How we use this technology to do research is something that we hadn’t, as a practice, thought about as much as those last few years forced us to. And then I think the context in which we do research has changed: the regulatory compliance and legal stuff, thoughts about data privacy, our perspective on it, our ethical perspective on it, our legal obligations have moved along in 10 years.

[00:02:44] And so you see how organizations have built up processes and practices to deal with that. So the advice is out of date because we don’t work the same way anymore, because of these sort of external factors. And these are all signs of the maturation of the field, not of any individual researcher or any team, but that we all have grown up over 10 years and we do things differently.

[00:03:06] So yeah, those are some of the things that drive the thoughts about where to go. That’s not everything that’s different, but that’s a start.

[00:03:13] Chicago Camps: How have you seen user interview techniques evolve over the years, especially with the rapid advancements in technology?

[00:03:19] Steve Portigal: We were talking about that in terms of remote. It’s harder to meet a new person and engage, even in some very tactical things. Like when you talk to somebody, and I think we might be doing it, this will happen in this venue as well, turn-taking is this interesting thing in podcasting and in interviewing, where I just held what I was going to say.

[00:03:41] And sometimes that’s a cue to the other person to jump in. When you’re with your friends and you’re excited, you look for those quiet moments, you talk over them, because that’s what sort of social talking is. In meetings we’re encouraged to give people space to talk, and it’s seen as respectful.

[00:03:57] And in interviews you certainly want to give people the space to go on. I think some of those cues are very automatic for us, and they’re often about breath as well as body language. We don’t get as much bandwidth of information coming through. Body language is less; you can’t see all of me. I’m clearly, like, on stage for you all. So we change how we present, we change how we breathe and what we’re able to hear. And I think these technologies flatten some of this stuff out. I don’t know what gets filled in, I don’t have a scientific answer for this, but it’s different.

[00:04:33] And so you might need to do more explicit, either you cut somebody off and you say, no, you go ahead or leave an extra long time. I’ve talked to people, I’ve been watching this lately, and some people you talk to don’t do anything while you’re talking. And then when you stop making sounds, they also don’t do anything for a beat or two beats.

[00:04:56] It feels unnatural because we don’t talk in person like that. And so as you interview somebody over these kinds of media, you have to be prepared to re-normalize every time to that person, who’s not thinking about it, who’s just doing. They’re reacting to these kinds of tools and these environments. So I think it’s subtle, and I can’t say it’s easy to be good at, like, do this and then you’ll have it work out.

[00:05:23] This is a challenge anytime you meet a person: figuring out what’s their speed, what do they give back, what do they not give back, and how do you adapt to that. And it’s going to be harder over remote. But maybe if you do this a lot, your brain trains itself to look for other kinds of things.

[00:05:41] I’m making gestures right now while I’m talking. Or even just being comfortable with radio silence, or things that we wouldn’t normally do in an interview, where we need to shift to what is less comfortable. And I think we’ve had a few years of doing this all the time for work. I think it’s a longer-term sort of change.

[00:06:00] We don’t talk about it as much as we did in, like, the first eight months of the Zoom era of the pandemic. But in 5 years or 10 years, I think you’re going to see some more detailed analysis of how this kind of change affects our interpersonal dynamics. But that’s an example anyway; the rest of it works differently and you have to work harder to pay attention.

[00:06:22] I wonder if we need to have the eye contact thing fixed, because if you’re not making eye contact it seems like you’re not focusing. But how many times are you on a call with someone who’s doing this because they have a second monitor, and you’re looking over there and talking? I would have to be looking at you closely to see if you’re making eye contact, and I’m not, because that’s very… I think some of those Zoom fatigue studies talked about how stressful it is, the drain on the body, to monitor your face for those cues.

[00:06:52] Which is different than if we’re in person and I can see the whole body from the right distance. I used to find it weird, and now I find it less weird, even though I think we’re in a transition state around these sorts of norms as they’re being refactored in this context. Yeah, a lot of this stuff that’s so fundamental to person-to-person connections is in flux.

[00:07:12] Chicago Camps: With the shift to folks doing more remote work, how have remote user interviews impacted the process and quality of the insights gathered?

[00:07:20] Steve Portigal: Let’s just start with my bias and kind of work from there. I haven’t done an in person interview for whatever, however many years it’s been. And I really miss it.

[00:07:30] I think part of it is like the things we’re talking about, the comfort of the connection and how much work it is. I think that’s my convenience or inconvenience. But I miss what happens to me when I go, quote, out in the field, is that I connect with somebody in a different way, I get exposed to a different environment, I’m really provoked to think deeply about something.

[00:07:53] That person and their story, and of course the thing that I want to understand about them, are the things that we have to do, and I’ve always valued research as an experience. It generates reflection and thinking and, like, synthesis before synthesis, just sense-making from these experiences, because they can be profound in a lot of ways: intellectually profound, you know, personally profound.

[00:08:17] And I just, I haven’t found the kind of research that I’ve been doing in the last few years to be changing me that much. And it’s a little bit of a, I gotta ask myself some questions like, is this just because of what I like or am I being as effective? I think remote also makes it harder to collaborate.

[00:08:37] I think you go out in the field, you drive in a car or take a train or something to some environment, a house or an office or a playground, sometimes you do that with a colleague and you talk with them beforehand about the topic and you talk with them about who else you’ve interviewed and you have the interview and it’s a shared experience and then you talk about it in the car on the way back or when you go get a slice of pizza before you go back to the office.

[00:09:00] And you have that experience that creates a bond with your team, but also you’re surfacing stuff that makes sense through these informal or even formal debrief conversations. And yeah, I could schedule somebody for a 50 minute block and hope they show up in the zoom after the interview, after they’ve gone for the bio break and they don’t have to drop off for the next meeting.

[00:09:21] It feels harder to get them and get their minds and get their attention. And we’re not even into sitting down and like having a workshop to synthesize or having a session to talk about what this means and what we’re going to do. So there’s a remoteness in the way that someone’s emotionally distant.

[00:09:38] There’s remoteness to all of the work that I think I’m talking about research, but I think anybody that does anything in a knowledge worker kind of realm has their own version of this. And I miss those things too, they’re part of the experience and they’re part of how I become more excited and confident and insightful about the people and whatever it is that we’re doing.

[00:09:59] So I’ve had to lift a lot more on my own. And I think that I find out really interesting things that are helpful and actionable. But I’m so biased about how the experience feels different. And what am I going to say here, that the research I’ve been doing for the last few years is not as good?

[00:10:16] Why would I say that? People pay me to do this work. But I will say that the experience has felt different, and I worry about what we’ve lost or what’s missing, at least for now.

[00:10:29] Chicago Camps: Do you find that going to someone’s environment or meeting in person in an environment gives you even more context clues to influence research outcomes?

[00:10:41] Steve Portigal: A hundred percent. The great thing about research in some environment is the thing that you didn’t know you’re going to ask about.

[00:10:49] Yes, there’s context clues. If you go to an office, you have to get there, go in the front door, go sign in through security, take an elevator, walk into the kitchen so they can give you a coffee or something like that. There’s so much to see there, even though it may sound boring when I’m saying it, but you walk into company A versus company B and you’re thinking about where your products are being used.

[00:11:15] You haven’t even got to, like, the server room or the config screen or anything that’s actually on the product. You’re just in those environments, and it teaches you a lot: the person that you’re going to meet works inside a culture, works inside a context where there are rules and norms and expectations, and that’s really different from place to place oftentimes.

[00:11:36] And so it’s nice to get those for free. You get those things. And then yes, there’s all the stuff that you didn’t know that you were going to ask. Oh, I see something else on your desktop. I see something in your… what’s that? You’ve got a poster from the Avengers on your wall. Why do you have that? Oh, that was from a retreat where we all talked about our mission.

[00:11:53] You can give people the chance to talk about what they really care about, what they’re passionate about, like their thing. In their environment are all the clues about their thing, right? Most people present themselves in their space. You can ask for some of that, but some of it seems like a heavy lift, right? Like we want to get your time.

[00:12:11] Also, can you take a picture of your cube in case there’s any bobbleheads I want to ask about later? That’s probably not what you’re going to do as the homework assignment for your pre-research activity. I love being able to see that stuff. And even if we don’t even talk about it in the interview, when you get back in the car, you’re like, did you see?

[00:12:28] They have a Stephen Hawking bobblehead. Of course they do. That guy’s into this. You have things to pick up and talk about, and that you have to be very intentional about now. It’s hard to get that serendipity, which is the fun of research, right? The stuff that you don’t know you’re going to learn about, the stuff that opens the problem up, is just harder to do in the box.

[00:12:48] Chicago Camps: And now a question from our live studio audience, Mario asks, what are some of the ethical implications of user research in design and how can researchers ensure they are respecting the privacy and dignity of their subjects?

[00:13:02] Steve Portigal: There’s a great resource by Alba Villamil. It’s called “The Ethical Researcher’s Checklist.” It’s this great document that just asks all these questions, including one that is so great I would never have thought of it: should we be doing this research to begin with? She’s done a lot of work, and there are other folks too, and I’m kicking it to these folks because they really are much more thoughtful and insightful about this than I am.

[00:13:30] I think resources like that are good. I love that she’s written something like that and shared it. We’re not going to pull it up and go through it bit by bit, but I think it’s something to check out, because there are questions in there that we wouldn’t have thought of. There’s a surface-level response to your question, which I’m trying to avoid here, but let’s talk a little more specifically: privacy and data are really interesting.

[00:13:52] And privacy: organizations, I think, that have good data privacy policies ask, what do we collect? How long do we keep it for? How do we minimize what we collect? How do we make it transparent to the person what it is we’re going to collect, how long we’re going to keep it for, why we’re collecting it? How do we consent? We talk about consent sometimes as, like, a binary. But I think you can be more nuanced with your consent and say, do you agree to this? Do you agree to this? Do you agree to this? And give people an out.

[00:14:23] The way in which you ask for consent, I think you can do it in a way that reasserts the power that we have over people, and I think there are many more sort of kind ways to do it. You can even see some experiments about how we consent in a way that makes sense for the person we’re talking to. I think I link in the book to the experiment they did at Sesame Workshop, where they created these Muppet videos in many different languages that show a scenario of a family being visited by an interviewer and consenting. People are going into environments to do research where there might be low literacy as well as different languages.

[00:15:05] So there, a printed form on corporate letterhead is not the right thing. How do we make people informed partners about what’s going to happen and why, and even just how their information is part of it? There’s a lot to this, and I’m about at the end of my expertise on it, but it’s a starter response anyway.

[00:15:25] Plus Muppets.

[00:15:26] Chicago Camps: What’s your approach to ensuring that the feedback gathered from user interviews is effectively communicated and incorporated into the design process?

[00:15:35] Steve Portigal: First part of that I think is that you have to do something, you have to make sense of what you gather and some of this kind of goes to maturity of any individual practice.

[00:15:45] I think the less experienced folks are, the more they want to just take what they remember about what was said and type it up. And that verb is… that’s stenography maybe, or collation, as far as you get: you put these pieces together. And then you’re just taking requests or gathering complaints; you might as well use a survey for that.

[00:16:05] I think it’s the iceberg model, right? Some of it is above the surface, but a lot of it is below the surface. Below the surface means going back to what was said and looking at it, and making inferences: what wasn’t said, how was it said, what’s said at the beginning and what’s said at the end, and that’s just within one interview. What did person A say, what did person B say? And there’s a whole new chapter about this, the analysis and synthesis process.

[00:16:33] And some folks say that the ratio should be two to one: for every hour of feedback that you gather, you should spend two hours analyzing and synthesizing. And I think in a less evolved practice, it’s the inverse. You might spend half an hour for every hour, or even less. The caveat here is not every research question merits this.

[00:16:54] If we are looking for, I don’t know, choice preference between something and something else, we might be really clear about what that is. We come back and say, do this. But for anything where we want to understand why, or understand opportunities, or understand motivation, a new space we want to go into, characterize a customer that we haven’t worked with before…

[00:17:14] It really is worthwhile to go and do this analysis and synthesis. How do we have impact? We have to have something impactful to say; that’s why I’m saying that. Some other factors that I think can make or break it: working collaboratively with stakeholders, the folks that you want to inform, influence, and have take action, before you do the research.

[00:17:37] And so having an understanding of what business challenges are or business goals. Like what are we trying to do as a company? And then formulating really good research questions. What are we going to learn in order to inform that? And then choosing methods and approaches that kind of can support that. And not doing that in a vacuum.

[00:17:55] And doing this has the effect of switching your role from being reactive to proactive. I think it’s hard to have an impact with reactive work. Those requests that come in are often late. They’re often based on a shallow assumption about what kind of value research can provide.

[00:18:15] And so you are going to give a thumbs up, thumbs down in some direction, and your sort of role as a provider of these kinds of insights is diminished. If you can be proactive, which means maybe understanding a roadmap or what decisions are being made or who else is going to do what, and proposing research on your own roadmap that is intentional and ahead of time. And you leave space, of course, for things that come up, fire drills and so on. But working in a proactive, collaborative way, aligning on goals, and then putting the effort in to make sense of things, changes the whole conversation about what you’ve learned when you get to that point of sharing with somebody.

[00:18:54] Chicago Camps:  Another question from our live studio audience. We’ve seen how the pandemic shifted the dynamics of interviewing. How do you see AI changing that dynamic once again, if at all?

[00:19:04] Steve Portigal: I like the “if at all.” You can tell when a researcher comes in because they write the questions really well.

[00:19:09] It can’t be a thing about any topic where we don’t talk about AI a little bit, right? I’m sure that the Tent Talk has an AI question for every single topic imaginable. Yeah, I don’t know is my short answer and it’s interesting because I feel like stuff is changing really quickly. It’s funny, I’ve stayed away from this because I am curmudgeonly and old and I feel like every sort of shiny thing comes along and people want to talk about, I don’t know, user research, data privacy on the blockchain.

[00:19:38] And then we stop talking about it, and I feel just grouchy about it. So I’m just going to own that, and I mostly stayed away from it. There was a really great article: a social science researcher in Berkeley took one of those “Hey, you don’t need users, we can generate user responses” tools.

[00:19:56] She wrote this great article where she just did an experiment. She took this tool, and she took her own project, which was about some kind of civic participation in the Berkeley area, and she just did a comparison of what it produced. That I’ll engage in, because that’s someone giving a clear answer and not pontificating.

[00:20:13] I am working for a client right now who’s asked me to do some desk research on this topic. So I’ve actually been reading about it, and it’s fascinating to see there’s stuff that works. I guess people that are doing coding are working with these tools in kind of a partnership. And there was an interesting New Yorker article the other day, I think it might be about the death of coding or something like that, that’s a little philosophical but describes it.

[00:20:37] And then there’s a lot of other stuff. It’s classic Silicon Valley, the vaporware stuff. It’s hard to see the difference between “this is what AI could do” and “this is a thing that exists.” That’s just my brain dump on this question. I had a conversation with someone that I respect the other day, and they said to me: a large language model can summarize, but it can’t synthesize, because it can only be based on what is. So summarization is a great use of that, but synthesis isn’t.

[00:21:08] And so maybe that’s fine, right? Maybe that’s how we’ll continue to use these things. So, I don’t know. There’s just so much; everybody has an opinion and writes a LinkedIn post about it and so on. I think it creates stress for us all to try to figure out, how should we be working? What skills should we be developing? Will we have jobs?

[00:21:27] The coming robot revolution feels very complex. Which is why, I hear you, I am grouchy about it, and I don’t have any good answers. I’m thinking about it. So I appreciate the “if at all” caveat in the question. I don’t have the citation off the top of my head, but someone did a comparison: they uploaded screenshots to ChatGPT and asked it to find UX flaws.

[00:21:49] And then they had some experts do a heuristic analysis. I don’t know enough to critique their methodology, and it was a general-purpose large language model versus a domain-specific one. Anyway, lots of false negatives; it really performed very poorly. The next iteration could work well.

[00:22:08] There’s where we are today and where we are tomorrow. And then unrelated to that is an article in The Atlantic by Ian Bogost, who writes about all sorts of things. He talked about using MidJourney and giving it weird prompts, but not to replace illustrators. He’s writing in a magazine, and they throw around ideas that are going to be cover stories or whatever.

[00:22:31] And I think this is part of their Slack dialogue: what if burgers were sentient? An Atlantic Monthly cover story on that. And he’s built this practice for himself of putting that into MidJourney and getting an image of a cover. He’s using that as an outside-the-box creativity partner, is what I’m trying to say.

[00:22:50] Not to make the thing, but to make a realization of a future, like an absolutely crazy thing, and then be able to see it. So you can think about doing a “How Might We?” activity, or even just feeding it into a creative session warmup and asking people to make visualizations of fantastical stuff. I think it lets us see ridiculous things as possible, which is a way of being.

[00:23:17] We’re not trying to replace an illustrator. We’re trying to create the absolute most bonkers stuff in a way that’s grounded. I don’t think he’s trying to just be crazy, psychedelic-mushroom crazy. Huh, now that I see it, I have a different thought about it: should I approach it? Would I approach it? How do I feel about it?

[00:23:35] Creativity is a hard thing to untangle there. But it’s not replacing repetitive labor that a human should be doing; it’s extending what collaboratively we can do into the impossible. I don’t know what to do with that for research. I guess I do, I said it could be a How Might We session, but for the act of gathering data or analyzing data, I don’t quite know what to do with that.

[00:23:58] It starts to say there’s a lot to this technology that hasn’t really been considered. So yeah, a few things that are for everyone to think about and know more than we know.

[00:24:10] Chicago Camps: And now a question from our live studio audience. Mario asks, throughout your career in user research and design, what has been your most profound learning experience and how has it shaped your approach to work?

[00:24:23] Steve Portigal: I learn about a lot of kinds of things. Like I run my own business and I write proposals for clients and have contracts and have to go through a procurement and a lot of not very interesting stuff. In that way that people who do research think that maybe when they start, they’re going to just be doing research and you realize, Oh, there’s a lot of other stuff.

[00:24:42] Like 10 percent of my time is in the field and the other 90 percent is meetings or analysis or socializing or whatever it is. So when I think about profound learning experiences, my mind doesn’t go to doing research. I’ll give an example of that. My mind goes to being a business person, being an independent consultant.

[00:25:03] And the more that I do this, and it’s been 22 years or something like that since I started my own practice, the more I do, the less I feel like I know. Just around, I don’t know, how to have a certain kind of meeting. Like I had a sales meeting the other day, and at the end of it I thought, wow, I don’t really know what I’m doing.

[00:25:22] Like I’ve been doing these meetings forever, and I don’t know if I know what I’m doing. And I was talking to someone at lunch and they’re like, why don’t you do some research about this? Why don’t you create some protocols for yourself, like you might for field work? So yeah, I’ve been doing this a really long time, and I worked at an agency before that.

[00:25:39] And so I guess that is… it’s an ongoing profound thing: well, geez, I don’t know how to do this, I don’t know how to do this. I’m lucky to be in a community of practice, not just with researchers but freelancers and consultants, to constantly get advice and guidance and help and suggestions. Maybe that’s the more honest answer.

[00:25:59] Maybe the more sort of researchy answer is thinking about an experience I had interviewing somebody. I went to, like, an architecture consultancy that had a nice lobby with the last name of the person who founded the agency on the wall. Have you ever been to, like, a creative company’s lobby?

[00:26:18] You know where I was. And there’s just people bustling around, and many of them are young and extremely well dressed. And I was younger then than I am now, part of my excuse here. And so the person I’m meeting with is, like, the founder of this company. And I don’t know, like, how does he have time to meet with me?

[00:26:36] And so this is going to be a story about bias, so let me just define that there. He comes out, he’s like a gray-haired guy, takes me to the conference room, we start talking. And at some point he starts talking about the structure and how it works and what the roles are and what his short-term and long-term aspirations are.

[00:26:55] And when he says his long-term aspirations, I hear myself being surprised that he has those, and then I realize why I am surprised: because I had concocted a whole narrative about this guy based on these various cues, in what’s clearly, like, ageism, right? Age bias on my part. I saw him as the guy that’s a figurehead.

[00:27:16] He’s not really involved, he’s past some peak. And when I had that moment of realization, and you can judge me for it, and that’s, I think, what telling the story invites, it was like a great moment. It was a really profound and really positive experience, because I discovered my bias, and in discovering my bias, I was able to see past that.

[00:27:39] And this isn’t about being a better person in the world. I was there to learn something about this class of customers for my clients. And if you don’t get past your bias, you just return with the same bias. It’s cool to learn something. That’s what I love about research. And so hearing this guy talk and realizing I was making up all this stuff about him.

[00:27:59] And now I understand it, once I got over myself; that’s so cool. And once I saw that pattern, I realized it was a common thing, that there are often moments like that. I think that’s what helps us move around in the world and try to survive in it. We have to quickly assess things that we encounter and come up with narratives about them.

[00:28:19] But the joy of doing this work is that you get to overcome that and revisit what it is you assume about the world and see in fact how it’s actually different. And that is, that’s joyful, that’s a privilege. And yeah, if you have a bias then you’re just human. That’s just how human beings work. We have a lot of shaming around those things and I think it’s good to set cultural standards about what we expect from people and how we want to treat each other and how we want to be treated.

[00:28:48] I think we just have to be fair to ourselves and say, yeah, we operate that way. That’s human nature. Those are cues that we’re given from whatever television stock characters, whatever that is. But we can make choices and we can revisit those. And research is a really great way to do that.

 

Event Details
Interviewing Users
Free
November 20, 2023, 5:00 pm to 6:00 pm