
Derek Frisicchio for Understanding UX practice for pedagogy

May 22, 2024

Video Transcript


Speaker: Derek Frisicchio

Tell us about a time when storytelling had a positive impact on your work.

Derek Frisicchio: One time storytelling had a positive impact was at my last job, where I was working on a streaming user interface for the TV. The research question I was dealt was basically finding pain points of navigation within the journey, and the feature in question was a shopping feature in the streaming app. Basically, the user would have to find a live show, click a kind of side tab for that show on the remote, select the items they wanted, select the size, color, and quantity, and then they would receive a QR code with a link to their cart. We didn't really have a great way to do a quantitative study to find out where most of the pain points were, but in the analytics we noticed that not a lot of people were coming into this feature, and that there weren't a lot of people, especially new people, navigating to the live shows, which was really our bread and butter. You want to watch a live show to see what the live sale is, because there's usually a product associated with the show. So what we did was run a first-click test showing the different screens along the happy path of the journey. We told participants the overall goal: find a product in a live show that you want to buy. Then, starting from the home page of the streaming app, we asked: what would you click on first? We found, which we weren't really expecting, that participants had a really low rate of click correctness on live thumbnails from the home page, which is a big red flag.

We also learned that even though participants made the incorrect click, they thought they had made the right one, which to me is an even bigger red flag: we knew they clicked incorrectly, but from their end it was a false positive. So we worked with the design team and ventured out asking: what are the biggest content libraries that have live shows, and what do their live thumbnails look like? We were able to isolate certain elements into new live-show thumbnails to see how effective each one was on its own at improving click-correctness rates and perceived task completion. We learned that a thumbnail with a red "live" tag in the top-left corner, combined with an image thumbnail of the show, really helped with click correctness and perceived task completion. We also learned that people were expecting the thumbnail image to actually be a video. This was around the time YouTube and Netflix started making thumbnails play the video in the feed itself, instead of having to click into it, so it was kind of a window into the future of helping people select what content they want to watch. We were able to deliver this all the way up to the VP, saying this is a really good, well-thought-out plan, and we recommend launching these thumbnails with the live tag, the video in the feed, and a certain size for the title. Those were launched in an A/B test, and adoption of live-show thumbnails and retention increased exponentially.

People started spending a lot more time in the streaming app and finding the most relevant content for them. That was probably the biggest impact I've had in my career so far.
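[Editor's note: as a rough sketch of the analysis behind the two red flags above, a first-click test yields, per participant, what they clicked and whether they believed it was correct; the warning signs are a low click-correctness rate paired with a high "felt correct" rate among wrong clicks. The data and field names below are hypothetical, not from the actual study.]

```python
# First-click test tallies: what each participant clicked on the home page
# and whether they *believed* it was the right choice.
# (Hypothetical data; the real study's format is not described in the talk.)
responses = [
    {"clicked": "live_thumbnail", "felt_correct": True},
    {"clicked": "search_bar",     "felt_correct": True},
    {"clicked": "vod_thumbnail",  "felt_correct": True},
    {"clicked": "vod_thumbnail",  "felt_correct": False},
    {"clicked": "live_thumbnail", "felt_correct": True},
    {"clicked": "search_bar",     "felt_correct": True},
]

CORRECT_TARGET = "live_thumbnail"  # the happy-path first click

def first_click_metrics(responses, target):
    """Return (click-correctness rate, share of wrong clicks felt as correct)."""
    wrong = [r for r in responses if r["clicked"] != target]
    correctness = 1 - len(wrong) / len(responses)
    # "False positives": wrong clicks the participant believed were right.
    felt_right = (
        sum(r["felt_correct"] for r in wrong) / len(wrong) if wrong else 0.0
    )
    return correctness, felt_right

correctness, false_positive = first_click_metrics(responses, CORRECT_TARGET)
print(f"click correctness: {correctness:.0%}")         # low  -> first red flag
print(f"wrong clicks felt correct: {false_positive:.0%}")  # high -> second red flag
```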

Is there anything you want to add about the previous story that you were not able to answer in the five-minute limit?

Derek Frisicchio: The only thing I wanted to add was that it was a little difficult working with the design team responsible for designing this streaming UI and user experience. We were working with a vendor who said they specialized in making streaming user interfaces and user experiences, and there was a lot of pushback from them on all these new thumbnails that were launched. It really boiled down to them learning that they weren't the experts in the field of making thumbnails, really basic thumbnails for the TV. There was also the fact that we were making the thumbnails with our internal designers and not using the vendor's, which is actually what our business wanted to do; we eventually wanted to move away from this vendor. So that was something internal I was not expecting to deal with. We made a better experience for the user, but the ownership of the responsibility for the design caused a lot of friction in the politics that were going on there.

Now, tell us another story. Tell us about a time when storytelling did not work as you planned/hoped.

Derek Frisicchio: Yeah. Something that didn't work was in my current position, where we needed a quantitative quarterly benchmark of the products in our company's product portfolio. What that would basically help us do, or at least what we wanted and thought would happen, was help create initiatives for the product team, and let us measure the health of the product: the ease of use, positive attitudes toward the product, intention to use the product, among other things, on a lagging quarterly basis. They were lagging scores because they weren't answered in real time based off of events. The launch of it got a lot of applause, but what we ran into was trouble telling a story like: these attitudes are high drivers of ease of use, and some of these measures correlate with each other. We were working out which ones correlate with each other, partly just internally on our research team, to make sure we were filtering the questions and that every question was valuable on its own. Over time, what we learned was that after the product team launched features, and it was a lot of features, there were cases where we couldn't explain certain increases, because it was hard to align them with an event or a feature. It took a lot of math, data cleaning, and analysis, and there were more cases than we thought where the product team was not happy, because they saw that things in their domain were not increasing, or might even be falling quarter by quarter.

So now what we're doing is turning this benchmark into pop-up survey questions, and we had to get rid of the positive-sentiment measures, because we had a hard time explaining what they were and advocating for how to measure them and how to manipulate them. That's something I haven't really figured out how to get around, especially with the team I was working with to make this quarterly benchmark.

Is there anything you want to add about the previous story that you were not able to answer in the five-minute limit?

Derek Frisicchio: I didn't hit the five-minute mark for the last one, but I'll add this: on the team I was working with on this benchmarking, one guy is just extremely intelligent and has a lot of experience in human motivations, in knowing how to measure them, how to statistically test them, how to describe the data. But I think what happened was he was too verbose. There were situations where he was a talking head, and he knows that; I've done that myself, and we're still working together on helping him with that. So I just wanted to add that although he was the expert on this, and I was helping gather the data, clean the data, and run statistical tests on the data we gathered for this benchmark, the explanation turned into a really verbose one, especially in the report, making it really hard to read. It got better; it just didn't get better in the time we needed it to.

Is there anything else you would like to tell us?

Derek Frisicchio: Throughout my career as a researcher in the UX field, I just wanted to add that I've kind of evolved the way I tell a story. When I was training and learning at Pratt Institute, we were given ample opportunities to learn how to tell a story, and I learned that really well in the program I was in. But when I came into my first job as a researcher, I had to adapt to the way they wanted me to present, which I understand: I want to follow before I lead, and understand before I suggest changes. And the presentations were just extremely stale. There were times when people who worked in research, I was told they worked for research, didn't understand the difference between insights, observations, and feedback. And there were situations where a researcher would be talking in a presentation for 45 minutes. I learned that that's not fun, it's really boring, and people tend to turn to their laptops after even the seven-minute mark of talking. So instead, what I do for storytelling now is create a one-page report that has a massive appendix. Before my meeting or presentation, before I tell a story, I make it more collaborative instead of just me saying it: I share the hard data, the clean data, with the team, along with the interpretation of that data, whether it's qualitative or quantitative, and I send out a memo of the presentation we're going to go over.

Usually I ask the stakeholders to read the memo beforehand, or if they don't, we just read it during the meeting, and then we discuss, we debate. I think that's been helping people be heard; it's been helping people understand what research is and what the purpose of certain work was. And if I'm in person, I try to bring a baked good or something, just to get people more interested, which I think is funny. So over time I've evolved from a basic format of "here's the background, here are the research questions," and so on. I don't really share that a lot anymore, because the stakeholders already know what I've been working on for months. So that's kind of how I've been doing it.



Produced with Vocal Video