
Episode number: 74

The QA Process

with Lewis Francis

Summary

Quality Assurance isn’t just testing. QA is a process that spans the lifecycle of a project. Lewis Francis, Director of Quality Assurance at Threespot, joins the show with a primer on QA for websites and apps. He talks about the difference between quality assurance and quality control, and explains where QA should fit into the project cycle. Lewis details how budget can dictate the scope of QA, but also the elements that are important for any QA process. And he shares a bunch of great tools and resources, from automated full-site validation to front-end testing to issue tracking.

Sponsored by

  • Craft CMS - Craft Commerce

Episode Transcript

CTRL+CLICK CAST is proud to provide transcripts for our audience members who prefer text-based content. However, our episodes are designed for an audio experience, which includes emotion and emphasis that don't always translate to our transcripts. Additionally, our transcripts are generated by human transcribers and may contain errors. If you require clarification, please listen to the audio.

[Music]

Lea Alcantara: From Bright Umbrella, this is CTRL+CLICK CAST! We inspect the web for you! Today Lewis Francis joins the show to talk about the ever important process of quality assurance testing. I’m your host, Lea Alcantara, and I’m joined by my fab co-host:

Emily Lewis: Emily Lewis!

Lea Alcantara: This episode is brought to you by Craft Commerce, a brand new e-commerce platform for Craft CMS. If you’re a web shop that likes to create custom-tailored websites for your clients, you’re going to love Craft Commerce. It’s extremely flexible, leaving all the product modeling and front-end development up to you, and it’s got a simple and intuitive back end for content managers. To learn more and download a free trial, head over to craftcommerce.com.

[Music ends]

Emily Lewis: Today we’re excited to have Lewis Francis on the show. Lewis is the director of quality assurance at Threespot, an interactive agency in DC where he basically breaks things for a living. Welcome to the show, Lewis.

Lewis Francis: Oh, thank you.

Lea Alcantara: So Lewis, can you tell our listeners a bit more about yourself?

Lewis Francis: Well, I am a father of two young adults, and I do photography for fun. And not any kind of photography; I actually sneak into abandoned buildings. Now, this isn’t actually public? Are people going to hear this with me talking about it?

Emily Lewis: [Laughs]

Lewis Francis: Maybe I shouldn’t be talking about my hobbies, but yeah, that’s the kind of thing I do when I’m not breaking things for a living.

Emily Lewis: Well, that’s cool. Is your portfolio online? Do you show your photographs publicly?

Lewis Francis: I do. I have a Flickr account. Of course, it’s flickr.com/lewisfrancis, and if you go to lewisfrancis.com, I’ve got a little single page with the links to that account and my Instagram account.

Lea Alcantara: Oh, interesting.

Lewis Francis: So if you’re interested in seeing those, yeah, take a look.

Emily Lewis: Cool. So for today’s topic, I wanted to start with the basics. How do you define quality assurance as it applies to the web?

Lewis Francis: Well, quality assurance, I guess what we first have to define is what quality assurance means, and in the larger QA world, quality assurance is a kind of process that begins at the start of an engagement and ends after a delivery, and what we do or what I do is closer maybe to quality control.

Lea Alcantara: [Agrees]

Lewis Francis: Because I’m not always able to be involved in every single project our agency is working on.

Lea Alcantara: Right.

Lewis Francis: So that’s one of the major differences. What I do try to do is be as involved in projects upfront as I can, and if I’m unable to do that, to make sure that our folks are well versed and knowledgeable about the kinds of things that I would ask if I were in their place.

Emily Lewis: So if you’re doing quality control, can you clarify how that’s different from quality assurance itself, like what the nuances between the two are? Is quality control a more management kind of aspect of it?

Lewis Francis: Now, what I do really falls somewhere in between those two points on the spectrum. Quality control typically would be the end-stage process where your tester doesn’t see the product until it’s ready for testing.

Lea Alcantara: [Agrees]

Lewis Francis: Traditional quality assurance is your tester is involved from the very beginning.

Emily Lewis: Oh.

Lewis Francis: So he or she might be in team kick-off meetings, so he might be asking questions that other folks that are more focused on production or budget might not be thinking about. So depending on the project, the client, the budget, actually, I see myself falling somewhere along that spectrum. These days, it’s typically a little closer to the end. I think that’s really because our people are a lot better now.

Lea Alcantara: Right.

Lewis Francis: We’ve been doing this for a few years and they already know much of what I’m going to ask because they know if they don’t ask, I’ll be mad at them.

Lea Alcantara: [Laughs]

Emily Lewis: [Laughs]

Lea Alcantara: I feel like it’s a little bit of a semantic issue because the term “assurance” means you’re kind of confirming something while “control” is kind of — well, I guess I don’t know. I guess, what is the difference between making sure everything is as people expect or what? I guess I’m still a little bit confused over the assurance and control part, like what is the difference between starting at the beginning and dealing with things at the end?

Lewis Francis: Well, it is a tricky thing to wrap your head around. It took me a while to really understand it. Assurance, the process of ensuring or assuring quality in a product, is made easier by being inserted in the process earlier. So if you can solve a problem before it becomes a problem, it’s a little easier.

Lea Alcantara: Right.

Lewis Francis: And this is a very common thing. For instance, you’re working for an agency that requires by law Section 508 compliance, and if nobody brings that up until the very end, after everything has been coded, after all the designs have been done, then that’s going to be a problem. It’s going to be harder to fix at that stage than it would be if it had been brought up earlier.

Lea Alcantara: Right.

Emily Lewis: That makes sense. What you’re describing makes me think a bit about a kitchen because I worked in the restaurant industry a lot in my 20s, and it sounds like the quality assurance is like the cook who’s constantly tasting as they’re building the recipe.

Lewis Francis: Right.

Emily Lewis: And the quality control is the expediter who makes sure the plate is perfect before it goes to the user.

Lea Alcantara: Right.

Lewis Francis: That’s a great analogy.

Emily Lewis: Is that right?

Lewis Francis: It totally is. That’s an awesome analogy.

Lea Alcantara: Mind blown.

Lewis Francis: [Laughs]

Emily Lewis: [Laughs]

Lea Alcantara: [Laughs]

Emily Lewis: I think I’m hungry that’s why I’m thinking about the kitchen. [Laughs]

Lewis Francis: [Laughs]

Emily Lewis: But what you’re also describing, Lewis, we actually talked on our last episode with a UX designer that a really good UX design actually starts with making sure that you’re building something that needs to be built and then testing all along the way to make sure that what you confirmed you want to build in the beginning is still going to be that in the end.

Lewis Francis: Right.

Emily Lewis: So with your quality assurance, you’re describing someone, or maybe it’s more than one person, being inserted into the project from the very beginning and sort of being there throughout. Is that person a QA person, or is it more like at each phase of a project, a person, let’s say the designer, the front-end developer, the programmer, is each responsible for their own QA?

Lewis Francis: Well, if you have a project that has a large enough budget and you have enough staff on hand, then it’s good to have someone who’s dedicated to QA in that process from the very beginning.

Emily Lewis: [Agrees]

Lewis Francis: Now, that’s not always possible. When you’re working with smaller organizations with smaller budgets, then if the quality of your work is going to be maintained, everyone in your agency has to wear that QA hat. So a designer has to be thinking about, for instance, Section 508 compliance issues, and then the UX person has to understand what that might mean, how it might apply to some of the interface patterns that they’re designing, and in the same way with technology.

When you’re in a smaller agency, it helps if you can cross-train people. There is some cross-disciplinary knowledge sharing that needs to happen, and I think at Threespot, that’s the kind of thing that we’ve actually gotten pretty good at over the years. Everyone needs to wear that hat and switch it off, especially if you’re in a smaller agency. Anyway, you need to be flexible and have knowledge outside of your sphere of discipline.

Emily Lewis: So when you have this scenario where, let’s say you have someone who’s dedicated to the quality assurance and they’re in from the beginning to the end. What does that look like? Are they just always involved in meetings? Are they more one on one with the individual people with each phase? Is there involvement with the client?

Lewis Francis: It’s usually just participating in meetings, listening to what’s going on, offering input when it’s appropriate to do so, and then after those meetings, sometimes the team or the client will ask a question that we’re not exactly sure how to address or how to answer, so then we regroup privately and discuss the options, and I guess it’s my role to bring up the issues that could happen. Again, like if we’re talking Section 508, “Well, with this particular design, there’s going to be a problem in the Section 508 world, or here’s a way we can get around this problem by simplifying the interface or providing some sort of accessible alternative.” Most of the time, it’s just meetings.

Lea Alcantara: Okay, so for those particular meetings, do you have a standard process on how you parse the information from those particular meetings, like do you have a clipboard of checklists?

Lewis Francis: Yes, we actually have something that’s, oddly enough, called the QA checklist.

Emily Lewis: [Laughs]

Lea Alcantara: It makes sense.

Timestamp: 00:09:51

Lewis Francis: It’s actually just a simple spreadsheet that I’ve templatized and that I ask our team to fill out before testing can begin. Now, ideally, and my ulterior motive for this actually, was to capture this information at the beginning of the engagement when I’m not there, for instance, and to have people thinking about the implications of these things. So the checklist is going to include things like: are we on tap for Section 508 compliance? Which spread of browsers and platforms do we need to support? What’s the back end? Is there an email component on one of the logins? All that sort of thing.
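A checklist like this can live in any spreadsheet, but the same idea can be sketched in code. The field names below are invented for illustration, loosely based on the items Lewis lists; they are not Threespot’s actual template:

```python
# Hypothetical QA checklist template. Field names are invented for
# illustration; None or [] marks a question the team still has to answer.
QA_CHECKLIST_TEMPLATE = {
    "section_508_required": None,   # bool: is 508 compliance in scope?
    "supported_browsers": [],       # e.g. ["Chrome", "Firefox", "IE 11"]
    "supported_platforms": [],      # e.g. ["Windows", "macOS", "iOS"]
    "backend": None,                # e.g. "Craft CMS"
    "email_component": None,        # bool: transactional email on logins?
    "max_page_weight_kb": 1024,     # flag pages heavier than this
}

def missing_fields(checklist):
    """Return the fields that must be filled in before testing can begin."""
    return [key for key, value in checklist.items()
            if value is None or value == []]
```

Running `missing_fields` against a half-completed checklist gives the team a concrete to-do list before testing starts.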

Emily Lewis: So all the different areas that may or may not come into play for a project, you kind of flag it, and that sort of lets your team know that they need to be aware of these things as they’re building, and then at the end, I guess it’s your barometer for whether they built it right?

Lewis Francis: Well, it enables testing. I mean, you have to know what you are testing in order to do a test, and the same thing applies to designers and developers, they have to know what they’re building for, god forbid, IE 8.

Emily Lewis: [Laughs]

Lea Alcantara: [Laughs]

Lewis Francis: If you get to the end of the project and your understanding with your client was you’re only supporting modern browsers and then suddenly they’ve come back and said, “Hey, you know, we’ve got this IE 6 that our CEO uses and he won’t change for anything, so it’s got to work on his machine,” well, that’s a problem. So not only do you need to gather their requirements for browser support, which are typically informed by analytics, if you’re lucky enough to have that, and in these days, that’s pretty typical, but you need to know who the important stakeholders are and what the important stakeholders are using.

Lea Alcantara: [Agrees]

Emily Lewis: And so on this checklist, you’ve got stuff about browsers, devices, 508, email. Are there other common things that you are always making sure to account for in projects?

Lewis Francis: Well, we’re looking for coding standards. If the client has particular coding standards or validation requirements that we need to follow. If they’re doing Section 508, which Section 508 validation tools are they using, because ideally, we will all synchronize with whatever they’re testing, whatever they’re using to validate. What else? I think I mentioned back end and the email stuff. We’re looking at any kind of interaction models that might not be immediately obvious.

Emily Lewis: I’m curious, one thing that we’re noticing a lot more, not just within the industry, but clients are starting to notice it is like the speed, the performance of their website. Is that anything that ever comes up in quality assurance if you’re working on a website project?

Lewis Francis: It does. One of the checklist items is maximum page weight, but honestly, that’s not something that very often gets checked off.

Emily Lewis: [Agrees]

Lewis Francis: I think I have my validation tool’s tests set to automatically flag anything that’s over one MB.
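As a minimal sketch of that kind of page-weight flag (a hypothetical helper, not part of any tool mentioned in the episode), assuming you have already measured each page’s total transfer size:

```python
MAX_PAGE_WEIGHT = 1_000_000  # 1 MB, the threshold Lewis mentions

def overweight_pages(page_sizes, limit=MAX_PAGE_WEIGHT):
    """Given a mapping of {url: total_bytes}, return the URLs whose
    total weight (page plus assets) exceeds the limit, sorted for a
    stable report."""
    return sorted(url for url, size in page_sizes.items() if size > limit)
```

In practice a crawler like SortSite gathers the sizes itself; this only shows the flagging step.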

Lea Alcantara: [Agrees]

Lewis Francis: Honestly, it depends on the client and the budget. We have a few clients that their product is going to be presented on the African continent.

Lea Alcantara: [Agrees]

Lewis Francis: So in that case, it was very important to manage bandwidth and to pay attention on that sort of thing.

Emily Lewis: [Agrees]

Lewis Francis: Bloat is also a consideration, but I don’t think our particular work product is particularly bloated because we don’t really like that kind of site and try to do our best to optimize our product.

Lea Alcantara: Right. So I keep hearing a lot of tools and testing in regards to development, specifically from the code to the back end to, say, accessibility with Section 508, but you kind of touched on design a little bit with, say, “Okay, maybe this is not an accessible design,” but how do you test that? Like what are the different processes when you’re doing quality assurance or control for design?

Lewis Francis: Well, there are a lot of different things to consider, and I will just start out by saying that you can pass Section 508 and WCAG 2 compliance and still not have a usable site for someone who has accessibility issues.

Emily Lewis: [Agrees]

Lewis Francis: But you can automate a lot of that process.

Lea Alcantara: [Agrees]

Lewis Francis: One of the tools I use is from PowerMapper. It’s called SortSite. One of the tougher things is contrast checking; it’s one of the tougher compliance points for WCAG. That’s really tough to do manually, but a tool like this can roll through the site and flag all of it. What it does is it looks at the CSS and the code, and then it flags potential compliance issues, but it can’t do everything, so you have to manually go through that yourself.

Lea Alcantara: [Agrees]

Lewis Francis: And for a designer, the first time you do that, it’s probably really, really painful. I don’t know if you guys have had to work with WCAG’s contrast rules. There are a lot of things that you just can’t get away with if you need to be WCAG compliant.

Emily Lewis: Yeah, several episodes ago, we talked about color and the importance of contrast for accessibility, and I think, Lea, just even after that conversation that we had for that episode, you’ve kind of learned more as a designer about contrast and have incorporated more things into your processes to check for that.

Lea Alcantara: Yeah, absolutely, especially the one that I really liked was Lea Verou’s contrast checker because usually what I’ve done, especially checking what color I should use, is first I choose the color that I want, and if I like it, then I put it through that contrast checker and if it gives me a high number, the higher the number, the better, that’s great. But if it doesn’t, it will give you kind of like a score, kind of. It will tell you if this color will only work if the font size was 36 point, right?

Lewis Francis: Right.

Lea Alcantara: But if you’re trying to make this as a body copy font, then it’s not going to work, and what I like about Lea’s tool is you can actually press the Up key to just keep moving the color contrast a little bit higher or lower, depending to just tweak the shade enough, and it’s like a super easy way for you to just be like, “Okay, it’s still in the same shade range, what’s the better contrast?” And then you just play around with the arrow keys until it gets you the score [laughs] that you like while showing you a good preview of what that even looks like, so you’re not feeling compromised as a designer as well.
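The math these contrast checkers automate is defined in WCAG 2: compute each color’s relative luminance, then take the ratio of the lighter to the darker. A small sketch of that calculation (the helper names are my own):

```python
def _linear_channel(c):
    # sRGB channel (0-255) to linear-light value, per the WCAG 2 definition
    c = c / 255
    return c / 12.92 if c <= 0.03928 else ((c + 0.055) / 1.055) ** 2.4

def relative_luminance(hex_color):
    """Relative luminance of a hex color like '#777777', from 0.0 to 1.0."""
    r, g, b = (int(hex_color.lstrip("#")[i:i + 2], 16) for i in (0, 2, 4))
    return (0.2126 * _linear_channel(r)
            + 0.7152 * _linear_channel(g)
            + 0.0722 * _linear_channel(b))

def contrast_ratio(fg, bg):
    """WCAG contrast ratio between two colors, from 1:1 up to 21:1."""
    lighter, darker = sorted(
        (relative_luminance(fg), relative_luminance(bg)), reverse=True)
    return (lighter + 0.05) / (darker + 0.05)

def passes_aa(fg, bg, large_text=False):
    """WCAG 2 Level AA requires 4.5:1 for normal text, 3:1 for large text."""
    return contrast_ratio(fg, bg) >= (3.0 if large_text else 4.5)
```

This is why a color can pass for a 36-point heading but fail for body copy: the large-text threshold is 3:1 while normal text needs 4.5:1.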

Lewis Francis: Did you find that painful when you first ran your tests?

Lea Alcantara: Well, yeah, because as a designer, you’re like, “This is how I want it to look.” [Laughs]

Lewis Francis: [Laughs]

Lea Alcantara: Well, there are some not just that, but there are branding reasons, right?

Lewis Francis: Right.

Lea Alcantara: Like there were choices in regards to why this particular direction is where it’s going, but sometimes you have to make concessions over, well, who’s my audience here? Specifically, we just put our presentation slides through a contrast checker because someone in the audience, the last time we presented, was like, “Hey, great content, but I couldn’t see your subheads.” It’s because it just wasn’t the right contrast for that particular person.

Lewis Francis: [Laughs]

Emily Lewis: [Laughs]

Lea Alcantara: And so you can’t ignore that, how many other people don’t actually come up to you and say that? When you are trying to convey important pieces of information, you don’t want that to be a stopping point. So I thought having that type of discussion, having that flagged was really important and having these types of tests, even just like as a little check and to understand the concessions you’re making is really important.

Lewis Francis: Right, right. As a 56-year-old, I have an investment in legibility. Yeah, as I get older, I find that small type is just really tough to read.

Lea Alcantara: Right.

Lewis Francis: So yeah, it’s a personal thing that helps me too. It makes me keep everyone honest.

Emily Lewis: [Laughs] I’m curious, Lewis, that was some of the things that Lea has learned with design, and you mentioned that what your designers do and how it’s a difficult experience for them that first time to sort of learn where their solutions don’t exactly align with the requirements, but what about when you get into development, what are some of the tools your developers are using to QA their code, especially for things like WCAG or 508 or even just coding standards with semantics or whatever?

Lewis Francis: Well, our guys and girls do pretty much the same thing that I do. We have access to the same tools, and I think they’re getting to the point where they’re able to automate some of their tests as they come in to their code bases. Yeah, I can’t really think of anything that’s particularly different than what I do.

Timestamp: 00:20:09

Emily Lewis: And some of those examples, you had sent us a couple of links in advance of our call today, about apps and different browser-based tools that you use yourself. Are there some ones that are just super favorites that you always go to?

Lewis Francis: Well, like I said, PowerMapper’s SortSite is really my main go-to. That’s my workhorse. That particular program is available for Mac and PC, and I think there’s actually an online version as well. It will crawl through the site, and depending on the configuration you’ve set up, it will run validation tests, Section 508, WCAG 1 and 2, Levels 1, 2 and 3, and validate your CSS, your HTML, spellchecking, which is wonderful, except when your site is still populated with lorem ipsum.

Emily Lewis: Lorem ipsum. [Laughs]

Lewis Francis: Yeah. That’s really kind of painful, although it’s also good for a final test before you push live. It can easily flag all this placeholder stuff that’s actually been left in, so that’s an awesome tool. There’s another called the Xenu link checker that I’ve used for years and years and years. That’s PC only, and all it really does is crawl a site and then check for bad links, but it does it really, really quickly, so that’s a favorite, and I’ve actually used that not just for QA purposes, but also to do some analysis.

We have a client that we’re interested in talking to. They may not know how many pages are on their sites. They may have a rough guess, so you can run this type of tool and it will crawl the sites and it won’t catch all the orphaned items, like landing pages for email campaigns and stuff like that, but it’s a great tool. It will tell you how many pages, how many Word docs, how many PDFs, the sizes, it’s awesome.
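The heart of a link checker like Xenu is a crawl loop: parse each page for links, then request each link and record its status and type. Here is a minimal sketch of just the parsing step, using only the standard library (the crawl and status-check parts are omitted):

```python
from html.parser import HTMLParser
from urllib.parse import urljoin

class LinkCollector(HTMLParser):
    """Collects href targets from one page, resolved against the page's
    URL. A real link checker would then fetch each target and record
    status codes, content types, and sizes."""
    def __init__(self, base_url):
        super().__init__()
        self.base_url = base_url
        self.links = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            for name, value in attrs:
                if name == "href" and value:
                    self.links.append(urljoin(self.base_url, value))

def extract_links(html, base_url):
    collector = LinkCollector(base_url)
    collector.feed(html)
    return collector.links
```

Counting links by file extension from a full crawl is how such a tool can report “how many pages, how many Word docs, how many PDFs.”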

Emily Lewis: I’m looking at both of the sites, particularly I like the SortSite. I think it’s pretty clever that they’re also able to do a check for SEO guidelines as well. As much as it may drive me insane, SEO is always one of those things that clients care about, and so whether you’re in a sales call or kind of validating for a client that you delivered something great for them, I think that would be useful to know as well.

Lewis Francis: Yeah, it’s great. I get pushback from my team occasionally on areas that are flagged, like titles being too long for Yahoo or Bing.

Lea Alcantara: Right.

Lewis Francis: And people will say, “Well, we don’t really care about any of that. It’s Google who we care about.” Well, okay, you can make arguments about how much title length is weighted into ranking algorithms. That’s why the tool is configurable; you can go in there and play around with it, and yeah, fine-tune it to work with your team and your client’s expectations.

Lea Alcantara: So I was just curious because the couple of resources that you gave, specifically the Xenu link checker, just reminded me of this tool that our colleague actually introduced us to, which was Content Insight’s Content Analysis Tool, which is not free.

Lewis Francis: [Agrees]

Lea Alcantara: What I like about it, though, is you can buy credits as well as monthly plans, so it’s dependent on how many pages you want to crawl; for thousands of pages, you get tiers, and you can basically scan. So it’s not a Mac or PC app. Actually, it’s an online tool, and it does essentially what the Link Sleuth does. It just kind of scans what the pages are on that site in the first place, and I feel like it’s great even before a project really begins because it’s good to scope out the scale of a project. Sometimes when you look at a particular website, it looks like there might only be ten pages or something, but it could be who knows how many pages deep because of all the interlinking or categories or whatever, and something like this Content Analysis Tool can essentially unearth the entire scope of the site in the first place.

Lewis Francis: Well, interesting.

Lea Alcantara: Yeah.

Emily Lewis: Yeah, and I feel like tools like that, especially, Lewis, to tie back to your point about quality assurance being throughout the entire process, I think if you’re dealing with a project that involves an existing site having those initial findings about that existing site as a point of comparison at the end of a new site project is a really powerful tool to demonstrate the improvement, where the investment went.

Lea Alcantara: [Agrees]

Lewis Francis: Exactly. If you’re looking for success metrics, that’s a great way to start. Run it at the beginning of the project and then afterwards, compare the results, and it’s just obvious.

Emily Lewis: So you also sent us a couple of browser-based tools that are actually in my arsenal for front-end development. You listed BrowserStack for testing on real browsers and emulators, which we use all the time. I’m curious what your thoughts are about QA for mobile and using devices versus emulators. Is it a budget issue? Is it an access issue? Is it really whatever works and you go with that?

Lewis Francis: I don’t trust emulators.

Emily Lewis: [Agrees]

Lewis Francis: We did a series of apps for the Brookings Institution, iOS, Android and — who are these guys that we want to go away with all the keyboards?

Emily Lewis: Oh, BlackBerry?

Lewis Francis: BlackBerry.

Lea Alcantara: Right.

Lewis Francis: Which was just atrocious. That was an atrocious experience because the BlackBerrys would give up the ghost if a content page was more than 20kb or something like that. It was insane.

Emily Lewis: [Agrees]

Lewis Francis: It was important at the time because a lot of the Brookings’ target audience were federal workers who were all issued BlackBerrys, but during the QA process for that, we didn’t have access to a lot of hardware devices and so we ended up using a different load testing tool from Keynote that worked pretty well, but we would have bug reports on actual devices that worked fine in the emulators. So since then I don’t trust emulators. I don’t. I recommend that our developers don’t use emulators to test.

Now, the nice thing about BrowserStack is you now have a whole bunch of real, actual devices that you can remotely control. If you’re using Chrome, you can control these real devices, as opposed to Safari, which kicks you over to the emulator versions of the iOS devices.

Lea Alcantara: Oh, interesting. That’s a good distinction there that Chrome, if you’re using BrowserStack through Chrome, it gives you a different experience than if you’re using BrowserStack on Safari.

Lewis Francis: It does, exactly. That’s really the only time I use Chrome. Well, that’s not true. I prefer Safari myself; I think it’s a more elegant experience. If you do use Chrome with BrowserStack, there is still a set of emulators that don’t have real device equivalents, but a pretty wide number of devices do, and so BrowserStack is amazing. It’s worth every single penny that you pay for it.

Emily Lewis: Yeah, I got to agree with you there.

Lea Alcantara: [Agrees]

Emily Lewis: I really do because, especially since they made the investment for the real browsers and that you can also enable local testing.

Lea Alcantara: Oh yeah.

Emily Lewis: It’s really…

Lewis Francis: Yeah.

Emily Lewis: Because it’s not cheap, so it really pays for itself from that regard.

Lewis Francis: It’s not cheap, but maintaining a device library is not necessarily cheap either.

Lea Alcantara: No.

Emily Lewis: Yeah.

Lewis Francis: And your devices are so quickly outdated or they’re difficult to maintain. I would say if you’re developing actual applications, actual mobile applications, then you probably should maintain a device library, and that’s how we started out, but these days we’re not finding many clients that are interested in or have the budgets that would support a dedicated mobile app, so doing web stuff is fine with BrowserStack.

Emily Lewis: [Agrees]

Lewis Francis: I like that BrowserStack remembers which page we’re testing when you switch to a new browser.

Emily Lewis: Yeah. [Laughs]

Lea Alcantara: [Agrees]

Lewis Francis: It’s awesome.

Emily Lewis: We’ve talked a little bit about some tools. We’ve talked about some of the typical things that you’re looking for as part of QA during a project. Have you encountered anything that was really unusual, whether it was unusual client or an unusual project that had some unique QA requirements that you don’t often see, and how you attacked it?

Lewis Francis: No, I can’t really, and that was one of the questions you guys sent me earlier, and I racked my brain trying to think there must have been some situations like that that I could give you a great story about, but I really can’t. It’s more along the lines of, “Here’s what we want,” we deliver it, and then, “Oh, by the way, we really wanted something else.” It’s that sort of thing, or the last-minute declaration that a site needs 508 compliance, or the last-minute declaration that our CEO is using WebTV so the Internet has to work on that, which is a real story. That’s a real story. It wasn’t us, thankfully.

Timestamp: 00:30:04

Emily Lewis: [Laughs]

Lea Alcantara: [Laughs] Wow!

Emily Lewis: Well, what have you done with those? How does your team typically address those last minute situations? I mean, do you just implement? Has that become a business discussion about expanding scope?

Lewis Francis: It does. Well, it depends. It depends on where we’re at in the budget and the time frame. So if there is room in both, then we are inclined to make the customer happy and put out a product that we will all be happy with, and of course, if the relationship hasn’t been great or if there were problems along the way, then we might push back and say, “Well, gosh, guys, we really didn’t spec this out. Resolving this particular problem is going to push your deadline out and it’s going to cost X more dollars. Do you really want to do this?” And that’s not always an easy discussion. Luckily, it’s not one that I ever have to have.

Lea Alcantara: So I’m a little bit curious, you’ve been kind of hinting a little bit about how what you can QA is based on the project budget, but there’s always the best case scenario and then there are compromises, so are there things that you always do regardless of the budget, and are there things that you do if it’s nice to have as the budget allows for it?

Lewis Francis: This is something that comes up quite a bit, especially when we’re talking about projects that have very small budgets: what can we do to ensure a good work product that isn’t going to break the bank or delay launch?

Lea Alcantara: [Agrees]

Lewis Francis: Luckily, a lot of our tools are automated, so they don’t really cost much for me to run, so I almost always run all the automated tests regardless of the budget, regardless of the timeframe.

Lea Alcantara: [Agrees]

Lewis Francis: Where I might entertain some flexibility is in how I’ll write up the tickets, write up the results, or how strongly I’ll argue for fixing the problems, and after all, these are websites. They’re not air traffic control systems where people’s lives are on the line, you know?

Emily Lewis: [Laughs]

Lea Alcantara: [Laughs]

Lewis Francis: So if something goes out the door and it’s a little funky on ancient versions of Firefox, then that’s probably okay to go live with as long as you circle back and address it. So yeah, that’s pretty much the way we look at it.

Emily Lewis: It reminds me there was a point in my career where I was very firm about passing all the validation tests, like a 100%. And then there came a time in my career — and it’s where I currently am now — is where I run things through tests, I see what the results are, and then I use my years of experience to decide, “Just because that tool said X, does that mean I change or is what I have scratching some other itch in the project or meeting some other requirement that has a higher priority.” And I think if there are any developers listening who are like I used to be where you’re like, “Well, I’ve got to spend all this time making sure we get a 100% across the board,” there are times where I intentionally do something that breaks in validation because I know better, not just because the tool told me.

Lewis Francis: Exactly.

Emily Lewis: And I think that’s probably really important for QA is you’re using technology, but it’s the person’s experience that understands where it should be applied.

Lewis Francis: Exactly. When I run the validation tests, there’s a whole bunch of issues that will be returned that we call “facts of life,” because if we didn’t code the page the way we coded it against the validation standards, then it wouldn’t be accessible.

Lea Alcantara: Right.

Lewis Francis: And what comes to mind is using SVG icons on IE. I think all versions of IE, and maybe including Edge, will trap your tab key. If you’re navigating a page with the tab key and you’re using SVG icons for nav elements, you’ll tab to the icon, tab again, nothing apparently will happen, and then the next time you tab, you’ll move to the next icon. So we have to do tricks to make that easier for the user to use.
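[Editor’s note: the commonly cited workaround for this IE/old-Edge behavior is to add `focusable="false"` to inline SVGs so they drop out of the tab order. The helper below is an illustrative sketch, not Threespot’s actual code; it patches serialized SVG markup so the idea can be shown outside a browser, where you would normally just set the attribute on the element.]

```typescript
// Sketch of the focusable="false" workaround. IE (and pre-Chromium Edge)
// put inline SVGs in the keyboard tab order, creating "dead" tab stops on
// icons. In a browser you would call el.setAttribute("focusable", "false");
// this version works on markup strings instead.
function makeSvgUnfocusable(svgMarkup: string): string {
  // Leave the markup alone if a focusable attribute is already present.
  if (/<svg\b[^>]*\bfocusable\s*=/.test(svgMarkup)) {
    return svgMarkup;
  }
  // Inject the attribute right after the opening <svg> tag name.
  return svgMarkup.replace(/<svg\b/, '<svg focusable="false"');
}

const icon = '<svg viewBox="0 0 16 16"><path d="M0 0h16v16H0z"/></svg>';
console.log(makeSvgUnfocusable(icon));
// → <svg focusable="false" viewBox="0 0 16 16"><path d="M0 0h16v16H0z"/></svg>
```

Pairing this with `aria-hidden="true"` on purely decorative icons is another common step, since the accessible name should come from the link or button text rather than the icon itself.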

Emily Lewis: [Agrees]

Lea Alcantara: [Agrees]

Lewis Francis: Yes, it’s going to fail validation, but that’s okay.

Emily Lewis: Right, exactly.

Lea Alcantara: Right.

Emily Lewis: So we’re nearing the end, and I think there’s one question I’m curious about. Are there common misconceptions about quality assurance, things that you hear from clients or from designers and devs who maybe don’t have a lot of QA experience and are wrong about what it is and what it means?

Lewis Francis: Well, I think one of the classic misconceptions about QA is that it will catch all the bugs.

Emily Lewis: Right. [Laughs]

Lea Alcantara: Right.

Lewis Francis: And that’s just not possible. It will catch enough of the bugs to provide a better work product, and I think that’s the goal we’re striving for.

Emily Lewis: [Agrees]

Lewis Francis: Now, obviously, if you’ve identified some high-value items, you’ll need to make sure that those items have passed QA, that they work as intended. That’s probably the classic misconception.

Emily Lewis: [Agrees]

Lewis Francis: Another one is that you can QA your own work.

Emily Lewis: [Laughs]

Lea Alcantara: [Agrees]

Lewis Francis: You just can’t do it. Reporters have copy editors for a reason.

Emily Lewis: [Agrees]

Lewis Francis: You’re too close to your own work to objectively evaluate it. What else? Another one that comes up more often than I’d like is, let’s say you’ve identified a bug and generated a QA ticket for it, and then someone addresses that ticket and closes it as fixed. Well, no, it’s not necessarily fixed; it’s not really fixed until I say it’s fixed.

Lea Alcantara: [Agrees]

Lewis Francis: That’s called regression testing. People need to apply a fix to an issue and then kick it back to the QA tester to validate it.

Lea Alcantara: Right, right.

Lewis Francis: Because it’s so easy to mark something as fixed when it really isn’t or you’ve marked something as fixed and it turns out that that fix, yes, has addressed that particular issue, but it has then broken something else.

Emily Lewis: Yeah.

Lea Alcantara: Right, right.

Lewis Francis: So that’s kind of a process thing that can be a little tougher for people to get their heads around at first.

Emily Lewis: Yeah, I think I struggle with that a little bit. Especially if I’m under the gun, I will get our QA testing or front-end QA testing results and I’ll bang through them and then we’ll just push it live.

Lea Alcantara: [Laughs]

Emily Lewis: [Laughs]

Lewis Francis: Right. [Laughs]

Lea Alcantara: And then I’ll email you and be like, “Oh, Emily…”

Emily Lewis: “Oh, did you notice this?” [Laughs]

Lea Alcantara: [Laughs]

Emily Lewis: I’m like, “Oops.”

Lewis Francis: [Laughs]

Emily Lewis: So I need to be better about closing that loop myself because of exactly that scenario. It happens all too often. Fortunately, it has not caused any major problems; we’re talking CSS issues. But still, it’s a workflow thing; it’s a process thing.

Lewis Francis: Right.

Emily Lewis: Well, speaking of process, do you have any general ideas about how people could improve their quality assurance processes?

Lewis Francis: Well, I think one of the major things that people can do is better track their issues.

Emily Lewis: [Agrees]

Lea Alcantara: [Agrees]

Lewis Francis: Use a real issue-tracking database or system. Don’t send Slack messages.

Emily Lewis: [Agrees]

Lewis Francis: Don’t send any email that has 200 issues in it. Don’t set up a Google spreadsheet that has 200 issues because those things will just…

Emily Lewis: [Agrees]

Lewis Francis: Things will slip between the cracks.

Lea Alcantara: Right.

Lewis Francis: It’s just not an efficient way to work. I’ve done this so many times. Sometimes a client wants to work a particular way and we’ll grimace and do it the way they want, but almost inevitably there will be a problem.

Emily Lewis: Yeah.

Lewis Francis: So find an issue-tracking system, whether it’s GitHub Issues or an IT-level service desk kind of thing, just anything where you can have a single ticket address a single issue. That’s just so, so important.

Emily Lewis: Have you ever used Asana? I think that’s how you pronounce it: A-S-A-N-A.

Lewis Francis: I’ve used so many from Bugzilla to Mantis.

Lea Alcantara: [Agrees]

Lewis Francis: And I can’t even remember the names of the others. Trac is one that we used for a long time, until we moved our code repositories over to GitHub. Yeah, I can’t remember, but Asana might actually have used Trac. Was it Python-based?

Emily Lewis: Oh gosh, I don’t know what’s underlying it, but I’ve had to use it a couple of times when I’ve been a subcontractor with other organizations, and it’s not bad. I feel like it does more than tracking. Sometimes you need something very focused and sometimes you need something that does a lot of stuff.

Lea Alcantara: [Agrees]

Lewis Francis: Right.

Emily Lewis: And I always felt, in my experience, using it for tracking issues, it was too much. So you mentioned you guys moved over to GitHub Issues. Is that your primary tool these days?

Timestamp: 00:40:04

Lewis Francis: It turns out to be our primary tool. Yeah, I have some issues with Issues.

Emily Lewis: [Laughs]

Lea Alcantara: [Laughs]

Lewis Francis: Namely…

Lea Alcantara: Like what?

Lewis Francis: Yeah, and namely, the inability to step through issues that are assigned to you.

Emily Lewis: [Agrees]

Lewis Francis: So yes, I can view a list of all the issues that are assigned to me. But once I drill down into one of those issues and I want to paginate to the next one, I can’t do that. Why can’t I do that? I just don’t get it.

Emily Lewis: [Laughs]

Lewis Francis: But it’s clean, it’s a simple system. There’s a little chrome to get through for clients. But if we ever have clients that want to be involved in the issue-tracking system, we’ll spin up a Trac instance instead because that’s a little more client-friendly.

Emily Lewis: [Agrees]

Lewis Francis: Trac allows us to do things like… well, it allows a client to enter a ticket and to track that particular ticket’s progress, but not see other tickets.

Lea Alcantara: So I’m curious, in regards to actually posting an issue, do you have advice for people who are doing QA? How do you actually write up issues so that they make sense to the QA person?

Lewis Francis: Yes, absolutely. We have placeholder text in our entry fields with little helpful reminders, like the URL of the page in question.

Emily Lewis: Right. [Laughs]

Lea Alcantara: Right.

Lewis Francis: You’d be amazed how many people just assume that you telepathically know which page they’re talking about. So the URL, the browser make and version, the OS platform, the issue presented, and some facility for adding a screen grab, which is tremendously helpful when the client finds it difficult to explain an issue properly.
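[Editor’s note: as a rough illustration of the fields Lewis lists, a minimal bug report might look like the sketch below. The shape and field names are hypothetical, not Threespot’s form or any particular tracker’s API.]

```typescript
// Hypothetical ticket shape covering the fields mentioned above; names are
// illustrative only and don't correspond to any specific tracker.
interface QaTicket {
  url: string;             // the page where the issue occurs
  browser: string;         // browser make and version
  platform: string;        // OS / device platform
  summary: string;         // what happened, and what was expected instead
  screenshotPath?: string; // optional screen grab or animated GIF
}

const ticket: QaTicket = {
  url: "https://example.com/donate",
  browser: "Firefox 44",
  platform: "Windows 10",
  summary: "Donate button overlaps the footer at narrow widths; expected it to stack below.",
  screenshotPath: "grabs/donate-overlap.gif",
};

// A one-line rendering a tester might scan in a ticket queue.
console.log(`[${ticket.browser} / ${ticket.platform}] ${ticket.url}: ${ticket.summary}`);
```

The point is less the exact schema than making each field required at entry time, so a reporter can’t submit “it’s broken” without saying where and on what.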

Lea Alcantara: [Agrees]

Lewis Francis: In my notes that I sent you on the tools that I use, did I mention a screen grab tool unfortunately called LICEcap?

Emily Lewis: No. [Laughs]

Lea Alcantara: No. [Laughs]

Emily Lewis: How do you pronounce it again?

Lewis Francis: Well, I don’t know how you pronounce it. It’s L-I-C-E-cap.

Emily Lewis: That’s weird. [Laughs]

Lea Alcantara: [Laughs] Wow!

Lewis Francis: And I’m sure it’s some acronym for something. Yeah, it’s horribly, unfortunately named, but it’s a really handy tool. Basically, it allows you to create an animated screen movie and save it as an animated GIF, which makes interaction issues easier to capture.

Lea Alcantara: Right, right.

Lewis Francis: Sometimes those can be hard to describe, and if a picture is worth a thousand words, what’s a movie worth?

Emily Lewis: Oh, that’s excellent. Oh, I love that.

Lewis Francis: And it’s a free tool, and I believe it’s cross-platform as well.

Lea Alcantara: Very cool. I use Snapz Pro X. It doesn’t…

Lewis Francis: Yes.

Lea Alcantara: Yeah, so it doesn’t do an animated GIF, but it does do a little .mov, and I remember it was so useful when this particular client noticed something weird happening with the way things were loading, and he was trying to say, “This is an issue. This is an issue.” And so I took a video of me interacting with a similar site and I was like, “It is not an issue. See, this is what’s happening right here.”

Lewis Francis: [Agrees]

Lea Alcantara: And simply by showing the video, he was like, “Oh, okay, I get it.”

Emily Lewis: That’s expected behavior, “Oh.”

Lea Alcantara: Exactly, that is exactly what has always been there. You just noticed it now.

Lewis Francis: Right, right.

Lea Alcantara: Yeah.

Lewis Francis: Yeah, I use Snapz Pro X as well, and I discovered LICEcap when we started using GitHub Issues because you can’t actually attach a movie, a .mov, to an issue ticket.

Lea Alcantara: A MOV, right.

Emily Lewis: [Agrees]

Lewis Francis: Which is annoying because I used to capture screen movies and it would also capture the audio of whatever song was playing in iTunes at the time, which I would have fun with.

Emily Lewis: [Laughs]

Lea Alcantara: Right.

Lewis Francis: So yeah, excellent tools.

Emily Lewis: Awesome. So before we wrap up, you’ve shared a ton of great tools. Are there any other resources like blogs or books or sites that you frequent regularly that help you stay up to date on different tools and technologies with QA?

Lewis Francis: I really don’t have anything to offer there. I’ve checked different QA forums. There’s a lot of stuff out there, but there’s not a lot of stuff out there for web agency-type work.

Emily Lewis: [Agrees]

Lea Alcantara: [Agrees]

Lewis Francis: It’s geared toward deeper, enterprise-level application development and that sort of thing. So unfortunately, I can’t help you out much there.

Emily Lewis: That seems to be like some content someone needs to create. [Laughs]

Lea Alcantara: Or listen to the show. [Laughs]

Lewis Francis: [Laughs]

Emily Lewis: [Laughs] Technically, we’re filling a hole. [Laughs]

Lea Alcantara: [Laughs]

Lewis Francis: Right.

Lea Alcantara: Okay, well, thank you, Lewis, but before we finish up, we’ve got our Rapid Fire Ten Questions, so our listeners can get to know you a bit better.

Lewis Francis: Oh boy, yeah, all right.

Lea Alcantara: Are you ready?

Lewis Francis: I am.

Lea Alcantara: Okay, first question, morning person or night owl?

Lewis Francis: Night owl.

Emily Lewis: What’s one of your guilty pleasures?

Lewis Francis: Oh, my gosh, I’m not sure how guilty this is, but I’ve discovered French pop, and it’s the kind of thing that’s easy for me to listen to while I’m working because I don’t speak the language.

Emily Lewis: [Laughs]

Lea Alcantara: Hmm, right.

Lewis Francis: So it doesn’t engage that part of my cognitive load. It’s just easy to get to, and I’m sure that if someone were to translate it for me, it would probably be really disappointing.

Emily Lewis: [Laughs]

Lea Alcantara: [Laughs] What software could you not live without?

Lewis Francis: Oh, my gosh, just in general or in QA?

Lea Alcantara: Sure.

Lewis Francis: I think, as I said at the beginning of this podcast, I’m a photography enthusiast.

Emily Lewis: [Agrees]

Lewis Francis: So Adobe Lightroom is just an amazing tool for managing and editing your photo libraries.

Lea Alcantara: Right.

Emily Lewis: All right.

Lewis Francis: I’d be very sad if that went away.

Emily Lewis: [Laughs] What profession other than your own would you like to try?

Lewis Francis: Well, let’s see, I’ve been a musician. That didn’t work out. The whole rock and roll thing just didn’t quite happen.

Emily Lewis: [Laughs]

Lea Alcantara: [Laughs]

Lewis Francis: If I could get paid for — do you know what geocaching is?

Lea Alcantara: Yes.

Lewis Francis: Okay. If I could find a way to get paid for geocaching, that would be fun.

Emily Lewis: [Laughs]

Lea Alcantara: So what profession would you not like to try?

Lewis Francis: Project management.

Emily Lewis: [Laughs]

Lea Alcantara: [Laughs]

Lewis Francis: It’s no fun at all.

Emily Lewis: [Laughs] I actually love project management.

Lewis Francis: I love people who love project management and do a good job of it. It’s just impressive to me. I think you guys are awesome.

Emily Lewis: [Laughs]

Lewis Francis: I just don’t think I have the aptitude for it.

Emily Lewis: [Laughs]

Lea Alcantara: [Laughs]

Emily Lewis: If you could take us to one restaurant in your town, where would we go?

Lewis Francis: Oh, well, there’s a Korean BBQ place in Annandale called The Honey Pig, that’s a lot of fun and super, super tasty, and also in DC there are a bunch of Ethiopian places. One is called Dukem on U Street. That’s just amazing.

Lea Alcantara: If you could meet someone famous, living or dead, who would it be?

Lewis Francis: Well, I don’t know how famous she is, but Françoise Hardy. She was, or is, I should say, a French chanteuse from the 60s onwards. She’s just amazing. I love her music to death.

Emily Lewis: If you could have a superpower, what would it be?

Lewis Francis: Invisibility.

Emily Lewis: [Laughs]

Lea Alcantara: So what is your favorite band or musician?

Lewis Francis: My favorite band or musician? Whatever I’ve been listening to lately. I don’t think I can have a favorite, but Trixie Whitley has a new album out and I saw her a couple of weeks ago. She’s a Belgian-American singer/songwriter with a kind of bluesy Americana thing. It’s an interesting mix of styles.

Emily Lewis: [Agrees]

Lewis Francis: She’s wonderful. Check her out.

Emily Lewis: All right, last question, pancakes or waffles?

Lewis Francis: Waffles.

Emily Lewis: [Laughs]

Lea Alcantara: [Laughs] You just like it.

Lewis Francis: That’s an easy one, waffles with pecans.

Lea Alcantara: Oh, nice.

Emily Lewis: [Laughs]

Lea Alcantara: So that’s all the time we have for today. Thanks for joining the show, Lewis.

Lewis Francis: Thank you, folks. It was fun.

Emily Lewis: Good, and in case our listeners want to follow up with you, where can they find you online?

Lewis Francis: You can catch me on Twitter @lewisfrancis, and again, if you’re interested in my photography, there’s lewisfrancis.com; it’s a one-pager that just points you to my portfolio elsewhere.

Emily Lewis: Excellent. Thanks again for joining us. It was great having you on.

[Music starts]

Lewis Francis: Thank you.

Lea Alcantara: CTRL+CLICK is produced by Bright Umbrella, a web services agency obsessed with happy clients. Today’s podcast would not be possible without the support of this episode’s sponsor! Thank you, Craft Commerce!

Emily Lewis: We’d also like to thank our partners: Arcustech and Devot:ee.

Lea Alcantara: And thanks to our listeners for tuning in! If you want to know more about CTRL+CLICK, make sure you follow us on Twitter @ctrlclickcast or visit our website, ctrlclickcast.com. And if you liked this episode, please give us a review on iTunes, Stitcher or both! And if you really liked this episode, consider donating to the show. Links are in our show notes and on our site.

Emily Lewis: Don’t forget to tune in to our next episode when we talk to Stephanie Morillo about copywriting and user experience. Be sure to check out our schedule on our site, ctrlclickcast.com/schedule for more upcoming topics.

Lea Alcantara: This is Lea Alcantara …

Emily Lewis: And Emily Lewis …

Lea Alcantara: Signing off for CTRL+CLICK CAST. See you next time!

Emily Lewis: Cheers!

[Music stops]

Timestamp: 00:50:35


Emily Lewis and Lea Alcantara

CTRL+CLICK CAST inspects the web for you!

Your hosts Emily Lewis and Lea Alcantara proudly feature diverse voices from the industry’s leaders and innovators. Our focused, topical discussions teach, inspire and waste no time getting to the heart of the matter.