In conversation with
CarGurus’ VP of Information Security and Technology on the psychology that underpins our decisions about data
The path towards tech
Hi, I’m Kelly Haydu, the Vice President of Information Security and Technology at CarGurus. I’m excited to talk about my journey in the tech field—my roots, and how I got to where I am today.
I’ve always been interested in how people’s minds work and the different dynamics in relationships. My undergraduate studies were in Human Science and Services: my concentration was in marriage and family therapy, with a minor in psychology. While I had this interest, I was also born into a family of technologists. My dad was a LAN administrator at Perkin Elmer—he worked at the facility where they developed the Hubble Space Telescope.
As part of my undergraduate program, I participated in a full-semester internship at the South County Community Action Center in Rhode Island, where I helped develop a program to teach computer skills to single teenage moms. This enabled them to gain the skills they needed to support themselves and the families they were raising on their own. Coming out of the internship, I learned I could apply my computer and humanitarian skills to educate people about technology.
Prior to graduation, I investigated technology roles that had an impact on medical patient support. I landed a role at Meditech, in the Boston area, where I was brought on as an implementation consultant to help deploy their software in hospital pharmacies. And I thought, “Good, I'm feeling like I'm going to be making a difference in some way, with patient care and staff, to be able to really increase visibility into technology but also help in some indirect way.”
From catching bugs to building QA teams
During my implementation and training cycles onsite at hospitals, I would find bugs in our product. It would drive me crazy to be in the middle of a software demo and be blocked due to a bug. So I started testing things before I went out onsite. I would bring the feedback to the quality assurance (QA) team, and they’d say, “Great, thanks!” I also started documenting my use cases and test cases.
I came to a point in my path where I evaluated what I really liked about my job. I had a keen attention to detail; I loved the writing component of it and now had some hands-on experience in the technology field. I said to myself, “How do I take this to the next level?” I ended up going down the QA path and found that my itch for writing documentation was satisfied by formulating test plans and test cases.
But I quickly grew frustrated with the tedious, repetitive nature of manually executing test steps, so I taught myself how to automate. I would stay after work each night, working with the company’s senior automation engineer, and taught myself to code for efficiency. I went on to manage offshore teams in India to help with quality assurance and performance testing at U.S.-based organizations.
Automated testing and saving millions
There was a company I worked for called Axeda, which, at the time, focused on machine-to-machine connectivity—they were essentially the pioneers of IoT (Internet of Things) and well before their time in the space. During my seven-year tenure, I worked in both the R&D and Professional Services divisions; I oversaw the quality and training departments.
While at Axeda, I was recruited to join Hasbro to start their Testing Center of Excellence. As part of that journey, I spoke on a panel at the HP Discover conference. Six months into my tenure, the organization asked that I oversee the Digital Technology Quality Assurance team. We saved the company millions of dollars by automating their processes. Business analysts no longer had to carry the burden of testing solutions prior to bringing them “live” into production.
Until automation was put in place, the business owners had to manually test every single process prior to an SAP rollout (we were launching SAP globally). They were working crazy hours, sometimes double their normal job hours. Being able to take that function and accelerate it without sacrificing quality alleviated so much extra work for the business.
The security component of QA
I started to see a couple of shifts happening in the technology space. One was that many organizations were folding quality assurance into the engineering function. The other was an uptick in the need for additional security—I first observed this when I was at Axeda, in their support of IoT.
I had owned and managed a lot of the Quality Assurance labs, so I had been responsible for getting the SSL certs installed, making sure that the environments were protected, shipping out particular pieces of hardware and disks, and making sure that there was encryption in place. There was a whole component to testing that included security.
As the need for security in technology rose, I jumped into the security space. It was a natural transition, since I had the technical background, had managed infrastructure, and had assessed risk and regulatory compliance through QA. I really liked the idea of promoting security and ensuring it was part of the Software Development Life Cycle.
I made the formal transition to security when I went to Brightcove. I was the first official hire for security there, and I started their program. I believe you can build a program from scratch as long as you understand all the building blocks to create that program. My background in risk and compliance, as well as quality and release engineering, helped me to start to understand and ask the question: What are the technical components that we need for our infrastructure to be secure?
Heading back to school
When I was at Hasbro, I explored pursuing a Master’s degree but wasn’t sure in what area. Do I go for a technical degree? Do I go for an MBA? This was a long journey of exploration that I struggled with for several years.
At Brightcove, I had a great mentor who was the Chief Legal Officer. He opened my mind to the legal components of business, which were fascinating to me. I explored law schools. In my discovery, I found that it wouldn’t be feasible to work full-time, as a single mom, commuting into Boston five days a week. I just didn't have enough hours in the day.
I was still interested in it, though, since I was doing a lot of legal contract review as it pertained to security. I reported to the General Counsel, another fantastic mentor, at Salsify as well. I deeply embedded myself into the legal privacy aspect of the business. We needed to be GDPR and CCPA compliant, so I learned the ins and outs of the regulations. I went through data protection officer training.
Regardless of my continued independent learning, I still had that itch for formal education. So I went back to look at Master’s degree programs again. I'll be very candid with you: regardless of my skill set and technical expertise, my lack of a technical undergraduate degree was always questioned. Always questioned.
Some people overlook the necessity of understanding the human mind while working in the security field. You need to think about it from a psychological standpoint: How do hackers think? What are their attack vectors? You must really get inside their brains to understand where they may seek weakness in your organization.
There’s also an element of psychological safety—not only outside your organization, but also inside it. Do employees feel psychologically safe to share and be vulnerable? There’s a whole component of that as well that ties back to my undergraduate degree—but on paper, that is not always recognized in our field.
So I decided to squash the question that would come up in interviews or resume reviews. It was frustrating and exhausting—so I used it as a personal driver to pursue a technical degree. I found a program at Norwich University, a military college, which satisfied my technical and legal pursuits. I was awarded my Master’s in Information Security and Assurance, with a concentration in International Privacy Law. I was recognized as the “Outstanding Student” of my graduating class with a 4.0 GPA. I felt like the degree really hit the sweet spot of both privacy and security.
Perspectives outside the private sector
From an education standpoint, I knew a lot of the technical textbook material. My entire career was in the private sector. I wanted to understand—talk about psychology—the way that people think in the public sector and military, because it’s completely different, and I hadn’t had exposure to that.
What I got out of that program was not only the textbook material, but in-depth discussions and relationships with people outside of the private sector. It was fascinating to learn about techniques and advanced technologies that are used in the military and different governments (we had students globally). A lot of people in the private sector aren't even aware of some of the advancements being made. What I learned and the relationships that I made at Norwich will last a lifetime. As their motto states, Norwich Together. Norwich Forever.
A new role with established foundations
I really had no interest in leaving Salsify. I was just wrapping up my Master’s degree and didn’t feel the need for another change. My boss at the time, the General Counsel, was the best boss, hands down, that I’ve ever worked for in my career. I was happy—I believed they were a rocket ship, they were going somewhere. I was and continue to be invested in their development and success.
When I got a call from the Recruiting Manager at CarGurus, I basically said, thanks but no thanks. They were persistent: “Will you just take a call with the CTO?” And I said, sure, it’s good to network, why not? I put them through the wringer. I asked them so many questions. I met with so many people. I wanted to understand what I was walking into and what the culture was like there; a positive culture is super important to me. If I was going to make this move, it had to be the right move.
I think at the end of the day, what attracted me to the role is that I wasn’t building the program from scratch, like I had done at the prior two companies. There was some foundation there, good building blocks, and executive buy-in to do the right thing. That was huge, because when you’re building a new department or division, you need support from the top.
Having that foundation makes my job easier, because I can just say, “OK, here’s the long-term strategic roadmap; team, go find solutions and execute on it!” I did have to increase staff to really move us to the next level on the maturity curve, but I haven’t had a lot of pushback on the people I’ve needed to hire to accomplish some of the big goals that we have for the organization.
Shifting attitudes about privacy and security
I’d say that the space in and of itself has had more publicity and attention than it had six or seven years ago—and some of that is a result of major data breaches. I think the other thing that’s highlighted security has been the privacy aspect of it. When the European Union introduced GDPR, they started instituting major fines against Facebook and other large companies. It shifted executives’ attention to the idea that, “Hey, we don't want to be the one that's caught and issued a fine. We better pay some attention to this.”
I think it's helped to gain that support from an executive standpoint, but it also fosters tighter relationships between attorneys, general counsels, and technologists. Privacy professionals can say, “This law applies to these parts of our business case here,” and then the security technologists can say, “OK, if that's the case, we have to implement these XYZ solutions.”
I have found it’s starting to become a little more prescriptive now that the GDPR has been in effect since 2018. As such, a lot of companies have put these checklists and things in place, and it’s really helped from a technology standpoint. I’m also seeing a shift with security and privacy technologists aligning more with the business side of things: they’re working with multiple business units to accommodate new laws and regulations, and they’re no longer siloed in their own little corners.
Do you need that data, or do you want it?
I think a lot of organizations are still playing catch-up. The laws do not always provide black and white answers for how to solve for data problems. There is room for interpretation. A lot of companies are capitalizing on that.
Data analytics is huge, and a perfect example. Identifying PII in your environments, understanding what PII lives in every single asset, how long it can live there—not only according to federal law, but also by state law—and how long you need it as an organization: these are all questions that need to be answered. Trimming that data is super important. Some people say, “OK, you're only talking about legal and privacy; how does that affect security?” It affects security because if you have a data breach and you’re holding that data, that’s more data that a hacker or an adversary can manipulate and/or steal.
A lot of organizations have looked at data retention and conducted a full inventory of their data to understand what actually is in their environment. That can be a very slow process if you don’t have the tools and technologies to do it, and if your systems are spread across multiple domains or multiple divisions. Coordination can be tricky if you don’t have the correct people included to say, “It is OK to store data in this location, but we need to anonymize it and ensure aggregation does not equate to personal data discovery.”
The big challenge is: How long are you holding the data, are you holding it unnecessarily, how do you work to get rid of it, and if you do, will you be able to continue to operate your business without it? From a data analytics standpoint, you may want three years’ worth of data because you need to see trends. It’s therefore necessary to fully understand the business requirements and coordinate with multiple business units to get to the right spot for your business while still remaining compliant with law—and that doesn't happen overnight.
Sorting through those business goals and getting people to a comfort point of letting go of data can be difficult. I use my kid as an example: everything that he has is a treasure. Every art project that he brings home, he says, “Mom, you're not going to throw that in the trash, are you?” And I say, “Maybe when you're in bed at night, I’m going to throw it in the trash, but for the very special ones, I will keep those. Tell me which ones are the special ones.”
I actually started sending his artwork out to a company that does 3D image scanning and puts it in a book, and they keep the originals. These little clay pots that he’s made—you send them out, they do the 3D scan, and they're in the book. He feels like he has kept them, but I can put the book on his bookshelf and it's away, and I don’t have mounds of paper everywhere. The treasure is still there, but organized.
I direct our line of thinking back to psychology. Do you need it, or do you want it? Need versus want are two entirely different things. I tie that into the risk and compliance piece of security. Tell me about that data. Tell me about your concerns. What is the actual cost and risk to the business if we lose this? Is this an edge case, or is it standard? And that’s how you start to sift through and get to that final data retention date.
Hey, look, edge cases happen, but you have to manage toward the risk, the cost of loss, and reputation damage. You can’t solve for everything. There may be that one time where someone thinks, “Oh, we shouldn’t have thrown that out three years ago; we need it now.” But you must weigh those risks—otherwise you’re keeping everything forever!
Kelly Haydu is the Vice President of Information Security and Technology at CarGurus. She has over 20 years of technology leadership and SaaS, quality, and security experience. She’s responsible for CarGurus' Information Security and Technology divisions, securing and protecting both internal and external customer data.