Hire Breakthrough Podcast with Kimone Napier

How AI Deepfakes Are Enabling Fake Remote Workers - And What HR Needs to Do with Aarti Samani

Kimone Napier Season 3 Episode 20


What if the remote worker you just hired… wasn’t real?

AI deepfake technology is making it easier than ever for fraudsters to fake their way into jobs—and companies are falling for it. From stolen identities to AI-generated interviews, businesses are unknowingly onboarding fake remote workers, putting their security, finances, and reputation at risk.

In this episode of The Hire Breakthrough Podcast, Kimone Napier sits down with Aarti Samani, an expert in AI-driven fraud prevention, to uncover:

How AI deepfakes are being used to scam hiring processes
The shocking true story of a security company that unknowingly hired a fake employee from a sanctioned nation
Why hiring fraud is a growing risk—and how business culture plays a role
The steps HR and leadership teams must take to protect their organizations
How fraud awareness training can help prevent hiring scams

HR leaders, recruiters, and hiring managers—this is your wake-up call. If you’re hiring remotely, you need to be prepared. Listen now to learn what you can do to protect your team.

🔗 Resources & Links:

Connect with Aarti Samani: https://www.linkedin.com/in/aartisamani/

Grab the Hire Breakthrough Calculator: https://hirebreakthrough.com/calculator

Ready to take the next step?
Scale Hire is our done-for-you hiring solution, handling the entire process from start to finish. From sourcing and screening to delivering a curated list of top candidates, we take the stress out of hiring so you can focus on scaling your business. 👉 Learn More Here


Looking for Talent & Workforce Development Solutions?
At Hire Breakthrough, we design and deliver workforce development programs that equip professionals and organizations for the future of work. Whether you need customized training, leadership development, or career readiness programs, we create solutions that drive real impact. 👉 Learn More Here

🎤 Need a speaker or trainer? Kimone Napier brings expert insights on hiring, DEI, leadership, talent and workforce trends to organizations looking to elevate their teams. 👉 Learn More Here 


Support the show

Thanks for tuning in to The Hire Breakthrough Podcast! If you found this episode helpful, be sure to:

Subscribe so you never miss an episode.

Leave a review to help more leaders discover the show.

Share this episode with someone who’s ready to build their dream team.

Connect with me:

Website: Hire Breakthrough

LinkedIn

Instagram

YouTube

Let's build the future of work - one breakthrough at a time.

Kimone Napier (00:02.26)
Welcome back to the Hire Breakthrough Podcast. I'm your host, Kimone Napier, and today we're tackling something that sounds straight out of a sci-fi thriller, but it's happening right now. Fake remote workers enabled by AI deepfake technology. So imagine hiring someone, onboarding them, and later realizing that they never actually existed. This isn't just a theoretical risk, it's already happening. So my guest today is

Aarti Samani, who is an expert in deepfake fraud prevention, and she's here to expose one of the most shocking hiring fraud cases. I mean, Aarti, when you told me about this, I was just kind of mind blown. So if you're in HR, leadership, or hiring, this is a conversation you cannot afford to miss. Aarti, welcome to the show.

Aarti Samani (00:56.738)
Thank you, Kimone. Thank you for inviting me. And I'm really pleased that people like yourself are exposing this topic, which not many HR leaders and teams know about. So I'm really delighted that you've invited me today, and I hope I will be able to add tremendous value to our listeners.

Kimone Napier (01:16.598)
Yes, wonderful. And thank you so much for being here because I'm so excited for this conversation because when I found out about this, I was like, people really need to know what's going on. So let's start with the big picture, fake remote workers. AI deepfake tech is evolving rapidly, very fast. But how did we get here? And what is really driving this rise in, I don't know, like AI fraud is what I would call it.

Aarti Samani (01:28.27)
Thanks.

Aarti Samani (01:48.014)
Yeah, great question. And simple answer, AI is driving AI fraud, right? So over the past two years, we have seen an exponential rise in knowledge about AI, in the number of tools that have penetrated the market, and in the low cost of these tools as well. For $20 a month or $15 a month, you get very high quality tools that allow you to do all kinds of

exciting, interesting things. So how did we get here? We got here because AI has developed so quickly that it's given us amazing power right in our pockets, in our hands, that we can do things with very quickly, very inexpensively and at scale. Now,

usually people like us, we think it's used for good stuff, right? It's used as a thought partner. It's used to generate great learning videos or great marketing collateral, etc. But there is a whole other side. There is almost a shadow world that exists out there which uses the same tools that we use to be great leaders and great at the work we do for entirely different purposes.

Aarti Samani (03:08.322)
The same tools are being utilized for negative purposes by a very large community out there. And that negative purpose, unfortunately, is fraud. It is sort of scamming businesses, organizations, as well as individuals out of their hard-earned money, their revenue, but also their information, their data, and causing a lot of distress along the way.

Kimone Napier (03:34.282)
My goodness, I think a lot more is happening than what people think is happening, because I know if we're thinking about it from an HR leadership perspective, almost the thought of using AI in a fraudulent way, you know, everyone automatically thinks fake resume or something like that. And it's not just fake resumes anymore. I mean, people are using AI to literally fake identities and, you know, to the story that you're going to share,

you know, even doing it during an interview, which is just mind blowing to me.

Aarti Samani (04:08.97)
Exactly. There are two parts to this. One is creating an entirely AI-generated individual accompanied by all of their identity documents. So creating, yeah, this individual may not exist in the real world at all. They have been given a personality, they've been given a name, and they've been given

fraudulently created identity documents that include passports, social security numbers, driving licenses, etc. And then the other part is taking an individual who does exist and utilizing their identity for fraudulent purposes. So utilizing an individual who exists to apply for jobs in their name, to use their identity in fraudulent documents for verification when

a job offer has been made to them, et cetera. So we may not even know about it, but our identity could very well be utilized. Currently, as we speak, our identity might actually be being utilized by someone for negative purposes. And typically, especially since COVID, we are doing a lot of remote interviewing, right? We're hiring a lot of staff remotely.

We operate in a global environment, so we don't even think about our technology team being based in a different part of the country. Our marketing team might be based in another part of the world, et cetera. So we don't even think about it. And all of that recruitment is happening through video calls, through voice calls, completely remotely. You might never meet your colleague in real life. And the technology as it stands allows

us, allows people with negative intentions, to use it for fraudulent purposes, so they can be someone entirely different on a video call and we would never know that they were a different person.

Kimone Napier (06:13.152)
I can almost guarantee probably half of our listeners have never even thought of some of the things that you're talking about. So, you know, before we move forward, you came across a, I call it a wild case. Like this is something that sounds like it would be in a movie or something. And so, you know, from the premise of what happened, a security company, you know, unknowingly hired basically a fraudulent worker and it just turned into like a horror story.

Could you walk us through what happened?

Aarti Samani (06:45.782)
Yes, absolutely. So this happened in the United States, a security training provider, a known company, and they came forward with this. And I really respect the CEO of the organization, that he came forward and talked about it in the public domain: this has happened to us, so be cautious and learn from our experience. So the organization put a job role out there for a remote IT worker.

Several individuals applied. One of them said they were based in Atlanta, and they applied for the job, very well qualified. Their resume was fantastic. A series of interviews were conducted and they passed every interview. Then there were identity checks, et cetera, done before they were fully employed within the business. All the checks went through. They onboarded this individual.

The organization sent them a company laptop, which was sent to an address in Washington, even though the candidate claimed to be based in Atlanta. But they said, hey, look, I'm going to be in Washington at this particular time. So just send it to this address, and I will pick up from there. Within a day of this person joining the organization,

they started to see that some malware was being installed on this company laptop which had been delivered to them. So obviously the security team rightly raised some alarms. They looked at what was going on. They investigated, stopped access to confidential areas for this individual. And within a day, all their access was blocked. And then further investigation revealed that this person

was not an American citizen; they were based in a sanctioned nation. But throughout the interview process, the video calls were using what we call deepfake technology, which means that, visually, they seemed like an American person, their accent was very American, all their identity documents claimed that they had an address in Atlanta, and all the other

Aarti Samani (09:05.71)
credentials that get checked when you verify the identity of an employee before bringing them into the business. Everything was fraudulent. What seemed like live video interviews were all fraud; they were conducted with deepfakes, etc. So technology was utilized for an individual from a sanctioned nation to join a security business and be on their roster

in the IT department where they would have access to a whole lot of information, data, credentials, etc.

Kimone Napier (09:45.544)
My goodness. And for those of you listening, I have to tell you, when Aarti and I actually talked about this and she had shared the story with me initially, because we were on video, she actually visibly showed me what it could look like. And literally, someone could just change your face, almost. The technology is that advanced. I'll also say it's that terrifying. So Aarti, what are some of the red flags,

you know, 'cause I'm just trying to think about it in the sense of what this company potentially missed and how, you know, we can protect ourselves moving forward and learn from this experience.

Aarti Samani (10:26.784)
Yeah, that's a great question. And before I answer that, I think it would be helpful for us to give the context to our audience, the motivation and the drivers. Why would somebody want to do this? Right? Why would you want to pretend to be a citizen of a different country entirely when you're based somewhere else? What are you getting? Because they're not getting

Kimone Napier (10:38.27)
And yeah.

Aarti Samani (10:50.4)
a lot of money. They're a salaried employee when they're onboarded into this company, right? So they're getting their regular salary, which may not be life changing in any way, right? So what are the drivers? What are some of the motivations that inspire people to commit this kind of fraud? And it's a few different things. So money is obviously one driver. The salary that they receive is funneled back

into the sanctioned nation to fund some of the activities of that regime, of that government. The second motivator, or driver, is the information and the data that they have access to. That is currency itself, right? Information is currency in a digital environment. So once they have access to that, that can then be utilized for further crime; that information can be sold on the dark web

to facilitate other criminals to do further crime. So money is one aspect of it, but it is the information and the data that they are able to get access to, which then has an amplification effect. It kind of goes exponentially. Once you have that, you then pass it on to 10 other people who pass it on to 10 other people, and it kind of grows and amplifies.

from there. So the drivers, it's really important to understand the drivers and motivation. And I also emphasize this because a lot of companies tell me, we are too small to be targeted by this. And my pushback is: do you have information and data which is valuable? And pretty much every organization, no matter their size or the sector they are in, always has information

which is of use to bad actors for malicious purposes. So yeah, the whole thing is very well thought out, very well designed, and very well executed. Now, some of the red flags, then: they are very, very difficult to tell with the naked eye. So, like you and I spoke on our prep call, and I was able to show you

Kimone Napier (12:45.9)
Thank you.

Aarti Samani (13:07.254)
a face swap very easily. I just took on a different face and you couldn't tell the difference. So by visually examining a face that is on the other side of the screen, it's impossible to tell the difference. But there are certain things that we can look out for. So one thing is, visually, you can tell when you ask a person to put their hand across their face.

Because when there is an obstruction in the camera, which is being manipulated, that obstruction typically shows just a flicker of the original face. So when I put my hand across the camera like this, then there's only a very small flicker which will go back to the original face that is on the camera versus the one which is being manipulated. So that's one thing you can do. The other thing is, you know, when you move...

towards the edges of the screen, then the technology kind of messes up a little bit. So if I move too much to the edge of the screen, then the face deforms, the manipulated face deforms, and again, you see a flicker of the original. But these are sort of crude tests, right? Like when you're interviewing a candidate, what are you going to tell them? Hey, stand up and move your face so that it goes across the screen to the top,

Kimone Napier (14:08.414)
moves around. Okay.

Kimone Napier (14:22.764)
Yes.

Aarti Samani (14:31.852)
then to the left, then to the right. Like these things are not always practical. I mean, we have to resort to that from time to time, but they are not always practical. So I always emphasize, obviously, get the best technology that you can in terms of detection and security, whatever your budget allows for, whatever your security team is already putting in place. And there are deepfake detection technologies which can be utilized on live calls like this that will

at least alert the individual to say, look, we think there is something out of shape here, please go and investigate. But aside from that, we have to fundamentally rethink how we conduct interviews, right? So if you think about how we interview our potential candidates, we always, we test them for the culture.

Like that's a big thing. We test them for culture fit. We test them for compatibility. We test them on the technical aspect, on the specifications of the role, etc. Right? Like there are a whole bunch of things we test them for. Nowhere do we test them for authenticity in the context of them as individuals, right? We test them for authenticity of their knowledge and their skills and their past work experience, etc.

Kimone Napier (15:46.742)
Very true.

Aarti Samani (15:55.186)
but we don't test them for the authenticity of them as a person. So if I'm telling you, Kimone, I'm a candidate interviewing with you, I'm based in London, then you will have to very cleverly extract out of me if I'm really based in London. Now, how do you do that? This is where we collectively as a community of HR leaders, hiring managers,

fraud experts, and security experts have to come up with techniques, through conversation, through sort of forcing certain types of responses, to validate whether they really belong to the region or the country or the area where they claim to be, right? What is their motivation for joining this particular company? What is their motivation for joining this particular role

in this particular company. I think we have to, there is no playbook for this yet because this is so new, right? There is no playbook for it. So we have to come up with a playbook. And this playbook that we come up with is not going to be valid for very long, because tech will again get better. The individuals will get better at responding to this. It is a continually evolving process. But what I'm saying in a nutshell is that we have to

Kimone Napier (17:04.364)
Yes.

Aarti Samani (17:15.488)
apply critical thinking, we have to trust our instincts and we have to change our line of questioning when we are interviewing remote candidates.

Kimone Napier (17:25.95)
Yes. What I'm gathering from what you just shared, which, thank you, by the way, because this is so helpful: this is not just an HR issue. It's a company-wide risk, is what I think all of you should be taking away from this. And it's a risk that leadership really needs to take seriously. And I think, you know, what you mentioned about a company that identifies as a small business, the thought there is that, because I'm a small business, you know, I won't be targeted.

While you were talking, I thought the complete opposite. I think it's actually probably easier to target a small business, because they might not have the budget to have all the technology in place. And so the opportunity for someone who has ill intentions to really capitalize on them, basically because they are small, to try to infiltrate and get information, is

really scary, but I think it's very doable if the person is really, I don't know the word that I'm looking for, really driven to scam someone. And so I think if you are a small business, there are gonna have to be parameters that you put in place. And particularly, you know, with some of the smaller clients that we work with here at Hire Breakthrough, I know in terms of their hiring process before we're working with them,

sometimes it could be very lax. Like, we'll just have a phone conversation with this person, and then they'll hire the person. It's not even really a process. And I'm just thinking about the amount of risk, you're risking your company by doing that. Because a lot of smaller businesses, they're constantly on the move, you know? And so they might not have the most elaborate hiring process, but it's leaving your company

really wide open. And so I love what you shared. So let's talk about some solutions. So you talked about HR leaders and teams, that they need to step up and really include fraud awareness training. I would even go a step further and say include the training in some of their learning and development programs. What would you say are some must-haves for companies to protect themselves?

Aarti Samani (19:47.298)
Yeah, absolutely. So in addition to very stringent technology and processes, it is vital to have this kind of fraud awareness training across the board, all the way from the board, the executive team, the middle managers, to the whole company. And actually, this is an opportunity for HR to really step in and be a partner to the CISO.

So if you think about the security training that happens today, it's mostly run by the security team, the CISO's office. Whether it's large or small, it doesn't matter, but it's the technology or the CISO's office that runs this kind of training. Now, what we are saying here is that the hiring managers have to think differently. The HR processes have to change from being very informal to very stringent, and insisting on

as much live interaction as possible. Now, this means that HR can really step into this role of including security awareness training within the frameworks that are already in place, whether that's a lunch and learn session, during the onboarding process, or at employee gatherings, annual employee summits, et cetera. So identify:

Where are the opportunities within the existing platforms where you are already communicating information? It doesn't always have to be an in-person or a live training, right? You can do it through newsletters, through a reminder. Like when you do the pulse survey, you can include security training in there. So find pockets of

Aarti Samani (21:37.006)
opportunities within the environment that you as HR leaders are already operating in and then include the security awareness training. So the burden is not just on the security team, it is distributed across the organization because hiring managers are across the business, right? Everybody is looking to build their teams, they are hiring and they all need to know this. That responsibility

in my opinion, falls onto the HR leaders to say, okay, hey, are you aware of this thing? Let's make you aware. Let's then look out for certain red flags. Is the candidate reluctant to get on a video call with you repeatedly? Is the quality of their video very grainy, very disturbed, and therefore very difficult for you to discern whether something is real or not? Are they faltering? Are they trying to push you to,

you know, go onto an audio call? Are they making the excuse, I have network connection issues, et cetera? Like, let's look out for some of these things, right? You need to be aware of this. You need to be aware of the warnings that the government is issuing, right? So the FBI issued a warning recently about this particular scam: planting people from

Aarti Samani (22:58.84)
sanctioned nations into organizations throughout the country and you wouldn't even know about it. And millions and millions of dollars are being drained out of the system by these fake workers. So if a warning like this has been issued, does the HR team know about it? They should be setting security alerts of their own, right? So they are receiving some of this information. They are ingesting some of this information. They are having regular

conversations with their CISO partner. So the CISO is giving them all this information. Now HR leaders really have an opportunity to step up. You can legitimately demand a seat at the executive table, if you don't already have one, because security is now not just an executive team's problem or the security team's problem. It is impacting the whole business and it is impacting HR. So you can,

Kimone Napier (23:49.344)
Mm-hmm.

Aarti Samani (23:57.888)
and you must, ask for that seat at the table and really kind of step into this role of defending the organization, through training, through keeping yourselves aware, and through information and communication exchange on both sides.

Kimone Napier (24:00.608)
Okay.

Kimone Napier (24:16.714)
You know, in hearing your responses, and thank you so much for that, the training piece, I think that needs to be stressed, because from an HR perspective, you know, depending on the size of the company, a lot of times the hiring manager is not someone in HR. For a lot of these large companies, their title could be, you know, manager, director, and they might not have any actual, you know, recruitment

experience, and certainly probably not even HR experience. And so I think that training piece definitely needs to be leveraged. Because even if we're just talking about it from a recruitment perspective, a lot of the times hiring managers make all kinds of mistakes in terms of interviewing, depending on their level of experience. And so I think there is definitely an area for someone, if they have ill intentions, to really capitalize on that,

because they might know, like, this person is not in HR. What are the chances that they'll know that, you know, I'm using a deepfake or I'm using some type of technology to try to infiltrate this company? And as you were talking, I was sitting here like, wow, this goes deeper and deeper as you talk about it. And so I think, a hundred percent, companies have to utilize some level of training.

Aarti Samani (25:30.318)
Exactly.

Kimone Napier (25:38.92)
I want to ask you too, do you think with all of this going on that the trust in remote hiring will decline because of these risks?

Aarti Samani (25:50.262)
It is inevitable, right? Because this is a trust issue, effectively, in the story that we talked about, or the case that we talked about earlier. The hiring manager trusted this individual. They passed everything. And inherently, it's a human instinct, right? When we are bringing people into the business, or in any human-to-human engagement, your default mode is to trust the person that you are interacting with.

Now, if you are interacting in real life, it's easy to trust because you know they are real, at least as human beings. Their motivations and their intentions, you have to extract out of them through the conversation, but at least they are human beings. On a digital interaction, we can't even be sure if they're real human beings on the other side, right? They could be AI-generated beings, and we don't know. So that trust is really...

Kimone Napier (26:42.774)
Yeah.

Aarti Samani (26:48.706)
We are at that, we are in that environment where we are on the edge of trusting and not trusting all the time. And I do not ever say, don't trust the people you meet. I always say, please trust your colleagues, because not trusting creates a toxic work environment. And that then has a whole load of implications on culture, which we haven't even talked about. Like, your work culture

and your risk exposure are almost directly correlated, right? So I never say do not trust your colleagues, because that has a cultural impact. What I always say is trust your colleagues 100%. The digital interactions that you have with your colleagues, or anyone else for that matter, treat those interactions with curiosity and caution, because

who you think is your colleague on the other side of the screen may be somebody impersonating your colleague; who you think is a good candidate that you want to bring on board may be someone impersonating that candidate, and it may not actually be the genuine individual. So it's a very tricky balance, and I always stay away from saying do not trust; please do trust physical interactions,

but treat the digital interactions with caution, because it's as detrimental to the person who is being impersonated as it is to you, who is the victim of this kind of impersonation fraud.

Kimone Napier (28:28.916)
I cannot agree more because I think, you know, certain people who are listening and certain leaders, I think the immediate reaction is, well, you know, maybe I should just push my workforce to go more traditional in person and maybe not do any remote work. And I would even say, and I agree with you there, that remote work is inevitable. We live in a digital age, so...

You know, even if your company and your team operate in person, there's always going to be a level of remote work. And it might not be an extreme situation where someone is trying to infiltrate your company just at the initial stage of the hiring process. What also got me thinking is that if somebody is working remotely, what are the chances that it could be someone else doing the work? It could also be that too, just to add another spin on it. And so I think,

in terms of thinking, you can't just jump to, like, okay, in order to avoid this, we're just not going to do remote work. I don't think that's the solution at all. I think the solution is to put parameters in place, like strengthening your hiring processes, making sure that you're asking some of the questions, reading body language and how they move, and just being, you know, well aware while you are conducting the interview. And then I think the most important part is, yeah, training

your HR leaders, your C-suite, and training hiring managers about this. Because this is not something that anyone would typically think of, but it can go so far and it could get out of hand very fast. And so I think the key is to put parameters in place. This was just such an eye-opening conversation. Aarti, where can people find you and learn more about your work?

Aarti Samani (30:02.83)
Thank you.

Aarti Samani (30:18.934)
On LinkedIn, so I'm very active on LinkedIn. Please read my content on LinkedIn. I put out these kinds of cases as I hear about them. And my aim is to drive as much awareness as I can, to open and expand people's minds to the art of the possible, because people on the other side of the screen are using all of the technology available to them.

They are very creative in how they use this technology. And I can't guarantee what the next way of attacking an organization or an individual will be. Like today we are talking about fake remote workers. Tomorrow there may be an entirely different technique. We cannot predict it, right? It's like whack-a-mole. So the more we are aware of the art of the possible, the more we are able to think critically.

Kimone Napier (31:04.3)
Yes.

Aarti Samani (31:15.042)
the more we are demonstrating contextual awareness about our business, around the threat landscape and the environment that we're operating in, the more likely we are to be able to spot these signs and therefore defend against them. And then work culture plays a very, very important role, because this kind of fraud, deepfake fraud, more widely known as impersonation fraud, targets people.

It does not target technology and it does not target the process, right? So if I'm on a video call with my colleague who's working remotely and I see something weird, I should feel safe enough to call my colleague through a different channel, either on a mobile phone or through other means. I should feel safe enough to call them up and say, hey, look,

did you just have a call with me? Did you and I just speak? Can you tell me what we spoke about? Like, you know, I should feel safe enough to question them without them feeling offended about it, because all I'm trying to do is protect them, and protect the business that we are working in, and myself, right? So that culture of psychological safety is very, very important. A culture without a lot of

Power Distance Index is very important. If the first time you hear from your CEO is when they ask you through a voicemail to reset their password, then that's not a great culture, because that voicemail may not be your CEO, right? So, no matter how junior you are, you have to know who your executives are. You have to have some kind of contact with them, at least during your tenure.

You have to be able to recognize the kind of brand that they are. Would they ever ask you to reset their password through a voice note, or give their password credentials through a voice note? Maybe not, right? Like these little things that we have been talking about for so long: that we need a culture of transparency. We need a culture of accessibility to your senior leaders. We need a culture of psychological safety. Now,

Kimone Napier (33:32.982)
Yes.

Aarti Samani (33:36.798)
it is more critical than ever, because if you do not have it, your risk exposure to deepfakes and other types of impersonation fraud is so much higher. So if there's one thing you take away from this conversation, it is this: make sure that you have a positive culture which is conducive to defending the organization and defending the people that you are working with,

versus making them feel exposed or unsafe.

Kimone Napier (34:09.696)
Thank you so much, Aarti, for bringing up culture, because I also have, you know, I don't know how many of you listeners have experienced this, but I've experienced it where I've worked somewhere and, you know, the team that I was on, we got emails from the leader, supposedly, in air quotes, to reset a password. But, you know, I didn't ever even think about voicemail

or anything like that. So technology is definitely getting more and more advanced. And as Aarti mentioned, you know, psychological safety and having that close-knit community with your team is going to be helpful. We have to also help each other to understand how advanced technology is, so we're not putting ourselves and the companies that we are serving at risk. Aarti, thank you so much for joining me today.

All our listeners, make sure that you go on Aarti's LinkedIn and connect with her. We will obviously put her information in our show notes. And that is a wrap on today's episode of the Hire Breakthrough Podcast. If you found this conversation valuable, make sure to like, comment, and share because trust me, this is something every HR and business leader needs to hear. Even if you're an employee, you really need to hear this information to make sure you're not putting the company at risk.

So until next time, stay tuned for the next episode.