We’ve all heard about how AI is coming for our jobs, right?

But we’re okay. Sure, AI can make art and drive cars and analyze financial data to predict the market. But can it diagnose a rare case of diabetes insipidus in a boxer? Or manage a complicated immune-mediated thrombocytopenia in a diabetic with Cushing’s?

Medicine will surely be one of the last jobs to go—won’t it?

I wouldn’t be so sure about that.

We’re already seeing AI creep its way into the veterinary profession.

AI can be used to monitor pets’ eating habits, sleeping patterns, and activity levels. It can be used for disease surveillance in populations of animals. It’s being used to analyze x-rays and other diagnostic images. And yes, AI algorithms are capable of diagnosing patients and devising treatment plans.

Quick quiz…what are the four tasks only licensed veterinarians can perform? Surgery, diagnosis, prognostication, and prescribing medications. How many of those jobs can AI already do? How many is it already better at than people?

But, Lauren, what about surgery, you say. We’ve still got that.

Except robotic-assisted surgery has been used in human medicine for decades. How much longer before it’s not just assisting?

I know this is depressing, but I swear, I’m not here on some nihilistic mission to make you feel useless and replaceable, because you’re not.

In fact, there’s one very important job that only humans can do—and that’s provide the human touch.

This is why it drives me bonkers when I see vets who insist that they’re here to practice medicine, not to coddle clients, help them figure out how to pay their bills, and listen to their problems.

Because like it or not, your medical expertise is replaceable. Or at least well on its way to being so. But your compassion and heart—that’s something only you can provide.

Which isn’t to say AI won’t try. In fact, it already is. I have a friend who, in an imposter syndrome funk, turned to ChatGPT and asked it to reframe the situation to help them see things more positively. And it said all the right things. I know; I read it. And yet I couldn’t help but get a serious case of the ick as I did.

That ick is a phenomenon known as the uncanny valley: the closer a non-human entity gets to seeming human without quite getting there, the more unsettling it becomes to actual humans. Which means the closer AI gets to being able to provide that human experience, the less likely it is to succeed.

And besides, even if AI can say all the right things, it can’t truly empathize. It can’t hold a grieving client’s hand as they say good-bye to the cat they’ve had since they were ten. It can’t get down on the floor and coo all over their beloved golden retriever. No matter how hard it tries, it can’t be human.

So, if you want to AI-proof your job, start by focusing on the client and patient experience.