If you’ve been on the internet lately, you’ve probably heard about ChatGPT. For the uninitiated, ChatGPT is an AI tool that uses natural language processing to output text that feels more natural and human-like than most of the tools that came before it. ChatGPT is one of the best-known natural language processing tools publicly available today, but it’s not the only one. Alternatives, like Google’s Bard and Microsoft’s Bing Chat, are also emerging. One thing’s for sure—artificial intelligence is having a moment.
For many people, this raises the question: if ChatGPT is so powerful, can it help make decisions about work? Can you use ChatGPT for hiring purposes?
We sat down with ChatGPT to pose some questions. After all, if the tool is really that smart, it’s only fair to give it a chance to speak for itself.
How Does ChatGPT Work?
First, let’s go over some basics of ChatGPT’s mechanics. I don’t claim to be an expert on all things AI, so I asked ChatGPT to help explain itself.
Here’s what it said:
I arrive at my answers through a combination of analyzing the input provided, retrieving and processing relevant information from my training data, and generating a response based on patterns and structures observed in the data. I comprehend the context, identify key terms, and apply language generation algorithms to generate coherent and informative responses.
While I strive to provide accurate answers, it’s important to note that I can make mistakes or provide incomplete or outdated information. Therefore, it’s always advisable to verify information from reliable sources and exercise critical thinking.
This is a good start, but I’m still curious about ChatGPT’s training data.
I ask it to tell me more about that data—and, in an effort to not bore my readers to tears, I ask it to keep its answer brief.
The specific details of the dataset I was trained on have not been publicly disclosed by OpenAI, the organization behind ChatGPT. However, I can provide a general overview. I was trained on a vast corpus of publicly available text from the internet, including books, articles, websites, and other sources. It’s important to note that the dataset does not include proprietary, classified, or confidential information.
Interesting, if a tad mysterious!
So, here’s what we definitely know: ChatGPT’s answers come from a bunch of different kinds of texts that were publicly available at the time it was trained. It’s fair to say that ChatGPT doesn’t know anything people don’t already know—it’s just really good at absorbing all that information and finding patterns.
There’s one part of this that confuses me. I’ve heard that ChatGPT continues to learn, but this makes it sound like its answers all come from a single training period. I want to make sure I understand this, so I ask: Do you have access to any information that was not available in this data set?
It tells me:
No, as an AI language model, I do not have direct access to information or data that was not included in the training data. My responses are generated based on the patterns and information present in the data I was trained on, up until my knowledge cutoff in September 2021. I cannot retrieve or access information beyond that point.
To clarify a little more, I ask: Do you learn based on user input?
As an AI language model, I do not have the ability to learn based on user input or adapt in real-time. I do not retain information about specific user interactions or incorporate individual user inputs into my training.
This actually surprises me. I had assumed that ChatGPT would be capable of using its interactions with users to learn more about the world and improve its answers. But apparently not—by and large, its knowledge comes strictly from its training data, which cuts off in September 2021.
Benefits and Pitfalls of Using ChatGPT to Research Hiring Software
ChatGPT can’t directly access the internet, which definitely limits how much it can help with hiring. It cannot, for instance, give feedback on your website, tell you how your job ads are performing, or screen candidates for you (even if it could, you might not want it to).
But ChatGPT is often used as a research tool. So can it help you research new hiring tools and strategies?
Short answer: theoretically. Let’s look at some of the pros and cons of using ChatGPT to research hiring software.
Benefits of Researching Software With ChatGPT
1. It’s Quick and Easy
Compared to sifting through dozens of web pages, asking ChatGPT to do research for you is pretty efficient. The major caveat here is that you will have to fact-check everything it tells you, which can also be time-consuming. But for pulling together initial research, ChatGPT can be a great place to start.
2. You Can Ask Follow-Up Questions
Personally, this is my favorite thing about doing research with ChatGPT. Let’s say you’re reading an article about texting candidates. You might wonder: could I use the same tools to text my current employees?
Traditionally, you’d have to go digging for that information somewhere else. But with ChatGPT, you can just ask.
Granted, the answer ChatGPT gives you could still be misleading or out of date. But if your question is covered by ChatGPT’s training materials, getting an answer could be as easy as asking the question.
3. There Are Fewer Distractions
Researching with ChatGPT, you don’t have to wade through marketing materials or flashy websites to get answers. ChatGPT provides information in a pretty standard format, which can also improve accessibility.
That being said, taking away the distraction of marketing isn’t the same as taking away marketing itself. ChatGPT was trained, in part, on marketing materials. Therefore, it’s likely to repeat the biases it learned from these sources.
Pitfalls of Researching Software With ChatGPT
1. The Scope Is Limited
ChatGPT was trained on a specific data set and does not continually access new data. As a result, its knowledge is limited. If those training materials didn’t include information about a certain software option, ChatGPT won’t be knowledgeable about it. It also may not account for new releases, developments, mandates, or any other updates that would affect its research.
ChatGPT has a lot of generalized information to draw from, but it doesn’t have the discernment or niche industry knowledge that an actual industry expert might have. It’s not, therefore, always the greatest at anticipating problems or suggesting lesser-known solutions.
As ChatGPT itself told me:

My responses are based on the data I have been trained on, which may not include all of the latest information or perspectives. As a result, I may not be able to provide a comprehensive overview of the hiring software landscape.
2. Bias and Misinformation
Because ChatGPT’s responses are based on a wide range of materials, it may reflect biases or misinformation that was present in that training data. As a result, the responses generated by ChatGPT may not always be reliable, accurate, or fair.
In its own words:

My responses are generated based on patterns in a wide range of sources from the internet. These sources may have inherent biases or inaccuracies that can potentially influence the information I provide.
3. Outdated Information
This is a big one. Although ChatGPT can be a great way to find information that isn’t time-sensitive, it’s important to remember that the platform is inherently out of date. Currently, ChatGPT’s training only covers a selection of materials that were available up to September 2021.
That’s a big problem if you’re trying to research software. Most platforms change over the years. ChatGPT currently cannot offer an accurate, up-to-date assessment of any software, website, or organization.
ChatGPT acknowledges as much:

My training data goes up until September 2021, so I may not have the most up-to-date information on the latest features, pricing, or user reviews of specific software solutions. It’s always recommended to visit official websites, read current user reviews, and consult recent sources for the most accurate and current information.
There’s one more sneaky pitfall of ChatGPT that I have to mention. Basically: ChatGPT has a habit of misrepresenting itself.
While I wrote this article, ChatGPT repeatedly told me that one of its strengths was its ability to provide up-to-date information. When I asked it to clarify if it was really up-to-date, it apologized, took back the claim, and explained that it’s only up-to-date up to September 2021. If I didn’t already know about the 2021 knowledge cutoff, I might have been fooled.
This is a great example of why it’s important to fact-check ChatGPT’s information. Even simple claims about how it works can, at times, be absolutely untrue.
Researching Software With ChatGPT: Dos and Don’ts
DO Treat ChatGPT as a Supplemental Tool
ChatGPT can be a great tool to augment your existing research strategy. It can help you brainstorm new ideas or suggest different research paths—as long as you’re prepared to follow up with more rigorous research elsewhere.
DON’T Rely on ChatGPT to Make Decisions
Think of ChatGPT as an advanced text generator combined with an (outdated) search engine. Its results can be useful and impressive, but it’s not a replacement for other research or expert advice. Quite simply, it’s not a primary source of information.
You might find that ChatGPT’s answers contradict information you find elsewhere on the web. In this case, remember that ChatGPT is just detecting patterns from its 2021 data set. Odds are good that the other information is more up-to-date, nuanced, or accurate.
DO Use ChatGPT for Broad, Evergreen Research
Software changes rapidly, but plenty of research topics are less time-sensitive. ChatGPT can help you understand an industry, common problems, and potential solutions. One great use of ChatGPT in hiring research, for instance, would be to ask it to help you think of features you might need in recruitment software—it just won’t be so great at telling you what platform has these features.
DON’T Assume ChatGPT Has Discernment
ChatGPT seems smart, but that doesn’t mean it has a brain. It doesn’t necessarily understand the intricacies of your business, industry, or goals.
DO Cross-Check Any Information ChatGPT Gives You
Be prepared to fact-check ChatGPT’s answers extensively. You can increase the odds of an accurate response by making your inquiry very specific, but even then, you’ll definitely want to find another source to verify the claim.
If an answer gives you pause, you can also ask ChatGPT to explain how it arrived at its conclusion. ChatGPT might not give you an exact source for its data, but sometimes it offers insight into its process that can help you decide whether the information is likely to be accurate.
DON’T Assume ChatGPT Is Fully Accurate
Don’t assume ChatGPT has access to up-to-date information, particularly in rapidly changing fields like technology. Its knowledge cutoff means that it can’t accurately report on anything that has changed since September 2021. But even for more general inquiries, ChatGPT sometimes gives answers that are simply false. I’ve encountered incorrect information many times while using ChatGPT—including while writing this article.
No source of information can be 100% accurate 100% of the time. But you have to be extremely vigilant to identify ChatGPT’s incorrect information, because it doesn’t always give you any indication that its answer may be wrong.
To see this problem in action, consider an exchange I had with ChatGPT. I asked whether healthcare employers should be worried about being penalized for high turnover rates. Completely incorrectly, it told me no.
Since CMS has only recently begun factoring retention rates into facilities’ star ratings, I wasn’t surprised ChatGPT wasn’t up to date. But depending on how I phrased my question, it often didn’t acknowledge that its information was outdated.
In this case, ChatGPT is giving me objectively incorrect and potentially damaging information.
To the developers’ credit, ChatGPT will sometimes interject reminders to fact-check in case things have changed. But it frequently offers no such reminder. In either case, it’s entirely up to the user to verify everything ChatGPT says.
6 Ways to Use ChatGPT While Researching Hiring Software
Just because there are pitfalls to using ChatGPT as a primary research tool doesn’t mean you can’t use it at all. Here are just a few safer ways to use ChatGPT in your software research.
- Ask ChatGPT to help you brainstorm software features that could help accomplish your goals
- Ask ChatGPT to suggest questions you could ask during a software demo
- Explain the staffing problems you’re facing and ask ChatGPT what kinds of software tools might help
- Before implementing new software, ask ChatGPT what kinds of issues you might expect to encounter and how to avoid them
- Ask it to explain the pros and cons of different pricing models before you purchase new software
- Ask for tips on how to introduce new software to your team
TL;DR: Should You Use ChatGPT to Research Software?
ChatGPT is a great tool for brainstorming, text generation, and general research. But it can’t directly view the web, it can only access data up to September 2021, and it has a habit of providing incorrect information without any indication of what it’s doing. For these reasons, it’s risky to use it to make decisions that could affect your business.
Be cautious about asking ChatGPT questions about specific platforms, pricing, or federal and state-level regulations. It’s much safer to use ChatGPT as a brainstorming tool rather than as a source of hard information. If you do use ChatGPT, it’s vital to understand exactly what it can and cannot do.
Interested in learning how you can automate your hiring while also accessing guidance from real industry experts? Apploi’s all-in-one platform can help you efficiently attract, hire, and manage healthcare talent all from one platform. Speed up your hiring with tailored advice from a team who understands your industry. Schedule a demo today.