Thinking about building a "Deep Research" tool for job seekers

I own a couple of job boards and have access to a huge dataset of job listings, salary ranges, and hiring trends. Instead of making job seekers scroll through endless postings and tweak filters manually, I’m thinking about building a deep research tool for job search, kind of like how ChatGPT’s Deep Research digs deep to provide structured, thoughtful answers instead of just dumping links.

The tool would be interactive and hyper-personalized. Users would upload their resume and answer a few targeted questions (or the AI could ask clarifying questions as needed). Then, instead of returning a basic job list, it would generate a multi-page report (roughly sketched in code after the list) that includes:

  • Curated job matches, ranked and explained (not just “you qualify,” but why you do, or where you fall short).
  • Market insights, such as salary ranges, hiring trends, and which companies/industries are actively recruiting.
  • Skills gap analysis, highlighting missing qualifications and suggestions for upskilling.
  • Interactive elements, allowing users to tweak preferences, mark interesting jobs, and even refine the search dynamically.
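To make that concrete, here is a rough sketch of how such a report could be modeled. Every name and field below is an illustrative assumption, not a fixed schema:

```python
from __future__ import annotations
from dataclasses import dataclass, field

# Illustrative data model for the generated report (field names are assumptions).
@dataclass
class JobMatch:
    title: str
    company: str
    score: float        # 0..1 ranking score
    why_matched: str    # explanation of the fit, not just "you qualify"
    gaps: list[str]     # qualifications the candidate is missing

@dataclass
class MarketInsights:
    salary_range: tuple[int, int]   # e.g. (70_000, 95_000)
    hiring_trend: str               # short narrative, e.g. "postings up 12% YoY"
    active_companies: list[str]

@dataclass
class ResearchReport:
    matches: list[JobMatch] = field(default_factory=list)
    market: MarketInsights | None = None
    skills_gap: dict[str, str] = field(default_factory=dict)  # missing skill -> upskilling tip
```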

It wouldn’t be instant—probably 10-20 minutes to generate, since it’s actually processing and analyzing rather than just running a keyword search. Users would get notified when it’s ready.
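On the plumbing side, the generation could simply run as a background job that pings the user when done. A minimal sketch, where notify_user is an assumed placeholder for whatever email or push mechanism you end up using:

```python
import threading
import uuid

def generate_report(resume: str, answers: dict) -> dict:
    # Placeholder for the long-running analysis pipeline (the "10-20 minutes" part).
    return {"status": "done"}

def notify_user(user_email: str, report_id: str) -> None:
    # Assumption: swap in email, push, or in-app notification here.
    print(f"Report {report_id} is ready for {user_email}")

def submit_request(user_email: str, resume: str, answers: dict) -> str:
    report_id = str(uuid.uuid4())

    def worker() -> None:
        generate_report(resume, answers)
        notify_user(user_email, report_id)

    # Return immediately; the heavy work happens in a background thread.
    threading.Thread(target=worker, daemon=True).start()
    return report_id
```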

Does this sound like something that would be useful? Would you trust an AI to “do the research” for you instead of manually searching job boards? Any potential pitfalls I should consider?

Why would it take 10 to 20 minutes?

OpenAI’s research tool is most probably slow because it checks pages, images, and PDFs sequentially, one page or image after the other.

It doesn’t need to be like this. You can run thousands of analyses in parallel and make it almost as fast as a keyword search (ok, not exactly, but 20 minutes is just a very poor baseline).

Just suggesting. Even a single o3 prompt execution takes dozens of seconds, and in this case I suppose dozens or even hundreds of LLM calls would be needed. I expect the system to pull the data first (jobs, trends, salaries, etc.) and then reason based on that data.
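To make the parallelism point concrete: once the jobs, trends, and salaries are pulled, the per-job reasoning calls can be fanned out concurrently instead of run one after another. A minimal sketch with the OpenAI Python client; the model name and prompt are placeholders, and in practice you would cap concurrency to respect rate limits:

```python
import asyncio
from openai import AsyncOpenAI  # assumes the openai>=1.x package

client = AsyncOpenAI()

async def analyze_job(resume: str, job: dict) -> str:
    # One reasoning call per pulled job posting (model name is a placeholder).
    resp = await client.chat.completions.create(
        model="o3-mini",
        messages=[{
            "role": "user",
            "content": (
                f"Resume:\n{resume}\n\nJob posting:\n{job['description']}\n\n"
                "Explain how well this candidate fits and what is missing."
            ),
        }],
    )
    return resp.choices[0].message.content

async def analyze_all(resume: str, jobs: list[dict]) -> list[str]:
    # Fan the calls out in parallel: wall-clock time is roughly the slowest
    # single call, not the sum of dozens or hundreds of sequential calls.
    return await asyncio.gather(*(analyze_job(resume, job) for job in jobs))

# results = asyncio.run(analyze_all(resume_text, pulled_jobs))
```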

I was working for a job platform in my very early days, I think around 2002/2003. We did just that: scraping thousands of pages and extracting data on which companies were looking for employees, like a real search engine. Manually created regular expressions, omg… and complex SQL to analyze that stuff. I mean, you can do a lot with SQL alone.
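On the “a lot with SQL alone” point, the trend and salary aggregation really is mostly a query once the scraped listings sit in a table. A small sketch against SQLite, with an assumed jobs table layout:

```python
import sqlite3

# Assumed table: jobs(title, company, location, salary_min, salary_max, posted_at)
conn = sqlite3.connect("jobs.db")

rows = conn.execute("""
    SELECT company,
           COUNT(*)        AS open_roles,
           AVG(salary_min) AS avg_min_salary,
           AVG(salary_max) AS avg_max_salary
    FROM jobs
    WHERE posted_at >= DATE('now', '-30 days')
    GROUP BY company
    ORDER BY open_roles DESC
    LIMIT 20
""").fetchall()

# Who has been hiring hardest in the last 30 days, and roughly at what pay.
for company, open_roles, avg_min, avg_max in rows:
    print(company, open_roles, round(avg_min or 0), round(avg_max or 0))
```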

What exactly do you have in mind when you say “prompt o3 and let it reason”?

Any specific data that comes to your mind, or are you looking for specific trends?

I mean learning-path creation with an LLM - yeah, cool. Like “you are here, you want to take this job, and it will take you 3 years to get the certifications required for it”?

Or the job-search companion? I feel like that’s a really depressing thing to think about: “people will spend so much time on their job search that they need a tool, and they even become bound to it”?

Tbh, none of the stuff that you’ve listed needs a GPT model. It looks like a 5-10 second task to grab such data.

When developing this, you could make use of a search API and let the AI search online itself. However, keep in mind that it should always output sources as well, in case the job seekers want to read through what the AI found themselves.
You can web-scrape yourself or use a library/API for that as well.
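A minimal sketch of the scrape-and-keep-sources idea, using requests and BeautifulSoup; the URL and the CSS selector are placeholders you would adapt per board:

```python
import requests
from bs4 import BeautifulSoup  # pip install requests beautifulsoup4

def fetch_listings(url: str) -> list[dict]:
    html = requests.get(url, timeout=10).text
    soup = BeautifulSoup(html, "html.parser")
    listings = []
    # Placeholder selector; every job board needs its own parsing rules.
    for posting in soup.select(".job-posting"):
        listings.append({
            "title": posting.get_text(strip=True),
            "source": url,  # always keep the source so seekers can verify it themselves
        })
    return listings

# listings = fetch_listings("https://example.com/jobs")
```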

I’m sure you could cook up something absolutely beautiful! :hugs:

How do you deal with fake jobs?
