A tweet went viral claiming consulting is dead because a woman at a top consulting firm had ChatGPT write every word of a presentation for a Fortune 50 client. That sparked an idea: what if you could build an app that generates consultant-style reports from any website, using AI to do the research and writing?
That's exactly what we built. In 12 minutes, from zero to a deployed app, using Cursor, OpenAI's o1, V0, Firecrawl, and Patched.
The stack
Here's what each tool handles:
- OpenAI o1: The reasoning model that generates the actual report. What's unique about o1 is that it thinks through your instructions before responding. No system messages or tool calls; just a prompt and a structured response.
- Firecrawl: Handles web crawling. You give it a URL, it recursively crawls the site and returns clean markdown. We limit it to 5 pages to keep things fast.
- V0: Vercel's AI UI generator. You describe what you want, and it builds a React component with all the dependencies installed. We use it to scaffold the entire frontend.
- Cursor: The AI-powered code editor. Composer mode lets you reference files, create new ones, and edit existing ones all from the same prompt.
- Patched: Automates code reviews, docs, and patches. We use it at the end to auto-generate a README from the codebase.
Scaffolding the frontend with V0
The first prompt to V0 is straightforward: create a React component for a consultant report generator using TipTap (a lightweight rich-text editor, like a mini Google Docs), a random report button with example data, and a header/footer.
V0 streams out the component and it works right away. Click "Generate random report" and you get interactive content inside the TipTap editor with basic formatting, centering, bold, and the usual rich-text controls.
Next prompt: add a URL input, an instructions input, and a submit button above the TipTap editor. Now we have the full layout: a URL field, instructions, a generate button, and the editor below.
The really cool part about V0 is that when you click "Add to codebase" and run its setup command, it installs all the dependencies automatically. TipTap, Shadcn components, everything. Check the package.json and it's all there. No manual installs needed.
Setting up the Next.js project
From there it's standard Next.js setup:

```shell
bun create next
```

Run the V0 command to install the component, clear out the default homepage, drop the V0 component in, and start the dev server. Everything works locally right away.
Wiring up the backend step by step
Here's where Cursor's Composer mode shines. Instead of trying to do everything in one prompt, you build step by step and make sure each piece works before moving on.
Step 1: Mock data first.
Open Composer, pass in the V0 component and the page file as context, and say: "Set up a route that returns HTML with the instructions and some mock data on submission, and replace what's in the TipTap editor. Use App Router."
Composer creates the API route and updates the frontend to handle the request. Test it: paste a random URL, type "test", submit, and mock HTML appears in the editor.
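The mock-data route can be sketched roughly like this. It's a minimal App Router route handler; the file path, field names, and placeholder report are illustrative assumptions, not the exact code Composer generated.

```typescript
// app/api/generate/route.ts (hypothetical path)
// Mock-data step: echo the submitted instructions back as placeholder HTML,
// just enough to verify the frontend swaps the response into TipTap.
export async function POST(request: Request): Promise<Response> {
  const { url, instructions } = await request.json();

  // Placeholder report; real crawling and LLM calls come in later steps.
  const html = [
    "<h1>Consultant Report</h1>",
    `<p>Target site: ${url}</p>`,
    `<p>Instructions: ${instructions}</p>`,
    "<h2>Findings</h2>",
    "<ul><li>Placeholder finding one</li><li>Placeholder finding two</li></ul>",
  ].join("\n");

  return Response.json({ html });
}
```

On the frontend, the submit handler just POSTs the two fields and sets the returned `html` as the TipTap editor's content.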
Step 2: Wire up OpenAI o1.
One thing to note about o1: it doesn't support system messages or tool calls. You send everything as a single user prompt.
Tell Composer to send the instructions to o1 and ask for HTML back. It updates the route to call o1-preview instead of returning mock data. Test it: you get a real response from OpenAI in the editor.
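A minimal sketch of that call, using `fetch` against the chat completions endpoint rather than the SDK. Because o1 accepts no system message, the role, output format, and source data all get folded into one user prompt; the prompt wording and function names here are assumptions.

```typescript
// Fold everything o1 needs into a single user prompt (no system message).
export function buildPrompt(instructions: string, context = ""): string {
  return [
    "You are writing a consultant-style report.",
    "Respond with clean HTML only (h1, h2, p, ul); no markdown, no code fences.",
    context ? `Source material:\n${context}` : "",
    `Client instructions:\n${instructions}`,
  ]
    .filter(Boolean)
    .join("\n\n");
}

// Call o1-preview with that single prompt and return the HTML it produces.
export async function generateReport(instructions: string, context = ""): Promise<string> {
  const res = await fetch("https://api.openai.com/v1/chat/completions", {
    method: "POST",
    headers: {
      "Content-Type": "application/json",
      Authorization: `Bearer ${process.env.OPENAI_API_KEY}`,
    },
    body: JSON.stringify({
      model: "o1-preview",
      messages: [{ role: "user", content: buildPrompt(instructions, context) }],
    }),
  });
  const data = await res.json();
  return data.choices[0].message.content ?? "";
}
```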
Step 3: Add Firecrawl for web crawling.
This is where the app gets interesting. Firecrawl's Node SDK lets you crawl a URL and check the crawl job status. Copy a couple examples from the Firecrawl documentation into Composer (or use Cursor's docs feature to index the full page) and say: "Use the URL from the request to crawl with Firecrawl. Limit results to 5 pages."
The updated route now:
- Initializes the Firecrawl client
- Crawls the URL
- Passes the crawled content plus the user's instructions to o1
- Returns the generated HTML report
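The crawl half of that route can be sketched against Firecrawl's REST API (the article uses the Node SDK; the endpoint shapes and the character cap below are assumptions worth checking against the Firecrawl docs):

```typescript
const FIRECRAWL_API = "https://api.firecrawl.dev/v1";

// Pure helper: join crawled pages into one prompt context, capped so the
// downstream o1 call stays within a sane size budget (cap is an assumption).
export function combinePages(pages: { markdown: string }[], maxChars = 40_000): string {
  return pages.map((p) => p.markdown).join("\n\n---\n\n").slice(0, maxChars);
}

// Start a crawl limited to 5 pages, then poll the job until it finishes.
export async function crawlSite(url: string, apiKey: string): Promise<string> {
  const start = await fetch(`${FIRECRAWL_API}/crawl`, {
    method: "POST",
    headers: { "Content-Type": "application/json", Authorization: `Bearer ${apiKey}` },
    body: JSON.stringify({ url, limit: 5, scrapeOptions: { formats: ["markdown"] } }),
  }).then((r) => r.json());

  while (true) {
    const job = await fetch(`${FIRECRAWL_API}/crawl/${start.id}`, {
      headers: { Authorization: `Bearer ${apiKey}` },
    }).then((r) => r.json());
    if (job.status === "completed") return combinePages(job.data);
    if (job.status === "failed") throw new Error("Crawl failed");
    await new Promise((resolve) => setTimeout(resolve, 2000)); // poll every 2s
  }
}
```

The route then passes `crawlSite`'s output, along with the user's instructions, into the o1 prompt from the previous step.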
Install both SDKs:

```shell
bun add @mendable/firecrawl-js openai
```

Test it with a real URL. The app crawls the site, sends the data to o1, and generates a structured report. It comes back as valid HTML, but everything is flat: same font size, no hierarchy.
Styling the TipTap editor
The fix is simple. In your global.css, add broad styles targeting the base HTML elements inside the TipTap editor. You can ask Cursor to generate these or grab them from the TipTap documentation. Either way, you're targeting h1, h2, p, ul, and other standard elements scoped to the editor.
After that, the report looks good. And here's the bonus: TipTap content copies cleanly into Google Docs with the structure intact. Headings, lists, formatting: it all carries over.
Auto-generating a README with Patched
Once the app is working, push it to GitHub. Now comes the documentation step, and this is where Patched saves a lot of time.
Patched is a platform for building AI workflows. You can automate code reviews, docs, and patches. It also has an open-source project if you want to use your own LLMs.
The workflow is simple:
- Add your repository in Patched
- Click "Generate README"
- Select the branch and output directory
- Click "Patch"
Patched analyzes the entire codebase, understands what each file does, and generates a comprehensive README. The result breaks down the components, the API route, the inputs and outputs, setup instructions, environment variables, and even deployment options.
For a developer shipping a lot of projects, maintaining good documentation for every repo is a real challenge. Patched handles it in a couple of clicks and honestly produces a better README than most developers would write manually.
Deploying to Vercel
Last step: deploy to Vercel. Create a new project, point it at the repo, and deploy. The only manual step is adding your environment variables (Firecrawl API key and OpenAI API key) in the Vercel dashboard. After that, you have a live, hosted consultant report generator.
What we covered
This was a full journey from idea to deployment:
- V0 scaffolded the frontend with TipTap, inputs, and all dependencies installed automatically
- Cursor Composer wired up the backend step by step: mock data first, then OpenAI, then Firecrawl
- Firecrawl handled the web crawling, returning clean markdown from any URL
- OpenAI o1 generated structured HTML reports from the crawled data and user instructions
- Patched auto-generated the README from the codebase
- Vercel deployed the whole thing in one click
The key takeaway: build incrementally. Don't try to get everything working in one prompt. Start with mock data, verify the frontend works, then swap in real APIs one at a time. Each step should be testable on its own.
If you want to try Firecrawl for your own projects, grab a free API key at firecrawl.dev.
Frequently Asked Questions
What does this app do?
It's a consultant report generator. You paste a URL and some instructions, Firecrawl crawls the site and returns structured data, then OpenAI's o1 model turns that data into a formatted HTML report inside a TipTap editor. You can edit the report right in the browser and copy it into Google Docs.
Why use Firecrawl instead of scraping manually?
Firecrawl handles the entire crawl pipeline for you. You give it a URL and a page limit, it recursively crawls the site and returns clean markdown. No need to build your own scraper, handle pagination, or deal with JavaScript rendering.
What is OpenAI's o1 model?
OpenAI's o1 is a reasoning model that thinks through instructions before responding. Unlike GPT-4, it doesn't support system messages or tool calls, but it excels at following detailed instructions and producing structured output like HTML reports.
Can I use a different LLM instead of o1?
Yes. The route sends a prompt with the crawled data and instructions to the model. You could swap o1 for GPT-4, Claude, or any other model that accepts a text prompt and returns HTML.
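One way to sketch that swap: keep a single request shape and switch on a small provider config, since many providers expose OpenAI-compatible chat endpoints. The provider names, model IDs, and env var names below are illustrative assumptions.

```typescript
// Per-provider settings; add entries here to support more backends.
interface ProviderConfig {
  endpoint: string;
  model: string;
  envKey: string; // name of the env var holding that provider's API key
}

export const providers: Record<string, ProviderConfig> = {
  o1: {
    endpoint: "https://api.openai.com/v1/chat/completions",
    model: "o1-preview",
    envKey: "OPENAI_API_KEY",
  },
  gpt4: {
    endpoint: "https://api.openai.com/v1/chat/completions",
    model: "gpt-4o",
    envKey: "OPENAI_API_KEY",
  },
};

// Build the provider-agnostic request body the route would POST.
export function buildRequestBody(provider: string, prompt: string) {
  const cfg = providers[provider];
  if (!cfg) throw new Error(`Unknown provider: ${provider}`);
  return { model: cfg.model, messages: [{ role: "user" as const, content: prompt }] };
}
```

Providers that aren't OpenAI-compatible (Anthropic's native API, for instance) would need their own request builder, but the route's crawl-then-generate flow stays the same.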
What is Patched and why is it used here?
Patched is a platform for building AI workflows that automate code reviews, documentation, and patches. In this project, it's used to auto-generate a README for the GitHub repository by analyzing the codebase and producing structured documentation in a single click.