I Built a SaaS Product With No Engineering Background. Here's What Actually Worked.
My career has spanned sales at Sony, small business consulting at ANZ Bank, and five years teaching ATAR Computer Science to high school students in Perth. I understood technology concepts well enough to teach them - but I had never built a commercial software product. I could not have told you the difference between a framework and a library two years ago. And I just built and launched a SaaS product that handles user authentication, payment processing, AI-powered report generation, and database management.
This is not a "learn to code in 30 days" success story. It is a more honest account of what happens when someone with business and education experience but zero software engineering background tries to build a real product in 2026 - with AI as their engineering partner.
The Background
My consulting work through Cognitiv often involved assessments - helping organisations measure things like leadership capability, team dynamics, and coaching readiness. In the classroom, I was designing student assessments and writing individual feedback reports every term. The same bottleneck kept appearing in both worlds: the tools were fine for collecting responses, but turning that data into meaningful, personalised feedback was not. Either people got templated feedback (generic) or someone wrote individual reports manually (unsustainable).
I knew what the product should do. I had the domain expertise from years of working with coaches, educators, and consultants. I had designed assessments in boardrooms and classrooms alike. What I did not have was the ability to build software.
In the past, that would have meant hiring developers, which would have meant raising capital or spending savings on an MVP that might not work. In 2026, there is another option.
The Stack
For context, here is what Scorafy runs on:
- Next.js - the web framework (React-based, handles both the frontend and backend)
- TypeScript - the programming language (JavaScript with type checking)
- Tailwind CSS - the styling system (utility classes instead of writing custom CSS)
- Supabase - the database and authentication (like Firebase but with Postgres)
- Stripe - payment processing
- Claude AI - both the development assistant and the engine that powers Scorafy's report generation
- Vercel - hosting and deployment
I did not choose this stack through careful technical evaluation. I chose it because it is what the AI tools I was using could work with most effectively, and because Next.js plus Vercel has the smoothest deployment experience I could find. You push your code and it just works.
What AI-Assisted Development Actually Looks Like
There is a romanticised version of this story where you describe what you want and AI writes the code perfectly on the first try. That is not what happens.
What actually happens is closer to pair programming with a very patient, very knowledgeable colleague who sometimes misunderstands what you are asking for. A typical interaction looks like this:
I describe what I want to build - say, a page where users can create assessment questions with drag-and-drop reordering. The AI writes the code. I deploy it. Something does not work the way I expected - maybe the drag-and-drop works but the order does not save to the database. I describe the problem. The AI fixes it. I deploy again. A different edge case breaks. We iterate.
Most features took three to five iterations to get right. Some took more. The AI is excellent at writing functional code quickly, but it does not always understand the business logic behind what you are building. You have to be precise about what you want and patient about getting there.
The biggest skill I developed was not coding - it was describing problems clearly. The better I got at articulating exactly what should happen, what was happening instead, and what the constraints were, the better the AI's output became. It turns out that years of writing client briefs, lesson plans, and assessment rubrics is directly transferable to writing AI prompts.
What Was Easier Than Expected
Database design. I expected this to be impenetrable. It was not. Supabase gives you a visual interface for creating tables and relationships. The AI could explain concepts like foreign keys and row-level security in plain English. I still do not think in SQL natively, but I can design a schema, write queries, and troubleshoot issues.
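To make those concepts concrete, here is a hedged sketch of what a Supabase schema with a foreign key and a row-level security policy looks like. The table and column names are illustrative, not Scorafy's actual schema:

```sql
-- Hypothetical two-table schema: assessments belong to a user,
-- and questions belong to an assessment via a foreign key.
create table assessments (
  id uuid primary key default gen_random_uuid(),
  owner_id uuid references auth.users (id) not null,
  title text not null
);

create table questions (
  id uuid primary key default gen_random_uuid(),
  assessment_id uuid references assessments (id) on delete cascade,
  prompt text not null,
  position int not null
);

-- Row-level security: each user can only read their own assessments.
alter table assessments enable row level security;

create policy "owners can read their assessments"
  on assessments for select
  using (auth.uid() = owner_id);
```

The `on delete cascade` clause means deleting an assessment automatically removes its questions - exactly the kind of relationship the AI could explain in plain English.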
Payment integration. Stripe's documentation is genuinely good, and the AI has clearly been trained on thousands of Stripe implementations. Getting subscriptions, checkout flows, and webhooks working was smoother than I expected. The trickiest part was not the code - it was understanding Stripe's concepts (products vs prices vs subscriptions vs checkout sessions).
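The relationship between those Stripe concepts is easier to see in code than in prose. This is an illustrative sketch, not Stripe's real API types: a Product describes what you sell, a Price attaches an amount and billing interval to a Product, and a Subscription points a customer at a Price:

```typescript
// Simplified stand-ins for Stripe's object model (illustrative only).
interface Product { id: string; name: string }
interface Price {
  id: string;
  productId: string;
  unitAmount: number;              // amount in cents
  interval: "month" | "year";
}
interface Subscription { id: string; customerId: string; priceId: string }

// Walk the chain Subscription -> Price -> Product to describe
// what a customer is actually paying for.
function describeSubscription(
  sub: Subscription,
  prices: Price[],
  products: Product[]
): string {
  const price = prices.find((p) => p.id === sub.priceId)!;
  const product = products.find((p) => p.id === price.productId)!;
  return `${product.name}: $${(price.unitAmount / 100).toFixed(2)}/${price.interval}`;
}
```

A Checkout Session sits in front of all this: it is the hosted payment page that, on success, creates the Subscription for the chosen Price.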
Deployment. Vercel's deployment process is effectively "connect your GitHub repository and push." I went from code on my laptop to a live website with a custom domain in under an hour. This would have been a multi-day DevOps exercise five years ago.
What Was Harder Than Expected
Authentication. Getting user login, signup, password reset, and session management right was the most frustrating part of the entire build. There are a dozen ways to do it, each with subtle trade-offs, and the error messages when something goes wrong are often unhelpful. I rebuilt the auth system three times before landing on something stable.
State management. Understanding when data lives on the server versus the client, when to refetch, when to cache, and how to keep the UI in sync with the database - this was conceptually harder than writing the code itself. The AI could write the code for any approach I asked for, but knowing which approach to ask for required understanding I did not have.
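The "when to refetch" decision can be reduced to a small helper, which is roughly how I eventually came to think about it. This is a minimal sketch, assuming a stale-while-revalidate style cache with a 30-second freshness window (both the shape and the window are assumptions, not Scorafy's actual code):

```typescript
// Client-side cache entry: the server data plus when we fetched it.
interface CacheEntry<T> { value: T; fetchedAt: number }

const STALE_AFTER_MS = 30_000; // assumption: 30s freshness window

// Refetch if we have never fetched, or the cached copy is stale.
function shouldRefetch<T>(
  entry: CacheEntry<T> | undefined,
  now: number
): boolean {
  if (!entry) return true;                         // never fetched
  return now - entry.fetchedAt > STALE_AFTER_MS;   // stale copy
}
```

Writing this function is trivial; knowing that this was the question to ask - rather than refetching on every render, or never - was the hard part.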
Edge cases. The happy path - where users do exactly what you expect - is straightforward to build. But what happens when someone opens two tabs? What happens when a payment webhook arrives before the database write completes? What happens when a user starts an assessment, closes their browser, and comes back three days later? Every edge case is a small puzzle, and there are hundreds of them.
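The webhook race is a good example of how small each of these puzzles is once you see it. Here is a hedged sketch of one fix - park the event if it arrives before the database write, and apply it when the record is created. The class and method names are hypothetical, not Scorafy's implementation:

```typescript
type OrderStatus = "pending" | "paid";

class OrderStore {
  private orders = new Map<string, OrderStatus>();
  private parkedEvents = new Map<string, OrderStatus>();

  createOrder(id: string): void {
    // If the webhook beat us here, apply its status immediately
    // instead of overwriting it with "pending".
    const parked = this.parkedEvents.get(id);
    this.orders.set(id, parked ?? "pending");
    this.parkedEvents.delete(id);
  }

  handleWebhook(id: string, status: OrderStatus): void {
    if (this.orders.has(id)) {
      this.orders.set(id, status);        // normal path: record exists
    } else {
      this.parkedEvents.set(id, status);  // race: webhook arrived first
    }
  }

  statusOf(id: string): OrderStatus | undefined {
    return this.orders.get(id);
  }
}
```

In production you would persist the parked event rather than hold it in memory, but the shape of the solution is the same: make the handler tolerant of arrival order.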
Knowing what you do not know. This is the meta-problem. When you do not have an engineering background, you do not know what questions to ask. I shipped several features that worked perfectly in development and broke in production because I did not know about things like environment variables, CORS policies, or the difference between server-side and client-side rendering. Each of these was a learning moment - but they were also hours of debugging something I did not know existed.
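One cheap defence against the environment-variable class of surprise is to fail loudly at startup rather than silently at runtime. A minimal sketch (the variable names here are assumptions, not Scorafy's real config):

```typescript
// Validate that every required environment variable is present,
// throwing a single descriptive error that lists all the missing ones.
function requireEnv(
  env: Record<string, string | undefined>,
  keys: string[]
): Record<string, string> {
  const missing = keys.filter((k) => !env[k]);
  if (missing.length > 0) {
    throw new Error(`Missing environment variables: ${missing.join(", ")}`);
  }
  return Object.fromEntries(keys.map((k) => [k, env[k] as string]));
}
```

Called once at boot - e.g. `requireEnv(process.env, ["STRIPE_SECRET_KEY", "SUPABASE_URL"])` - this turns "feature mysteriously broken in production" into an error message that names the problem.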
What I Would Do Differently
Start with the data model. I built features first and figured out the database structure as I went. This led to several painful migrations where I had to restructure tables with live data. If I started again, I would spend the first week mapping out every entity, relationship, and data flow before writing a line of code.
Write tests earlier. I still do not write enough tests. But the few times I did set up automated testing, it caught problems before they reached users. "I will add tests later" is a trap I fell into repeatedly.
Ship a smaller v1. Scorafy launched with assessment creation, AI report generation, payment processing, PDF export, custom branding, conditional branching, linked assessments, and more. I could have launched with just assessment creation and AI reports - validated demand - and built everything else based on what users actually asked for.
The Honest Take
Building a SaaS without an engineering background in 2026 is genuinely possible. AI development tools have lowered the barrier from "impossible without years of training" to "possible with persistence and clear thinking." That is a real shift.
But "possible" is not "easy." I spent hundreds of hours on this. I hit walls that took days to get past. I broke things in production. I built features I later threw away. The AI accelerated everything enormously - what might have taken a team of engineers months took me weeks - but it did not eliminate the need to understand what you are building and why.
The real advantage I had was not technical - it was domain expertise. I knew exactly what coaches, educators, and consultants needed because I had been all three. I did not have to guess at the features or validate the market. I was building the tool I wished existed.
If you have deep expertise in a domain and a clear vision for a product, the technical barriers are lower than they have ever been. The question is not whether you can build it. The question is whether you are willing to spend the time learning as you go, tolerating the frustration, and shipping something imperfect.
Scorafy is live at scorafy.com. You can browse the full feature set, or try the demo to see what I built - and judge for yourself whether a non-engineer can ship a real product.