
This article is for anyone considering PoC (Proof of Concept) development who is unsure about "how much it will cost," "how to proceed," or "which company to hire."
This article covers the basics of PoC development, typical costs (500,000–5,000,000 yen), a five-step process, and criteria for selecting an outsourcing partner. The latter half also introduces recent approaches to shortening the validation period using AI and LLMs.
By the end of the article, you should have all the information you need to start PoC development at your own company.
What exactly is a PoC (Proof of Concept), and how does it differ from a prototype or an MVP? This question inevitably arises in new business development and DX promotion initiatives. Let's start by establishing the basic definition, then organize the situations in which a PoC is most effective.
PoC (Proof of Concept) refers to the process of verifying the feasibility of an idea or technology on a small scale. In the context of IT and software development, the core of a PoC is "confirming whether something is technically achievable before committing to full-scale development."
A PoC has three key characteristics:
The purpose of PoC development is to obtain a solid basis for a Go/No-Go decision.
For example, before a large enterprise introduces an AI chatbot, it might test whether sufficient accuracy can be achieved using 100 internal inquiry records — this is a typical PoC approach. If the accuracy meets expectations, development proceeds; if not, alternative approaches are explored. A major advantage of the PoC is that it allows you to determine direction with minimal investment.
PoC, Prototype, and MVP (Minimum Viable Product) are often confused, but they differ in purpose and phase.
| Name | Purpose | Primary Audience | Completeness | Typical Duration |
|---|---|---|---|---|
| PoC | Validate technical and business feasibility | Internal decision-makers | Low (proof-of-operation level) | 1 week – 3 months |
| Prototype | Verify UI, behavior, and user experience | Development team / select users | Medium (appearance close to production) | 1 – 3 months |
| MVP | Deliver value to the market with minimal features | Actual end users | High (release-ready quality) | 3 – 6 months |
Development Phase Overview:
Idea → [PoC: "Can we do it?"] → [Prototype: "Can we use it?"] → [MVP: "Can we sell it?"] → Full Development
Common Points of Confusion:
PoC development is particularly effective "before investing in technologies or concepts where feasibility is uncertain." Below are five representative scenarios.
① Evaluating the Introduction of AI/Machine Learning
Start with a small-scale trial to determine whether an AI model can achieve sufficient accuracy using internal data. It is important to set KPIs in advance, such as "we will proceed if 80% accuracy is achieved," so that Go/No-Go decisions can be made objectively.
Example: A major human resources services company verified a reduction in workload from 20 hours/week to 2 hours/week through a PoC for automated AI analysis reporting, then decided to move to production.
② Validating the Feasibility of Operational Efficiency and Automation
Test how much of your routine work can be automated. Before building a full production system from scratch, run only the core processes to prove their effectiveness with concrete numbers.
Example: A global logistics company confirmed a 70% reduction in processing time through an AI workflow automation PoC before rolling it out company-wide.
③ Verifying Improvements to Educational and Content Experiences
The impact of new learning experiences or content delivery methods cannot be known until measured with actual user data. A PoC allows you to validate hypotheses quickly.
Example: A major educational services company demonstrated an improvement in course completion rates from 45% to 78% through a PoC of an AI tutor feature with a 50-person monitor group.
④ Validating Core Features of New Businesses
Confirm whether the functionality central to a business model is technically achievable. The results can also serve as supporting material for presentations to investors and senior management.
Example: A video streaming startup verified through an AI video generation pipeline PoC that publication lead time could be reduced from 2 weeks to 3 days.
⑤ Confirming ROI for Internal DX Promotion
For new systems with unclear return on investment, start by piloting within a single department or workflow to gather data and build a basis for management approval. Using a PoC as "evidence" to support internal budget proposals is a common practice.
What Happens When PoC Is Skipped:
After investing tens of millions of yen in full-scale development, situations arise such as "it turned out to be technically infeasible" or "the training data was insufficient." A PoC functions as "insurance that minimizes the cost of failure." Test small, learn fast—this is the value of PoC development.

"How much does a PoC actually cost?" This is the first question you'll face once you start considering an order. Honestly, the range spans from 500,000 yen to over 5,000,000 yen depending on the scope of verification and technical complexity. That said, once you understand how an estimate is structured, you'll be able to judge whether a given price is reasonable.
The costs of PoC development can be categorized as follows based on the scope of verification, technical complexity, and duration.
| Scale | Estimated Cost | Estimated Duration | Primary Use Cases |
|---|---|---|---|
| Small | ¥500K–¥1.5M | 1–4 weeks | API integration verification, single-function prototypes |
| Medium | ¥1.5M–¥3M | 1–2 months | AI model accuracy verification, UX prototypes |
| Large | ¥3M–¥5M+ | 2–3 months | Multi-function integration, external system integration |
The figures above are estimates based on projects our company has handled in the past. When AI or machine learning is involved, additional data preparation costs may apply. Furthermore, by leveraging development companies in Thailand or Southeast Asia, cost reductions of approximately 30–50% may be achievable while maintaining the same level of quality.
The main reasons why PoC development estimates vary significantly from project to project come down to the following four factors.
1. Novelty of the Technology
The amount of work required differs depending on whether you use an existing technology stack (React, Python, etc.) or build a new AI/ML model from scratch. PoCs using LLMs (Large Language Models) can require considerable time for prompt design and fine-tuning.
2. Readiness of Validation Data
If internal data is already well-organized, the project can proceed at a lower cost. However, additional costs arise when data collection, cleansing, and labeling are required.
3. Complexity of Validation (Number of Integrations)
Coordination costs increase when the project involves external API integrations, embedding into existing systems, or validation that spans multiple departments.
4. Availability of Ongoing Support
An accompaniment-style contract that includes "results analysis and Go/No-Go decision support" after the PoC is completed costs more than a simple development outsourcing arrangement, but it increases the likelihood of a successful transition to the next phase.
The most common reason PoCs fail is "trying to build a perfect system from the start." The following small-start strategies are effective for improving validation accuracy while keeping costs down.
① Narrow your validation focus to one or two axes
Rather than trying to simultaneously validate "whether it's technically feasible" and "whether users will actually use it," start by focusing solely on technical validation.
② Make full use of no-code tools and LLMs
Leveraging tools such as the OpenAI API, Claude API, and Dify can significantly reduce engineering man-hours. A cost-effective approach to PoC development is to implement 80% of the functionality using no-code solutions and handle the remaining 20% through custom development.
③ Manage budget risk by splitting into phases
By breaking the project into separate orders, such as "Phase 1: Technical Research & Design (¥500,000)" and "Phase 2: Prototype Development (¥1,000,000)," you create natural Go/No-Go decision points that help prevent unnecessary investment.
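As a sketch of point ② above: for many technical validations, a single LLM API call is nearly all the custom code a first PoC needs. The snippet below follows the shape of the public OpenAI Chat Completions API; the model name and system prompt are illustrative assumptions, not recommendations.

```python
import json
import os
import urllib.request

def build_request(question: str, model: str = "gpt-4o-mini") -> dict:
    """Assemble the JSON body for a chat-completion request.
    The model name and system prompt here are placeholder choices."""
    return {
        "model": model,
        "messages": [
            {"role": "system",
             "content": "Answer internal FAQ questions concisely."},
            {"role": "user", "content": question},
        ],
    }

def ask(question: str) -> str:
    """Send the request to the OpenAI Chat Completions endpoint.
    Requires the OPENAI_API_KEY environment variable to be set."""
    req = urllib.request.Request(
        "https://api.openai.com/v1/chat/completions",
        data=json.dumps(build_request(question)).encode(),
        headers={
            "Authorization": f"Bearer {os.environ['OPENAI_API_KEY']}",
            "Content-Type": "application/json",
        },
    )
    with urllib.request.urlopen(req) as resp:
        return json.load(resp)["choices"][0]["message"]["content"]

print(build_request("How do I reset my password?")["model"])  # gpt-4o-mini
```

The point is proportion: with the heavy lifting delegated to the API, the remaining custom development can concentrate on the 20% that is specific to your business.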

The most wasteful pattern in PoC development is when you've "built something vaguely, but in the end aren't really sure what you learned." To reach meaningful conclusions within a limited timeframe and budget, you need to define the structure of your hypothesis validation in advance. Here, we will walk through five steps for actually running a project.
The first step in PoC development is to clearly define "what needs to be proven for it to be considered a success."
Bad example (vague goal): "Check whether an AI chatbot can be used."
Good example (specific goal): "Verify whether the chatbot can answer more than 80% of 100 internal FAQ items accurately."
KPI setting examples:
Key principle: Choose KPIs that are both "measurable in numerical terms" and "serve as Go/No-Go decision criteria."
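The principle above can be turned into a small measurement harness. The sketch below is illustrative only: `evaluate_answer()` is a placeholder that a real chatbot PoC would replace with its actual correctness check (exact match, a rubric, or an LLM-as-judge comparison), and the 80% bar mirrors the good example above.

```python
def evaluate_answer(predicted: str, expected: str) -> bool:
    """Placeholder correctness check; swap in your real metric."""
    return predicted.strip().lower() == expected.strip().lower()

def kpi_accuracy(results: list[tuple[str, str]]) -> float:
    """Fraction of (predicted, expected) answer pairs judged correct."""
    return sum(evaluate_answer(p, e) for p, e in results) / len(results)

# Toy data: 4 of 5 test answers are correct -> 0.8, which meets an 80% bar.
results = [("Tokyo", "Tokyo"), ("Blue", "blue"), ("42", "42"),
           ("Yes", "yes"), ("Maybe", "No")]
accuracy = kpi_accuracy(results)
print(accuracy, accuracy >= 0.80)  # 0.8 True
```

Because the threshold is agreed before measurement, the Go/No-Go decision reduces to a single comparison rather than a debate.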
Once the goal is defined, articulate the hypothesis of "why you believe it can be achieved."
Hypothesis structure:
"[Preconditions] exist, therefore by using [approach], we can achieve [validation goal]"
Example:
"Given that over 200 internal FAQ entries exist, using OpenAI Embeddings + RAG architecture, we can automate 80% of inquiry responses"
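The retrieval half of such a hypothesis can be prototyped in a few dozen lines. The sketch below hand-rolls cosine-similarity search over tiny made-up vectors purely to show the shape of the retrieval step; in the hypothesis above, OpenAI Embeddings would supply the vectors and an LLM would draft the answer from the retrieved FAQ entry.

```python
import math

# Toy FAQ knowledge base. In a real RAG PoC each entry's vector would come
# from an embedding model; these 3-dimensional vectors are hand-made.
FAQ = [
    ("How do I reset my password?", [0.9, 0.1, 0.0]),
    ("What are the office opening hours?", [0.1, 0.9, 0.0]),
    ("How do I submit an expense report?", [0.0, 0.2, 0.9]),
]

def cosine(a: list[float], b: list[float]) -> float:
    """Cosine similarity between two vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    norm = math.sqrt(sum(x * x for x in a)) * math.sqrt(sum(y * y for y in b))
    return dot / norm

def retrieve(query_vec: list[float], k: int = 1) -> list[str]:
    """Return the k FAQ entries most similar to the query vector."""
    ranked = sorted(FAQ, key=lambda item: cosine(query_vec, item[1]),
                    reverse=True)
    return [question for question, _ in ranked[:k]]

# A query whose (toy) embedding lies close to the password-reset entry:
print(retrieve([0.8, 0.2, 0.1]))  # ['How do I reset my password?']
```

If retrieval of this shape already surfaces the right FAQ entry on real embedded data, the hypothesis is well on its way; if not, you learn that before any production investment.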
The validation plan should include the following:
Once the hypothesis is defined, implement only the minimum features necessary for validation. A common pitfall at this phase is aiming for production quality.
It is fine to write PoC code on the assumption that it will be thrown away. UI polish, error handling, and security can all be kept to a minimum.
Key points for technology selection:
Estimated timeline: For a small-scale PoC (single feature), aim to have a working validation build within 1–2 weeks.
Once the prototype is complete, measure it against the KPIs established in Step 1.
Notes on measurement:
Evaluation perspectives:
| Evaluation Axis | What to Check |
|---|---|
| Technical Feasibility | Were the KPIs achieved? |
| Scalability | Can the same performance be achieved in a production environment? |
| Business Value | Is a positive ROI likely? |
| Risk | Are there any legal or security concerns? |
Based on the measurement results, you determine whether to "proceed to full development (Go)," "change direction (Pivot)," or "discontinue (No-Go)."
Decision Criteria:
| Result | Decision | Next Action |
|---|---|---|
| KPI achieved + business value confirmed | Go | Define requirements and secure budget for full development |
| KPI not met, but improvement is foreseeable | Pivot | Revise hypothesis and run another PoC |
| KPI not met + fundamental issues identified | No-Go | Explore alternative approaches |
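The decision table maps naturally to a small helper function. The flag names below are illustrative, not a standard API; the value of writing the criteria down as code (or any equally explicit form) is that the decision is agreed before the results come in.

```python
def poc_decision(kpi_met: bool, business_value_confirmed: bool,
                 fundamental_issue: bool) -> str:
    """Map measurement results to Go / Pivot / No-Go per the criteria table."""
    if kpi_met and business_value_confirmed:
        return "Go"      # define requirements, secure budget for full dev
    if fundamental_issue:
        return "No-Go"   # explore alternative approaches
    return "Pivot"       # revise the hypothesis and run another PoC

# KPI missed, but no fundamental blocker found -> try a revised hypothesis.
print(poc_decision(kpi_met=False, business_value_confirmed=False,
                   fundamental_issue=False))  # Pivot
```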
Key Mindset: A No-Go is not a PoC failure. The knowledge gained from "trying it out" is itself valuable. Being able to fail small before wasting tens of millions of yen in full development is the return on your PoC investment.

The success or failure of PoC development often comes down to compatibility with your partner company. If you choose a partner simply because they were "the cheapest option" or "have an extensive track record," you may find that even if the PoC itself goes smoothly, things start to fall apart when you move into full-scale development. Here, we introduce the key criteria you should check before placing an order.
The first thing to check when selecting a PoC development partner is whether they have experience relevant to your specific challenges.
Only a company that has actually gone through the development process will understand industry-specific regulations, data formats, and business workflows. The simplest way to verify this is to ask prospective partners directly: "Do you have PoC experience in our industry?"
For reference, here is a breakdown of industry-specific results Unimon has delivered:
| Industry | Key Results |
|---|---|
| Manufacturing | Translation & localization automation: 60% reduction in delivery time, 40% reduction in annual costs |
| HR / Staffing | Administrative workload reduced from 40h to 8h/month (80% reduction); analytical report creation reduced from 20h to 2h/week |
| Education | AI tutor implementation: course completion rate improved from 45% to 78% |
| Logistics | Workflow automation: 70% reduction in processing time |
| Accounting | Receipt OCR + LLM: 65% reduction in journal entry time |
| Media / Video | AI video generation pipeline: publication lead time reduced from 2 weeks to 3 days |
Beyond track record, the following points are also worth checking:
Can they handle the full journey from PoC to production? — If a company excels at PoC but hands off production development to another vendor, there is a real risk of issues with code quality and architecture handover.
Is their AI technology stack up to date? — Confirm whether they are equipped for the 2025–2026 technology landscape, including not just the OpenAI API, but also RAG, LLM agents (Mastra / Dify), and asynchronous queues (QStash).
Cost optimization through offshore development — Development companies with bases in Southeast Asia, such as Thailand or Laos, can potentially deliver the same quality at 30–50% lower cost compared to domestic vendors. The small time difference with Japan (approximately 2 hours) is also an advantage, as it minimizes disruption to day-to-day communication.
Be cautious of companies where all past work is under NDA and undisclosable (some cases are legitimate, but full non-disclosure across the board is a red flag), as well as companies that have only system development experience and no actual PoC track record.
PoC development estimates are an area prone to cost disputes due to the "ambiguous definition of deliverables."
Transparency Checklist:
Examples of Flexible Contract Structures:
The PoC success rate varies significantly depending on the support style of the development company.
| Style | Characteristics | Suitable Cases |
|---|---|---|
| Spot-based | Only builds what is instructed | PoCs with a clearly defined scope of technical validation |
| Accompaniment-based | Supports hypothesis design, results analysis, and Go/No-Go decisions | Exploratory PoCs where the problem is ambiguous |
Unimon's Accompaniment Support Structure:
An engineering team based in Bangkok, Thailand, provides end-to-end support covering "problem definition → PoC design → development → evaluation → next phase proposal."
| Structural Strengths | Details |
|---|---|
| Dual-base structure in Thailand and Laos | Senior engineers in Bangkok take the lead, collaborating with cost-competitive Lao engineers for development |
| Cost advantage | PoC development at 30–50% lower cost than domestic development |
| 2-hour time difference | Close to Japan Standard Time, making daily standups and real-time reviews easy to conduct |
| Native Japanese-speaking PM | Consistent Japanese-language support from requirements gathering through to results reporting |
The following outcomes have been achieved in actual accompaniment support projects:
In both cases, Unimon provided seamless, end-to-end support from problem definition through to quantitative validation.

Over the past one to two years, the speed of PoC development has changed dramatically. Validations that previously took three months are now being completed in two to three weeks by integrating generative AI into the development process.
Behind this shift is a concept called "AI-driven development." This does not refer to building systems that incorporate AI, but rather to an approach where AI collaborates throughout the development process itself — including code generation, testing, and documentation. This approach is particularly well-suited to PoC development, and the following benefits can be expected:
| Benefit | Description |
|---|---|
| Shorter delivery times | AI assists with coding, testing, and documentation generation, reducing development effort by 40–60% |
| Cost reduction | Decreased manual work allows more hypotheses to be validated within the same budget |
| Improved quality | AI-powered code reviews and automated test generation enable early bug detection |
| Faster hypothesis validation | The cycle of prompt design → implementation → measurement can be iterated rapidly |
At Unimon, we actively leverage tools such as GitHub Copilot, Claude API, and Cursor in real-world PoC development, achieving both shorter delivery times and cost reduction simultaneously.
At the heart of AI-driven development is the AI-first PoC approach. The greatest advantage of building a PoC with generative AI is the ability to largely skip the mockup → prototype stages.
Traditional Approach (3–4 months):
AI-First PoC Approach (2–4 weeks):
Primary tech stack actually used by Unimon:
| Category | Tools & Frameworks |
|---|---|
| LLM API | OpenAI API, Claude API |
| RAG | pgvector (Supabase) + LangChain |
| AI Agent | Mastra, Dify |
| Async Processing | QStash |
| Frontend | Next.js / TypeScript |
| Backend | Supabase, FastAPI |
Optimizing PoC costs with Thailand & Laos offshore development:
Unimon's dual-base structure in Thailand and Laos makes this AI-first PoC approach even more cost-efficient.
| Comparison Axis | Details |
|---|---|
| Development Cost | 30–50% reduction compared to domestic development |
| Time Difference | Only 2-hour time difference from Japan (real-time collaboration possible) |
| PM Structure | Native Japanese-speaking PM on-site |
| Scalability | Team size flexibly adjusted across dual bases in Thailand and Laos |
Key Point: By deprioritizing UI and error handling and focusing exclusively on getting the "core hypothesis" running as quickly as possible, Go/No-Go decisions can be made within a matter of weeks. In AI-driven development, leveraging AI agent frameworks such as Mastra and Dify enables rapid prototyping of complex workflow automation in a short timeframe.
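As an illustration of the workflow-automation pattern such agent frameworks orchestrate, here is a deliberately simplified, framework-free sketch. The planner stub stands in for the LLM planning step, and the tool names are invented for the example; this is not the Mastra or Dify API.

```python
# Two example "tools" the agent can call. In a real automation PoC these
# would hit internal systems (inventory DB, shipping API, etc.).
def check_inventory(order_id: str) -> str:
    return f"inventory OK for {order_id}"

def create_shipping_instruction(order_id: str) -> str:
    return f"shipping instruction created for {order_id}"

TOOLS = {
    "check_inventory": check_inventory,
    "create_shipping_instruction": create_shipping_instruction,
}

def stub_planner(task: str) -> list[str]:
    """Stands in for the LLM planning step of a real agent framework,
    which would choose tools dynamically based on the task description."""
    return ["check_inventory", "create_shipping_instruction"]

def run_agent(task: str, order_id: str) -> list[str]:
    """Execute the planned tool calls in order and collect their results."""
    return [TOOLS[name](order_id) for name in stub_planner(task)]

print(run_agent("ship order", "A-100"))
# ['inventory OK for A-100', 'shipping instruction created for A-100']
```

A PoC of this shape answers the core question "can the planner reliably pick the right tools on our real tasks?" before any investment in production plumbing.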
At Unimon, we have developed more than 20 PoCs and prototypes. Here we introduce three representative case studies.
Challenge: The existing LMS (Learning Management System) had a high learner dropout rate, reducing the return on investment in training.
PoC Approach:
Tech Stack: Next.js / TypeScript · RAG + pgvector (Supabase) · Mastra (AI Agent)
Result: Course completion rate improved from 45% to 78%. Following the successful PoC, a company-wide rollout was decided.
Challenge: Routine tasks such as shipping instructions, invoice verification, and inventory adjustments were siloed to specific individuals, averaging 40 minutes per task.
PoC Approach:
Tech Stack: Next.js / TypeScript · OpenAI API · Mastra · QStash
Result: Processing time for targeted tasks reduced by 70%. On an annualized basis, the equivalent of 1.5 full-time employees' workload was automated.
Challenge: A large volume of paper and PDF documents were being manually entered as journal entries each month, creating a staffing bottleneck during the closing period.
PoC Approach:
Tech Stack: Next.js / TypeScript · RAG + pgvector (Supabase) · Claude API · Stripe (payment integration)
Result: Journal entry work reduced by 65%. External staffing costs during peak periods were significantly cut.
What these cases share is a "start small, validate with data, then commit to full investment" approach. A PoC is the most rational way to minimize the cost of failure before committing to full-scale development.

Here are three frequently asked questions we actually receive from customers considering PoC development.
Q. How long does PoC development take?
It depends on the verification scope, but one to three months is a general guideline.
| Scope | Estimated Duration |
|---|---|
| Single feature / API verification | 1–4 weeks |
| AI model accuracy verification | 1–2 months |
| Multi-feature integration PoC | 2–3 months |
If it looks like it will exceed three months, pause and check whether the scope has grown too large or whether you are demanding production-level quality at the PoC stage.
Q. What happens if the PoC fails?
"Failure" is also a legitimate outcome. A No-Go conclusion means you have avoided the risk of wasting tens of millions of yen on full-scale development, so the PoC investment has still delivered a meaningful return.
In the results report, it is common practice to clearly articulate "why it did not work" (whether it was a technology issue, a data issue, or a business definition issue), and to organize the direction of approaches that should be tried next.
Q. Can we commission a PoC if we have no engineers in-house?
Yes. In fact, we receive more requests from companies that have no engineers at all.
What you need is the ability to explain "what you are currently struggling with," domain knowledge about your business processes and data, and a decision-maker who can make Go/No-Go decisions participating in meetings. Technical requirements definition and architecture selection are the development company's responsibility.

Let's recap the key points to keep in mind for PoC development.
Define your validation goals upfront — Before you start development, put into words exactly what needs to be proven for a Go decision. A PoC without KPIs will almost certainly go off the rails.
Narrow your scope — The only features a PoC needs are those that address the core hypothesis. UI polish and error handling can be built out during full development.
Don't fear failure — Both Go and No-Go outcomes are valuable results. The true ROI of a PoC investment lies in "the cost of preventing major failures in full development."
Leverage AI-driven development — Combining tools such as GitHub Copilot, Claude API, and Cursor can reduce development effort by 40–60%. Pairing this with an offshore setup can drive costs down even further.
Once you have a clearer picture of how to approach your PoC development, the next step is selecting a partner company. At Unimon, our engineer team based in Bangkok is available from your very first free consultation.
If any of the above applies to you, please don't hesitate to get in touch.
Yusuke Ishihara
Started programming at age 13 with MSX. After graduating from Musashi University, worked on large-scale system development including airline core systems and Japan's first Windows server hosting/VPS infrastructure. Co-founded Site Engine Inc. in 2008. Founded Unimon Inc. in 2010 and Enison Inc. in 2025, leading development of business systems, NLP, and platform solutions. Currently focuses on product development and AI/DX initiatives leveraging generative AI and large language models (LLMs).