Building a Government Recruitment Platform with AI: The Koru Story

How I used AI-assisted development to build a complete government recruitment platform in record time - from architecture to Azure DevOps pipelines


What happens when you combine modern AI assistance with decades of software engineering experience? You get a government recruitment platform built in a fraction of the time, with enterprise-grade architecture, and comprehensive CI/CD pipelines. This is the story of Koru Recruitment.

Koru (pronounced “koh-roo”) is the Ijaw word for “wait” — a fitting name for a platform where applicants wait for opportunities, and hiring managers wait for the right candidates.

See it live: Koru Recruitment Platform

The Challenge

A government client approached me needing a modern recruitment platform. The requirements were clear:

  • A public job portal accessible to residents, diaspora, and general applicants
  • Complete application lifecycle management
  • Role-based access (Admin, HR Staff, Applicants)
  • Secure document management
  • Email notifications
  • Admin dashboard with real-time metrics
  • Budget constraint: Monthly operating costs under $50

The timeline? As fast as possible while maintaining quality.

Choosing the Stack

With the constraints in mind, I settled on a modern Azure-based architecture:

  • Frontend: Blazor WebAssembly with MudBlazor components
  • Backend: Azure Functions (serverless = cost-effective)
  • Database: Azure SQL Database (Basic tier)
  • Storage: Azure Blob Storage for documents
  • Hosting: Azure Static Web Apps (free tier available!)
  • Infrastructure as Code: Terraform
  • CI/CD: Azure DevOps Pipelines

This stack provides enterprise-grade capabilities while keeping costs minimal. The serverless approach means you only pay for what you use.

Enter GitHub Copilot

This is where things got interesting. Instead of the traditional approach of writing every line of code manually, I paired with GitHub Copilot to accelerate development dramatically.

Architecture in Minutes

I started with a simple prompt describing the platform requirements. Within minutes, I had:

  • A complete Clean Architecture structure
  • Domain models for all entities
  • Repository interfaces
  • Service layer abstractions

The AI understood the patterns I was aiming for and generated code that followed best practices without me having to explicitly state every principle.
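To give a flavour of what came out of that first session, here is a minimal sketch of the layering: a domain entity alongside the repository interface that abstracts persistence away from the domain. The entity and member names here are illustrative, not lifted from the actual Koru codebase.

```csharp
using System;
using System.Collections.Generic;
using System.Threading;
using System.Threading.Tasks;

namespace Koru.Recruitment.Domain.Entities
{
    // A domain entity: plain C#, no persistence concerns.
    public class JobPosting
    {
        public Guid Id { get; set; }
        public string Title { get; set; } = string.Empty;
        public string Description { get; set; } = string.Empty;
        public DateTime ClosingDate { get; set; }
        public bool IsPublished { get; set; }
    }
}

namespace Koru.Recruitment.Domain.Interfaces
{
    using Koru.Recruitment.Domain.Entities;

    // The domain layer defines the contract; the infrastructure layer
    // supplies the EF Core implementation behind it.
    public interface IJobPostingRepository
    {
        Task<JobPosting?> GetByIdAsync(Guid id, CancellationToken ct = default);
        Task<IReadOnlyList<JobPosting>> GetPublishedAsync(CancellationToken ct = default);
        Task AddAsync(JobPosting posting, CancellationToken ct = default);
    }
}
```

The point of the pattern is that the service layer depends only on the interface, so swapping Azure SQL for something else later touches one project, not the whole solution.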

Database Design

The database schema emerged naturally through conversation:

├── Users                    # User accounts and authentication
├── ApplicantProfiles        # Detailed applicant information
├── JobPostings             # Job posting management
├── Applications            # Application tracking
├── Interviews              # Interview scheduling
├── ApplicationDocuments    # Document management
├── ApplicationWorkflows    # Workflow tracking
├── WorkflowStages          # Workflow configuration
└── AuditLogs               # Complete audit trail

Instead of drawing ERDs and then translating to SQL, I described the domain and the AI helped generate both the conceptual model and the Entity Framework Core configurations.
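An EF Core configuration of the kind that came out of those conversations might look like the sketch below. The entities are trimmed to a few fields and the column constraints are illustrative assumptions, not the platform's actual schema.

```csharp
using System;
using System.Collections.Generic;
using Microsoft.EntityFrameworkCore;
using Microsoft.EntityFrameworkCore.Metadata.Builders;

// Minimal illustrative entities; the real model carries many more fields.
public class Application
{
    public Guid Id { get; set; }
    public string Status { get; set; } = "Submitted";
    public Guid ApplicantProfileId { get; set; }
    public ApplicantProfile? ApplicantProfile { get; set; }
    public Guid JobPostingId { get; set; }
    public JobPosting? JobPosting { get; set; }
}

public class ApplicantProfile
{
    public Guid Id { get; set; }
    public List<Application> Applications { get; set; } = new();
}

public class JobPosting
{
    public Guid Id { get; set; }
    public List<Application> Applications { get; set; } = new();
}

// Fluent configuration mapping the Applications table and its
// relationships to ApplicantProfiles and JobPostings.
public class ApplicationConfiguration : IEntityTypeConfiguration<Application>
{
    public void Configure(EntityTypeBuilder<Application> builder)
    {
        builder.ToTable("Applications");
        builder.HasKey(a => a.Id);
        builder.Property(a => a.Status).HasMaxLength(50).IsRequired();

        builder.HasOne(a => a.ApplicantProfile)
               .WithMany(p => p.Applications)
               .HasForeignKey(a => a.ApplicantProfileId);

        builder.HasOne(a => a.JobPosting)
               .WithMany(j => j.Applications)
               .HasForeignKey(a => a.JobPostingId);
    }
}
```

Describing relationships in prose and reviewing the generated fluent configuration turned out to be faster than maintaining a separate ERD.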

The Azure DevOps Pipelines

This is where AI assistance really shone. Creating production-grade Azure DevOps pipelines typically takes days of careful configuration. With AI assistance, I had:

Build Pipeline Features:

  • Multi-stage builds (dotnet, database, client)
  • Parallel execution where possible
  • Artifact publishing
  • Test execution with coverage reports

Deployment Pipeline Features:

  • Template-based architecture for reusability
  • Environment-specific variable management
  • Terraform integration with state management
  • Automatic backend storage creation
  • Health checks and validation at each stage
  • CAF-compliant resource naming

The pipeline follows this flow:

Build → Dev → Prod

    [Infra] → [Database] → [Application] → [Validation]
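The flow above maps to a multi-stage YAML pipeline. This skeleton shows the shape; the stage names, template path, and parameters are assumptions for illustration, not the production configuration.

```yaml
# Illustrative Azure DevOps pipeline skeleton (not the production file).
trigger:
  branches:
    include: [main]

stages:
  - stage: Build
    jobs:
      - job: BuildAndTest
        steps:
          - task: DotNetCoreCLI@2
            inputs:
              command: build
          - task: DotNetCoreCLI@2
            inputs:
              command: test
              arguments: '--collect:"XPlat Code Coverage"'

  # Dev and Prod reuse the same deployment template with
  # environment-specific parameters.
  - stage: DeployDev
    dependsOn: Build
    jobs:
      - template: templates/deploy.yml
        parameters:
          environment: dev

  - stage: DeployProd
    dependsOn: DeployDev
    jobs:
      - template: templates/deploy.yml
        parameters:
          environment: prod
```

The template-based layout is what makes the Dev and Prod stages identical by construction: one file to review, two environments kept in lockstep.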

Infrastructure as Code

Terraform configurations were generated with proper:

  • Azure CAF naming conventions
  • Environment separation
  • State management with Azure Storage backend
  • Secure handling of sensitive variables via Key Vault
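A Terraform fragment in this style is sketched below. The backend storage account, resource names, and region are placeholders following CAF-style conventions, not the actual configuration.

```hcl
# Illustrative fragment; names and region are assumptions.
terraform {
  backend "azurerm" {
    resource_group_name  = "rg-koru-tfstate"
    storage_account_name = "stkorutfstate"
    container_name       = "tfstate"
    key                  = "koru.dev.tfstate"
  }
}

variable "environment" {
  type    = string
  default = "dev"
}

variable "sql_admin_password" {
  type      = string
  sensitive = true # injected from Key Vault by the pipeline, never committed
}

resource "azurerm_resource_group" "main" {
  # CAF naming: <resource-type>-<workload>-<environment>
  name     = "rg-koru-${var.environment}"
  location = "westeurope"
}
```

Keeping state in an Azure Storage backend is what lets the pipeline create the backend container automatically on first run and then plan against shared state on every run after.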

What I Learned

AI Accelerates, But Doesn’t Replace Expertise

The AI could generate code quickly, but it needed guidance. Understanding Clean Architecture, SOLID principles, and Azure best practices allowed me to steer the AI toward optimal solutions. Without that foundation, I’d have accepted suboptimal patterns.

Iteration is Key

The first generated code was rarely perfect. The real power came from iterating:

  1. Generate initial code
  2. Review and identify issues
  3. Provide specific feedback
  4. Refine until correct

This mirrors traditional development, just at 10x speed.

Documentation Comes Free

One unexpected benefit: the AI naturally documented the code as it went. Comments explaining complex logic, README files for each component, and even this blog post were all assisted by AI.

The Human Touch Matters

There were moments where the AI’s suggestions were technically correct but didn’t fit our specific context. Government clients have unique requirements around data sovereignty, accessibility, and compliance that required human judgment to address properly.

The Migration Story

During development, I went through several naming iterations before settling on “Koru”. The project involved migrating from an earlier naming convention to the final Koru.Recruitment namespace. This is typically tedious work:

  • Rename projects
  • Update namespaces
  • Fix all references
  • Update configuration files
  • Modify pipelines

With AI assistance, this multi-hour task became a systematic process where the AI generated migration scripts and identified all affected files.
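The generated scripts boiled down to variations on the pattern below. This is a simplified sketch: "Legacy.Recruitment" is a placeholder for the earlier name, which I haven't reproduced here, and a real run would also handle file and folder renames.

```shell
# Rewrite every source, project, and pipeline file that still
# references the old namespace.
migrate_namespace() {
  local old_ns="$1" new_ns="$2" root="${3:-.}"

  grep -rl --include='*.cs' --include='*.csproj' --include='*.yml' \
       "$old_ns" "$root" | while read -r file; do
    sed -i "s/${old_ns}/${new_ns}/g" "$file"
    echo "migrated: $file"
  done
}

migrate_namespace "Legacy.Recruitment" "Koru.Recruitment" .
```

The AI's real contribution wasn't the sed one-liner; it was enumerating every affected file class (projects, namespaces, configs, pipelines) so nothing was missed.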

Cost Analysis

Running in production:

Component                        Monthly Cost
Azure Static Web Apps            Free
Azure Functions (Consumption)    ~$5
Azure SQL (Basic)                ~$5
Azure Blob Storage               ~$1
Azure Key Vault                  ~$1
Total                            ~$12/month

Well under the $50 budget, with room to scale.

Key Takeaways

  1. AI is a force multiplier: Not a replacement for skill, but an accelerator for those who have it
  2. Architecture still matters: AI generates code faster, but bad architecture is still bad architecture
  3. Infrastructure as Code is essential: AI can generate Terraform just as well as application code
  4. CI/CD pipelines benefit greatly: Template generation and complex YAML structures are natural for AI
  5. Documentation improves: When AI assists, documentation often comes as a byproduct

The CI/CD in Action

Here’s what the Azure DevOps pipelines look like in practice:

Deployment Pipeline Stages: a six-stage deployment — Validate → Infrastructure → Database → Applications → Post-Deployment → Notifications

The deployment pipeline completed in under 20 minutes, with full Terraform infrastructure provisioning, database schema deployment, and application deployment to Azure Static Web Apps.

What’s Next

The platform is now deployed and ready for production use. Future enhancements include:

  • Video interviewing integration
  • Advanced analytics dashboard
  • Mobile application
  • Multi-language support

Each of these will benefit from the same AI-assisted approach, building on the solid foundation we’ve established.


Try it yourself: Koru Recruitment Platform

Want to see the code? The architecture patterns and pipeline templates are available as examples for your own projects. The key is combining modern tooling with solid engineering fundamentals.

Building something similar? I’d love to hear about your experience with AI-assisted development. Drop me a line.