← Industries/Local Government
Case Study

From 14-Day Wait Times to Instant Citizen Answers

How the City of Meridian Falls deployed IntelligenceAmplifier.AI to transform citizen services, automate permit processing, and give every city employee instant access to the full breadth of municipal knowledge — while maintaining complete data sovereignty.

74%
Faster citizen inquiry response
58%
Permit processing acceleration
$1.2M
Projected annual savings
16 wks
Kickoff to full production
01
Executive Summary

The Situation at a Glance

The City of Meridian Falls is a mid-size municipality serving 148,000 residents through 12 departments, including Public Works, Planning & Zoning, Code Enforcement, Parks & Recreation, the City Clerk's Office, Finance, and the City Manager's Office. The city employs 620 full-time staff and processes approximately 34,000 citizen service requests, 8,200 permit applications, and 1,100 public records requests annually.

Despite a decade of investment in digital government platforms — online permit portals, a 311 citizen request system, and a redesigned city website — Meridian Falls was drowning in its own institutional knowledge. Ordinances spanning forty years, zoning codes across 23 districts, council resolutions numbering in the thousands, departmental procedures never consolidated, and a permit process that required applicants to navigate requirements scattered across six different document sources. Citizens waited an average of 14 business days for substantive responses to inquiries. Staff spent 40% of their workday searching for information they knew existed but could not locate quickly.

In mid-2024, Meridian Falls engaged arvintech to deploy IntelligenceAmplifier.AI as the city's unified knowledge intelligence layer — a private AI system trained exclusively on the city's own ordinances, policies, procedures, meeting minutes, and operational documents. The deployment took 16 weeks from contract execution to full production. Within 90 days of launch, average citizen response time dropped from 14 days to 3.2 days, permit processing accelerated by 58%, and the city projected $1.2 million in annual operational savings.

This case study documents the technical architecture, data preparation, deployment workflow, and measured outcomes of that engagement.


02
The Challenge

Five Systemic Failures Hiding in Plain Sight

Meridian Falls' operational challenges were not caused by incompetent staff or inadequate technology. They were caused by the exponential growth of institutional knowledge that no human — and no traditional software system — could navigate efficiently. Every department had its own document repositories, its own interpretation of shared ordinances, and its own tribal knowledge that existed only in the heads of senior employees.

14 days

Average Citizen Response Time

Substantive citizen inquiries about permits, zoning, and city services required an average of 14 business days from submission to a complete, accurate response — driven by the need to research answers across fragmented document systems.

40%

Staff Time Lost to Information Search

City employees across all departments spent an estimated 40% of their workday searching for policies, ordinances, procedures, and precedents — information that existed but was scattered across dozens of repositories.

6 sources

Permit Application Confusion

A single permit application required citizens to navigate requirements from up to six different document sources — the municipal code, zoning maps, application checklists, fee schedules, inspection requirements, and applicable variances.

22 days

Public Records Response Time

The City Clerk's Office averaged 22 business days to fulfill public records requests, well above the 10-day statutory target, due to the manual effort of identifying and compiling responsive documents across departments.

9 months

New Employee Effectiveness

New city employees averaged nine months before they could independently answer citizen questions or navigate the city's full institutional knowledge — creating a sustained dependence on senior staff for routine inquiries.

The root cause across all five failures was identical: the city's knowledge was fragmented across systems that didn't communicate with each other. The zoning code lived in one system. The related ordinances lived in another. The council resolutions that amended those ordinances lived in meeting minutes stored somewhere else. The departmental procedures that implemented those ordinances were in yet another location — if they were documented at all.

Staff who had been with the city for twenty years could navigate this complexity through institutional memory. New hires could not. And citizens had no chance. When a resident called to ask whether they could build a detached garage on their property, the answer required cross-referencing the zoning district map, the zoning code setback requirements, recent variance approvals on the same block, and the building permit application checklist. No single system contained the complete answer. The city needed an AI that could.


03
Solution Overview

A Private AI Brain for the Entire Municipality

arvintech proposed IntelligenceAmplifier.AI as a closed-loop, municipally owned AI deployment. The foundational principle: the system would be trained exclusively on Meridian Falls' own documents and would run within infrastructure controlled by the city. No citizen data, no internal communications, and no proprietary municipal documents would ever leave the city's network or be processed by third-party AI providers.

The architecture used Retrieval-Augmented Generation (RAG) — a design pattern where a large language model is paired with a private vector database containing the city's full document corpus. Rather than relying on generic AI knowledge, every response is grounded in the city's actual ordinances, resolutions, policies, and procedures. The AI retrieves and synthesizes — it does not guess.
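The grounding loop described above can be sketched in a few lines of Python. This is a minimal illustration of the RAG pattern, not the production pipeline: `retrieve` uses toy word-overlap scoring as a stand-in for BGE-M3 vector search, and all sample text, citations, and names are invented for the example.

```python
from dataclasses import dataclass

@dataclass
class Chunk:
    text: str
    citation: str
    score: float = 0.0

def retrieve(query: str, corpus: list, top_k: int = 5) -> list:
    """Toy stand-in for vector search: rank chunks by word overlap with the query."""
    q_words = set(query.lower().split())
    for c in corpus:
        c.score = len(q_words & set(c.text.lower().split()))
    return sorted(corpus, key=lambda c: c.score, reverse=True)[:top_k]

def build_prompt(query: str, chunks: list) -> str:
    """Ground the model: answer only from retrieved city documents, with citations."""
    context = "\n\n".join(f"[{c.citation}] {c.text}" for c in chunks)
    return (
        "Answer ONLY from the documents below. Cite section numbers. "
        "If the documents do not contain the answer, say so.\n\n"
        f"Documents:\n{context}\n\nQuestion: {query}"
    )

corpus = [
    Chunk("Detached garages are permitted accessory structures in R-2.", "§14.6.3"),
    Chunk("The annual budget is adopted each June.", "Res. 2023-41"),
]
top = retrieve("Can I build a detached garage?", corpus, top_k=1)
prompt = build_prompt("Can I build a detached garage?", top)
```

The essential property: the model only ever sees city documents plus an instruction to refuse when they are silent. "Retrieves and synthesizes, does not guess" is enforced at the retrieval and prompt layers, not assumed of the model.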

Five primary use cases were scoped for the initial deployment:

  1. Citizen Inquiry Assistant — AI-powered responses to resident questions about permits, zoning, ordinances, and city services
  2. Permit Application Navigator — Guided permit workflows that identify requirements, check zoning compliance, and pre-validate applications
  3. Policy & Ordinance Q&A — Instant answers for city staff across all departments, with source citations to specific code sections
  4. Council Meeting Intelligence — Searchable, AI-summarized meeting minutes with resolution tracking and decision history
  5. Staff Onboarding Knowledge Base — An AI-guided orientation through city procedures, organizational structure, and departmental protocols

A sixth use case — automated public records request triage — was added during the pilot phase after the City Clerk's Office identified it as a high-impact opportunity to reduce their 22-day average response time.


04
Tech Stack

The Complete Technical Architecture

The IntelligenceAmplifier.AI deployment for Meridian Falls is a multi-layer system designed for municipal-grade security, public-facing reliability, and seamless integration with the city's existing GIS, permitting, and records management platforms.

1. AI & Language Model Layer

LLM Engine
Private Llama 3.1 70B (quantized, on-premise)
Open-weight model enables full on-premise deployment within the municipal data center. No city data leaves city-controlled infrastructure. Quantized to 4-bit for GPU efficiency.
Embedding Model
BGE-M3 (BAAI) — multilingual dense retrieval
High-quality embeddings for legal and regulatory language. Handles the formal, cross-referencing style of municipal ordinances and resolutions.
RAG Framework
LangChain + custom municipal retrieval pipeline
Custom pipeline includes cross-reference resolution for ordinance citations, temporal awareness for superseded policies, and multi-hop retrieval for complex zoning queries.
Inference Server
vLLM with PagedAttention
Handles 30+ simultaneous queries during peak citizen usage. Sub-3-second response time for the public-facing citizen assistant.

2. Vector Database & Retrieval

Vector Store
Weaviate (self-hosted, municipal data center)
Stores document embeddings with rich metadata: document type, department, effective date, superseded status, and zoning district applicability. Enables precise filtered retrieval.
Chunking Strategy
Hierarchical semantic chunking — section-aware
Municipal codes have deep section hierarchies (Chapter → Article → Section → Subsection). Custom chunking preserves these boundaries and maintains parent-child context.
Reranking
Cohere Rerank (self-hosted) — cross-encoder
Reranks top-20 retrieved chunks to top-5 before LLM context injection. Critical for distinguishing between similar ordinance sections that apply to different zoning districts.
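The section-aware chunking strategy above can be illustrated with a short sketch. The heading regex and the three-level Chapter/Article/Section hierarchy are simplifications of a real municipal code, and the sample text is invented.

```python
import re

# Illustrative heading pattern; real municipal codes vary widely (assumption).
HEADING = re.compile(r"^(Chapter|Article|Section)\s+([\d.]+)\s*[-—:]?\s*(.*)$")
LEVEL = {"Chapter": 0, "Article": 1, "Section": 2}

def chunk_by_section(code_text: str) -> list:
    """Split a code document at Chapter/Article/Section headings, keeping the
    full heading path with each chunk so parent context survives retrieval."""
    chunks, path, buf = [], [], []

    def flush():
        if buf:
            chunks.append({"path": " > ".join(path), "text": "\n".join(buf).strip()})
            buf.clear()

    for line in code_text.splitlines():
        m = HEADING.match(line.strip())
        if m:
            flush()
            del path[LEVEL[m.group(1)]:]   # pop levels at or below this heading
            path.append(f"{m.group(1)} {m.group(2)} {m.group(3)}".strip())
        else:
            buf.append(line)
    flush()
    return chunks

sample = """Chapter 14 - Zoning
Article 14.6 - Accessory Structures
Section 14.6.3 - Detached Garages
Garages up to 720 sq ft are permitted.
Section 14.6.4 - Sheds
Sheds under 200 sq ft need no permit."""
chunks = chunk_by_section(sample)
```

Each chunk carries its full ancestry ("Chapter 14 Zoning > Article 14.6 Accessory Structures > ..."), which is what lets the retriever distinguish similar-sounding sections in different parts of the code.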

3. Document Ingestion Pipeline

PDF Extraction
Apache Tika + custom OCR (Tesseract 5)
Municipal documents include scanned council resolutions from the 1980s, signed permits, and legacy PDF exports. Multi-engine extraction handles all formats reliably.
Document Classification
Fine-tuned DistilBERT classifier (18 categories)
Automatically tags documents as ordinance, resolution, policy, procedure, permit form, meeting minutes, budget document, etc. Enables role-based and category-based retrieval.
Cross-Reference Resolution
Custom NLP pipeline — regex + dependency parsing
Identifies and resolves internal citations ("per Section 12.4.2(b)") by linking referenced sections into the embedding context. Resolved 4,847 cross-references across the municipal code.
Update Pipeline
Apache Airflow — post-council-meeting sync
After every council meeting, newly adopted ordinances and resolutions are ingested, superseded documents are flagged, and the vector database is updated within 48 hours.
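A minimal sketch of the cross-reference step described above: find citation strings, look up the referenced section, and append its text before embedding. The regex and the `code_index` mapping are illustrative assumptions; the production pipeline also used dependency parsing, which is omitted here.

```python
import re

# Matches citations like "Section 12.4.2", "Section 12.4.2(b)(iii)", "§14.6.3".
CITATION = re.compile(r"(?:Section\s+|§\s*)(\d+(?:\.\d+)*(?:\([a-z]+\))*)")

def resolve_cross_references(chunk_text: str, code_index: dict) -> str:
    """Append the text of every section a chunk cites, so the embedding
    carries the full context rather than a bare pointer."""
    resolved = chunk_text
    for ref in CITATION.findall(chunk_text):
        target = code_index.get(ref)
        if target:
            resolved += f"\n[Referenced {ref}]: {target}"
    return resolved

# Invented section text for illustration.
code_index = {"12.4.2(b)": "Setbacks for accessory structures shall be 5 feet."}
out = resolve_cross_references(
    "Garages must comply with setbacks per Section 12.4.2(b).", code_index
)
```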

4. Integration & Infrastructure

GIS Integration
Esri ArcGIS REST API
Allows the AI to resolve address-based queries by identifying the zoning district, overlay zones, and applicable regulations for a specific parcel — enabling "Can I build X at 123 Main St?" queries.
Permit System
Tyler Technologies EnerGov API bridge
Connects the AI to the city's existing permit management system. The AI can check permit status, identify required inspections, and pre-validate application completeness.
Authentication
SAML 2.0 via Azure AD (city SSO) + anonymous citizen tier
Staff access via existing city credentials with RBAC. Citizens use the public assistant without authentication. Rate limiting and abuse detection protect the public endpoint.
Compute
2× NVIDIA A100 80GB (municipal data center)
Dedicated GPU cluster provisioned by arvintech, physically located in the city's secure data center. Under city physical and logical control.
UI Layer
Next.js 14 — embedded in city website + intranet
Public citizen assistant embedded on the city website. Internal staff portal deployed on the city intranet. Both share the same AI backend with role-appropriate access controls.
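The address-to-zoning flow that powers "Can I build X at 123 Main St?" queries can be sketched end to end. Everything here is illustrative: the address regex is simplified, and `fake_gis` stands in for the Esri ArcGIS REST call, whose real request and response schema are not shown in this document.

```python
import re
from typing import Optional

# Simplified street-address pattern; real address parsing is messier (assumption).
ADDRESS = re.compile(
    r"\b\d{1,5}\s+[A-Z][a-zA-Z]*\s+(?:St|Street|Ave|Avenue|Rd|Road|Dr|Drive|Ln|Lane|Blvd)\b"
)

def extract_address(query: str) -> Optional[str]:
    """Detect an address in a citizen query to trigger the GIS lookup."""
    m = ADDRESS.search(query)
    return m.group(0) if m else None

def retrieval_filters(parcel: dict, tier: str) -> dict:
    """Build metadata filters: active documents only, scoped to the caller's
    access tier and, when the parcel is known, its zoning district."""
    f = {"superseded": False, "access_tier": tier}
    if parcel.get("zoning_district"):
        f["zoning_district"] = parcel["zoning_district"]
    return f

# Stand-in for the ArcGIS parcel lookup; the real response schema differs.
def fake_gis(address: str) -> dict:
    return {"parcel_id": "MF-2847-003", "zoning_district": "R-2"}

addr = extract_address("Can I build a fence at 456 Oak Ave?")
filters = retrieval_filters(fake_gis(addr), tier="citizen")
```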

A critical architectural decision was the hybrid deployment model. The vector database and all document processing run on city-controlled infrastructure hosted in the municipal data center managed by Meridian Falls' IT department. The LLM inference layer runs on a dedicated GPU server provisioned by arvintech, physically located in the same data center. A private cloud failover on Azure Government provides high availability during peak citizen usage — typically Monday mornings and the days following council meetings.

All communication between system components is encrypted with TLS 1.3. Staff access is authenticated via SAML 2.0 single sign-on integrated with the city's Active Directory. The public-facing citizen assistant operates through the city's existing website with rate limiting and abuse detection. No citizen personally identifiable information is stored in the AI layer.


05
AI Preparation

Ten Weeks of Groundwork Before a Single Query

Municipal document environments are among the most challenging for AI preparation. Unlike a corporation with a single document management system, a city government accumulates documents across decades, across administrations, and across departments that often operate with minimal coordination. Ordinances reference other ordinances that reference resolutions from twenty years ago. Meridian Falls was no exception. arvintech ran a structured ten-week preparation phase before any AI model was trained or tested.

1
Week 1–2

Municipal Document Audit & Inventory

arvintech conducted a comprehensive audit of all city document repositories: the Laserfiche records management system, SharePoint departmental sites, the Municode online code library, the City Clerk's resolution archive, and legacy shared drives. The audit identified 21,340 documents across 18 document types spanning four decades.

  • 21,340 total documents inventoried across 8 source systems
  • 18 distinct document categories mapped to departments and user roles
  • 4,120 documents flagged as superseded or expired and excluded from the active corpus
  • 2,847 council resolutions spanning 1984–2024 requiring temporal indexing
2
Week 3

Data Quality Assessment & Legal Review

Each document category was evaluated for extractability, completeness, currency, and legal status. The City Attorney's Office reviewed the approach to ensure the AI would not misrepresent superseded ordinances as current law — a critical requirement for any government AI deployment.

  • 19% of documents required remediation before ingestion
  • 12% were scanned legacy documents requiring OCR processing
  • 7% had inconsistent section numbering requiring normalization
  • City Attorney sign-off obtained on training corpus scope and AI disclaimer language
3
Week 4–5

Cross-Reference Resolution & Temporal Indexing

Municipal codes reference themselves extensively. "Subject to Section 12.4.2(b)(iii)" appears in one section but requires the AI to understand the referenced section to give a useful answer. arvintech built a custom pipeline to resolve these cross-references and create temporal metadata — marking which documents supersede which, and when provisions took effect.

  • 4,847 internal cross-references identified and resolved across the municipal code
  • 1,203 ordinance amendments linked to their parent ordinances with effective dates
  • 342 superseded provisions flagged with replacement references
  • Temporal metadata enables the AI to answer "What was the setback requirement before 2019?" accurately
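The temporal indexing described in this step enables point-in-time answers. A minimal sketch, with invented field names and setback values, of how effective-date and superseded metadata select the version of a provision in force on a given date:

```python
from datetime import date
from typing import Optional

# Invented example: a section amended in 2019, stored as two versions.
provisions = [
    {"section": "14.4.2", "text": "Rear setback: 10 feet.",
     "effective": date(2004, 3, 1), "superseded_on": date(2019, 7, 1)},
    {"section": "14.4.2", "text": "Rear setback: 5 feet.",
     "effective": date(2019, 7, 1), "superseded_on": None},
]

def provision_as_of(section: str, when: date, corpus: list) -> Optional[dict]:
    """Return the version of a section that was in force on a given date."""
    for p in corpus:
        in_force = p["effective"] <= when and (
            p["superseded_on"] is None or when < p["superseded_on"])
        if p["section"] == section and in_force:
            return p
    return None

before = provision_as_of("14.4.2", date(2018, 6, 1), provisions)   # pre-2019 rule
current = provision_as_of("14.4.2", date(2024, 1, 1), provisions)  # current rule
```

This is the mechanism behind answering "What was the setback requirement before 2019?" without misrepresenting a superseded provision as current law.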
4
Week 6–7

Document Processing & Embedding

Approved documents were processed through the ingestion pipeline: text extraction, hierarchical semantic chunking (preserving the Chapter → Article → Section structure of municipal codes), embedding generation, and vector database indexing.

  • 17,220 documents approved for ingestion after audit and legal review
  • 243,680 vector embeddings generated across all document chunks
  • Average document processing time: 3.8 seconds per document
  • Total ingestion pipeline runtime: 18.2 hours (weekend batch)
5
Week 8–9

Retrieval Quality Testing

A test suite of 320 queries was developed collaboratively with department heads, the City Clerk, planning staff, and the City Manager's Office — representing real questions that citizens and staff ask. Each query was evaluated for retrieval precision, answer accuracy, citation correctness, and response to superseded provisions.

  • 320 test queries across 6 use case domains
  • Initial retrieval precision: 68% (target: 90%+)
  • Identified 4 document categories with poor chunk boundaries — re-chunked with section-aware splitting
  • Identified 3 terminology gaps — added municipal abbreviation glossary (PUD, CUP, ROW, BMP, etc.)
6
Week 10

Tuning, Prompt Engineering & Re-testing

Based on test results, arvintech refined the retrieval pipeline (adding zoning-district-aware metadata filtering, improving cross-reference context injection), optimized system prompts for each use case, and re-tested the full query set. Final retrieval precision reached 92.1% before production go-live.

  • Retrieval precision improved from 68% to 92.1% through pipeline tuning
  • Use-case-specific system prompts written for 6 workflow types
  • Response latency optimized: P95 latency reduced from 7.8s to 2.9s
  • Department head sign-off obtained on answer quality across all test domains

The municipal document challenge: Government documents are uniquely difficult for AI preparation because they are written to be legally precise, not semantically clear. An ordinance that says “structures as defined in Section 12.4.2(b)(iii)” means nothing to an AI without the referenced section. Our cross-reference resolution pipeline resolved 4,847 internal citations before embedding — ensuring the AI understands the full context of every provision, not just the text of a single section in isolation.


06
AI Workflow

How Every Query Flows Through the System

Understanding the workflow architecture is critical to understanding why IntelligenceAmplifier.AI produces reliable, citation-backed answers rather than generic responses. The system uses a Retrieval-Augmented Generation (RAG) pipeline with a six-stage processing flow for every query — whether from a citizen on the website or a city employee on the internal portal.

1

Query Intake & Access Tier Detection

A query arrives through either the public citizen portal or the internal staff interface. The system determines the access tier — citizen queries are limited to public documents (ordinances, resolutions, permit information, meeting minutes), while staff queries include internal procedures, draft policies, and departmental documents.

Source endpoint detected → access tier assigned → document domain filter applied → query passed to retrieval pipeline

2

Query Decomposition & GIS Resolution

Complex queries are decomposed into sub-questions. Address-based queries ("Can I build a fence at 456 Oak Ave?") trigger a GIS lookup to identify the parcel's zoning district, then decompose into: (1) fence regulations for that zoning district, (2) applicable setback requirements, and (3) permit application checklist.

LLM sub-query generation → address detected? → Esri ArcGIS parcel lookup → zoning district resolved → 1–4 parallel retrieval paths

3

Semantic Retrieval with Metadata Filtering

Each sub-query is converted to a vector embedding and searches the Weaviate database. Retrieval is filtered by: document status (active, not superseded), access tier (citizen vs. staff), and where applicable, zoning district. Hybrid search combines semantic meaning with keyword matching for legal terminology.

BGE-M3 embedding → Weaviate hybrid query (alpha=0.65 semantic / 0.35 keyword) → metadata filters applied → top-20 chunks returned
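The alpha weighting in the hybrid query can be shown as a toy calculation. Weaviate's actual score fusion normalizes scores differently; this only illustrates how alpha=0.65 tilts ranking toward semantic similarity while still crediting exact keyword matches. All scores and IDs are invented.

```python
def hybrid_score(semantic: float, keyword: float, alpha: float = 0.65) -> float:
    """Blend normalized semantic and keyword scores; alpha weights semantics."""
    return alpha * semantic + (1 - alpha) * keyword

def rank_hybrid(chunks: list, alpha: float = 0.65) -> list:
    return sorted(
        chunks,
        key=lambda c: hybrid_score(c["semantic"], c["keyword"], alpha),
        reverse=True,
    )

chunks = [
    {"id": "§14.6.3", "semantic": 0.91, "keyword": 0.40},  # semantically close
    {"id": "§14.6.4", "semantic": 0.55, "keyword": 0.95},  # exact-term match
]
ranked = rank_hybrid(chunks)
```

With alpha=0.65 the semantically closer section wins (0.732 vs. 0.690); at a lower alpha, the exact-term match would win instead. That tradeoff matters for legal terminology, where exact section numbers and defined terms must not be drowned out by paraphrase.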

4

Reranking with Cross-Reference Context

Top-20 chunks are reranked by a cross-encoder model. Chunks containing cross-references have their referenced sections automatically injected into the reranking context — so a chunk saying "per Section 12.4.2(b)" is evaluated with the full text of that referenced section.

Cohere cross-encoder → cross-reference expansion → chunks scored 0–1 → top-5 selected → full citation chain preserved
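The reference-expansion step before reranking can be sketched as follows. The toy `scorer` stands in for the cross-encoder, and the chunk fields and `code_index` entries are invented for the example.

```python
def expand_for_rerank(chunk: dict, code_index: dict) -> str:
    """Give the reranker the full text of any section the chunk cites,
    so 'per Section 12.4.2(b)' is scored with what that section says."""
    text = chunk["text"]
    for ref in chunk.get("references", []):
        text += " " + code_index.get(ref, "")
    return text

def rerank(chunks: list, code_index: dict, scorer, top_n: int = 5) -> list:
    """Score the expanded text of each chunk and keep the best top_n."""
    scored = [(scorer(expand_for_rerank(c, code_index)), c) for c in chunks]
    return [c for _, c in sorted(scored, key=lambda t: t[0], reverse=True)[:top_n]]

code_index = {"12.4.2(b)": "rear setback five feet"}
scorer = lambda text: text.count("setback")   # toy cross-encoder stand-in
chunks = [
    {"text": "Fences: see fee schedule.", "references": []},
    {"text": "Garages comply per Section 12.4.2(b).", "references": ["12.4.2(b)"]},
]
top = rerank(chunks, code_index, scorer, top_n=1)
```

Without expansion, the citing chunk would score zero on the setback query even though it is the correct answer; with expansion, the reranker sees the resolved citation chain.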

5

Response Generation with Citation

Top-5 chunks are injected into the LLM context with a role-specific system prompt. The citizen prompt emphasizes plain-language explanations with code citations. The staff prompt provides full legal references and procedural detail. The AI is instructed to explicitly state when a question cannot be answered from available documents.

System prompt + retrieved context + user query → Llama 3.1 70B inference → streamed response → inline citations to specific code sections

6

Output, Disclaimer & Audit Logging

The response is delivered with inline citations linking to source documents. Citizen responses include a standard disclaimer that AI-generated answers are informational and official determinations require staff review. Every interaction is logged for audit and quality monitoring purposes.

Response + citations + disclaimer → UI rendering → audit log (timestamp, user_type, query_category, doc_sources) → citizen queries anonymized
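The logging policy (citizen queries anonymized, staff queries identified) can be sketched as a single record builder. The field names follow the log fields listed above; the sample identifiers are invented.

```python
from datetime import datetime, timezone
from typing import Optional

def audit_record(user_type: str, user_id: Optional[str],
                 query_category: str, doc_sources: list) -> dict:
    """Build one audit-log entry; drop identity for citizen queries."""
    return {
        "timestamp": datetime.now(timezone.utc).isoformat(),
        "user_type": user_type,
        # Staff queries keep identity for audit; citizen queries are anonymized.
        "user_id": user_id if user_type == "staff" else None,
        "query_category": query_category,
        "doc_sources": doc_sources,
    }

citizen_log = audit_record("citizen", "ip-192.0.2.7", "permits", ["§14.6.3"])
staff_log = audit_record("staff", "jdoe@meridianfalls.gov", "zoning", ["§14.4.2"])
```

The key design choice: anonymization happens at record-construction time, so identifying data for citizens never reaches the append-only audit store in the first place.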

Citizen Permit Inquiry Workflow: End-to-End

The highest-volume workflow is the citizen permit inquiry — a resident asking whether they can do something, what permits they need, and how to apply. Here is a detailed trace of how a typical citizen interaction flows through the system:

Citizen Permit Inquiry — Workflow Trace
1
Citizen
Visits the City of Meridian Falls website and opens the "Ask the City" assistant. Types: "I want to build a detached garage at 789 Elm Street. What do I need?"
2
AI System
Detects address in query. Calls the Esri ArcGIS API to resolve 789 Elm Street to Parcel ID MF-2847-003, Zoning District R-2 (Single Family Residential), no overlay districts, no historic designation.
0.4 seconds — GIS parcel resolution
3
AI System
Decomposes the query into three retrieval paths: (1) accessory structure regulations for R-2 zoning, (2) building permit requirements for detached garages, and (3) setback and lot coverage limits for the R-2 district. Retrieves and reranks relevant ordinance sections and permit checklists.
1.8 seconds — retrieval and reranking
4
AI System
Generates a comprehensive response: detached garages are permitted in R-2 as an accessory structure (Ordinance §14.6.3), maximum size is 720 sq ft or 40% of the primary structure footprint (whichever is smaller), rear setback minimum is 5 feet (§14.4.2), a building permit is required (Application Form BP-103), estimated fee is $485 based on the current fee schedule, and two inspections are required (foundation and final). Includes direct links to the applicable ordinance sections and the permit application form.
2.3 seconds — response generation with citations
5
Citizen
Reads the response, clicks through to the permit application form, and follows up: "Do I need a site plan?"
6
AI System
Retrieves the site plan requirements from §14.8.1 and the BP-103 application checklist. Responds: Yes, a site plan drawn to scale is required showing the proposed garage location, dimensions, setback distances from all property lines, and the location of existing structures. The site plan does not need to be professionally prepared for residential accessory structures under 1,000 sq ft.
1.6 seconds — follow-up response with context retention

Council Meeting Intelligence Workflow

The council meeting intelligence use case operates on a batch-plus-interactive model. After each council meeting, the AI processes the full meeting minutes, identifies motions and votes, links resolutions to affected ordinances, and generates structured summaries. City staff and council members can then query the system conversationally — “What did council decide about the downtown parking ordinance in the last six months?” — and receive a chronological summary with citations to specific meeting dates and resolution numbers.

This workflow replaced a manual process where the City Clerk's Office would receive staff requests for “what did council say about X” and spend hours searching through meeting minutes PDFs. Average response time dropped from 2.4 days to 12 seconds.

Public Records Request Triage Workflow

Public records requests are a significant operational burden for municipalities. Meridian Falls received approximately 1,100 requests annually, each requiring staff to identify responsive documents, review for exemptions, redact sensitive information, and compile the response package. The AI workflow for this use case:

  1. Request is submitted through the city's records portal and automatically parsed by the AI to identify the scope and document categories
  2. AI searches the knowledge base and identifies potentially responsive documents, ranked by relevance
  3. System flags documents likely to contain exemptions (personnel records, litigation materials, draft documents) for staff review
  4. Staff reviews the AI-compiled document package, confirms or modifies the selection, and approves release
  5. Response is generated with a cover letter citing applicable public records statutes
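Step 3 of the triage flow (exemption flagging) can be sketched with simple keyword cues. The cue lists and document samples are invented for illustration; real exemption determinations follow state statute, and flagged documents always go to staff review.

```python
# Illustrative exemption cues; the real rules come from the state records statute.
EXEMPTION_CUES = {
    "personnel": ["personnel file", "performance review", "disciplinary"],
    "litigation": ["attorney-client", "pending litigation", "settlement"],
    "draft": ["draft", "preliminary", "not for release"],
}

def triage(documents: list) -> dict:
    """Split candidate documents into releasable vs. flagged-for-staff-review."""
    flagged, releasable = [], []
    for doc in documents:
        text = doc["text"].lower()
        hits = [cat for cat, cues in EXEMPTION_CUES.items()
                if any(cue in text for cue in cues)]
        (flagged if hits else releasable).append({**doc, "exemption_flags": hits})
    return {"flagged": flagged, "releasable": releasable}

result = triage([
    {"id": "DOC-1", "text": "Council resolution adopting the fee schedule."},
    {"id": "DOC-2", "text": "Draft settlement terms, attorney-client privileged."},
])
```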

Average time from request to compiled document package: 4.2 hours (AI processing) plus staff review. Previously averaged 18 business days of cumulative staff time. Public records response compliance rate improved from 71% on-time to 94% on-time within the first quarter.


07
Implementation Timeline

16 Weeks from Kickoff to Production

Week 1
Project Kickoff & Stakeholder Alignment
Engaged City Manager, department heads, City Attorney, IT Director, and City Clerk. Established use case priority, success metrics, data access protocols, and public-facing AI disclaimer language.
Week 2–3
Municipal Document Audit
The arvintech team embedded with city IT and the City Clerk to inventory all document repositories. 21,340 documents identified across 8 systems spanning four decades.
Week 4–5
Data Quality, Legal Review & Cross-Reference Resolution
Document remediation, OCR processing of legacy scans, City Attorney review, and cross-reference pipeline construction. 4,847 internal citations resolved.
Week 6
Infrastructure Deployment
GPU cluster provisioned in municipal data center. Weaviate, vLLM, and supporting services deployed. Network security and penetration testing completed.
Week 7–8
Document Ingestion & Embedding
243,680 embeddings generated from 17,220 approved documents. Vector database indexed. Post-council-meeting sync pipeline activated.
Week 9
Alpha Testing with Department Champions
15 department champions (3 per primary use case) ran structured testing. 320 test queries evaluated. Retrieval precision: 68%. Zoning terminology gaps identified.
Week 10
Pipeline Tuning & Prompt Engineering
Section-aware re-chunking, metadata filtering, cross-reference context injection, and system prompts refined. Retrieval precision improved to 92.1%.
Week 11–12
Pilot — Internal Staff Only
Full deployment to Planning & Zoning, City Clerk, and Public Works (94 staff). Real-world feedback gathered. Public records triage use case added based on Clerk feedback.
Week 13–14
Expanded Rollout — All Staff + Public Citizen Assistant
System opened to all 620 city employees. Citizen-facing assistant launched on city website with disclaimer and feedback mechanism. Local press briefing held.
Week 15–16
Stabilization & Handover
System monitoring handed to city IT. arvintech ongoing support SLA activated. Baseline metrics collection completed. City Council presentation delivered. Project formally closed.

08
Data Governance & Security

Data Sovereignty by Architecture, Not Policy

Municipal AI deployments operate under a unique trust requirement: citizens must be able to trust that their government is not sending municipal data to third-party AI companies, that their personal information is not being used to train commercial models, and that the AI's responses reflect the city's actual laws and policies — not general internet knowledge. Meridian Falls' deployment was architecturally designed to satisfy all three requirements by default.

Zero External Data Transmission
All LLM inference runs on the municipal data center GPU cluster. No document content, citizen queries, or city data is ever sent to external AI providers.
Municipal Data Sovereignty
The entire AI system — models, databases, logs — runs on city-owned or city-controlled infrastructure. The city retains full ownership and control of all data and all AI outputs.
AES-256 Encryption at Rest
All vector embeddings, document metadata, and system logs encrypted at rest. Encryption keys managed by the city's existing IT key management infrastructure.
TLS 1.3 in Transit
All inter-service communication encrypted with TLS 1.3. Public-facing citizen endpoint served over HTTPS with certificate pinning.
Citizen Anonymity by Design
No personally identifiable information is collected from citizens using the public assistant. No cookies, no tracking, no login required. Queries are logged anonymously for quality monitoring only.
Role-Based Access Control
Staff access segmented by department and role at the retrieval layer. Code Enforcement sees enforcement-related documents. Finance sees budget documents. RBAC enforced via SAML attributes from Active Directory.
Public Records Law Compliance
AI interaction logs are structured to comply with the state's public records retention schedule. System designed in consultation with the City Attorney to ensure all AI-related records are properly classified and retained.
Section 508 / WCAG 2.1 AA Accessibility
The citizen-facing interface meets Section 508 and WCAG 2.1 AA accessibility standards, including screen reader compatibility, keyboard navigation, and color contrast compliance.
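The citizen/staff access tiering and department scoping described in this section can be sketched as a category-scope lookup keyed on SAML attributes. The department names, attribute key, and category labels are illustrative assumptions, not the city's actual schema.

```python
# Illustrative mapping from SAML department attribute to retrievable categories.
DEPARTMENT_SCOPES = {
    "code_enforcement": {"ordinance", "enforcement_procedure", "case_precedent"},
    "finance": {"ordinance", "budget", "fee_schedule"},
}
# Anonymous citizen tier: public documents only.
PUBLIC_SCOPES = {"ordinance", "resolution", "meeting_minutes", "permit_form"}

def allowed_categories(saml_attrs) -> set:
    """Resolve retrieval scope: citizens get public documents only;
    staff get public documents plus their department's categories."""
    if saml_attrs is None:                        # no SSO session = citizen tier
        return set(PUBLIC_SCOPES)
    dept = saml_attrs.get("department", "")
    return PUBLIC_SCOPES | DEPARTMENT_SCOPES.get(dept, set())

citizen = allowed_categories(None)
finance = allowed_categories({"department": "finance"})
```

Enforcing this at the retrieval layer (as a metadata filter on every vector query) means an out-of-scope document can never reach the LLM context, regardless of how a prompt is phrased.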

A formal data governance review was conducted by the City Attorney's Office in partnership with arvintech prior to go-live. The review covered public records law compliance, data retention requirements, citizen privacy protections, and accessibility standards under Section 508 and WCAG 2.1 AA. The citizen-facing assistant includes a visible disclaimer that responses are AI-generated summaries and that official determinations require staff confirmation.

All AI interactions — both citizen-facing and internal — are logged with timestamp, user type (citizen or staff), query category, and document sources retrieved. Citizen queries are anonymized. Staff queries include user identity for audit purposes. Logs are retained per the city's records retention schedule and stored in an immutable, append-only audit trail.


09
Results & Outcomes

Measured Outcomes at 90 Days

Meridian Falls established a measurement framework at project kickoff to capture baseline metrics across all five use cases. The following outcomes were measured at the 90-day post-deployment mark using the same methodology as the baseline assessment.

74%
Faster Citizen Response Time
Average substantive response time dropped from 14 business days to 3.2 business days. For inquiries answered entirely by the citizen AI assistant, response time is under 5 seconds.
58%
Permit Processing Acceleration
Average permit application review time reduced from 23 days to 9.7 days, driven by AI pre-validation that catches incomplete applications before they enter the review queue.
94%
Public Records On-Time Compliance
Public records response compliance with the 10-day statutory target improved from 71% to 94%. AI-assisted document identification reduced the manual search phase by 82%.
12 sec
Council Decision Retrieval
Staff can now retrieve any council decision, motion, or vote from four decades of meeting minutes in an average of 12 seconds. Previously required 2.4 days of Clerk research.
91%
Staff Satisfaction Score
91% of city employees rated IntelligenceAmplifier.AI as "very useful" or "indispensable" in the 90-day survey. Adoption rate reached 84% of all city staff.
$1.2M
Projected Annual Savings
Blended calculation of reduced citizen inquiry handling time, faster permit processing, accelerated public records fulfillment, and decreased new-employee ramp time across all 620 staff.

Qualitative Feedback

Beyond quantitative metrics, the City Manager's Office conducted structured interviews with 42 staff members and reviewed 1,200 citizen feedback submissions collected through the AI assistant's built-in feedback mechanism at the 60-day mark.

  • "I used to spend half my morning looking up zoning requirements for residents who called in. Now I pull up the AI, type their address, and have the complete answer — with the exact ordinance section — in seconds. I can actually help more people in a day."
    Senior Planner, Planning & Zoning Department
  • "The council meeting search alone justified the entire project for my office. Council members used to call and ask 'what did we decide about X' and it would take me a day to find the answer. Now it takes seconds. My office went from reactive to proactive."
    City Clerk, City of Meridian Falls
  • "I was hired three months ago and I feel like I know as much about this city's policies as people who have been here twenty years. The AI doesn't just give me answers — it teaches me where things are and how the code connects. My onboarding was completely different from what I was told to expect."
    Code Enforcement Officer (new hire), 3 months tenure
  • "As a resident, I was shocked. I asked the city website if I could put a shed in my backyard and it told me the exact rules for my address, the permit I needed, the fee, and linked me to the application. I've never gotten that kind of answer from any government website."
    Citizen feedback submission, Week 6 post-launch

10
Key Learnings

What We Would Do Differently — and What We Would Do Again

Every IntelligenceAmplifier.AI deployment generates insights that inform future engagements. The Meridian Falls deployment was our first full-scale municipal government implementation, and the lessons learned are directly applicable to any city, county, or regional government considering AI deployment.

✓ Do Again

Invest in cross-reference resolution before embedding

Municipal codes are uniquely self-referential. The 4,847 cross-references we resolved before embedding were the single most important data preparation step. Without this, the AI would have returned partial answers that cited sections without explaining what those sections actually say. This step should be non-negotiable for any government deployment.
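
One way to implement this step is to inline the text of each cited section into a chunk before it is embedded, so the chunk is self-contained rather than a bare pointer. This is a minimal sketch; the citation regex and section numbering format are assumptions about how a municipal code cites itself, not the actual Meridian Falls pipeline:

```python
import re

# Hypothetical code corpus keyed by section number (illustrative data).
SECTIONS = {
    "10-2-4": "Accessory structures under 200 sq ft require no permit.",
    "10-2-5": "Setbacks for accessory structures shall follow Section 10-2-4.",
}

# Assumed citation pattern: "Section <title>-<chapter>-<section>".
CITATION = re.compile(r"Section (\d+-\d+-\d+)")

def resolve_cross_references(section_id: str) -> str:
    """Append the text of every cited section so the embedded chunk
    answers the question instead of pointing elsewhere."""
    text = SECTIONS[section_id]
    for ref in CITATION.findall(text):
        if ref in SECTIONS and ref != section_id:
            text += f"\n[Referenced {ref}]: {SECTIONS[ref]}"
    return text

print(resolve_cross_references("10-2-5"))
```

Without this resolution, an embedding of Section 10-2-5 alone would match setback queries but could only answer "see Section 10-2-4."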

✓ Do Again

Launch internal staff portal before the public citizen assistant

Deploying to staff first (Weeks 11–12) gave us four weeks of real-world usage data and prompt refinements before citizens ever saw the system. Staff caught edge cases — superseded ordinances that weren't properly flagged, abbreviations the AI didn't recognize — that would have been embarrassing in a public-facing tool. Internal-first is the right sequence for government.

✓ Do Again

Include the City Attorney from Day 1

The City Attorney's involvement from project kickoff prevented two potential issues: (1) the AI could have presented superseded ordinances as current law without proper temporal filtering, and (2) the citizen-facing disclaimer language was legally reviewed before launch, not after. Government AI requires legal review as a core workstream, not an afterthought.

↺ Improve

Plan for GIS integration complexity earlier

The Esri ArcGIS integration — which enables address-based zoning lookups — required coordination between the IT department, the GIS team, and the Planning department. The API access approval took three weeks longer than planned. Future government deployments will initiate GIS integration approvals in Week 1.
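
The lookup that integration enables amounts to a point-in-polygon query against the city's zoning feature service. The service URL and the ZONE field name below are illustrative assumptions; the parameter names follow the standard ArcGIS REST `query` operation:

```python
# Hedged sketch of an address-based zoning lookup. The service URL and
# ZONE field are assumptions; the query parameters follow the ArcGIS
# REST API's feature-layer query operation.

ZONING_LAYER = "https://gis.example.gov/arcgis/rest/services/Zoning/MapServer/0"

def build_zoning_query(lon: float, lat: float) -> dict:
    """Build params asking which zoning polygon contains a point."""
    return {
        "geometry": f"{lon},{lat}",
        "geometryType": "esriGeometryPoint",
        "inSR": "4326",                       # WGS84 lon/lat
        "spatialRel": "esriSpatialRelIntersects",
        "outFields": "ZONE",
        "returnGeometry": "false",
        "f": "json",
    }

params = build_zoning_query(-122.41, 45.52)
print(params["geometry"])
# A real lookup would first geocode the address to lon/lat, then:
#   requests.get(f"{ZONING_LAYER}/query", params=params).json()
```

The three-week approval delay was for API access to the production service, not for query complexity; the query itself is a single HTTP call once access exists.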

↺ Improve

Create department-specific onboarding, not one universal training

The initial staff training used a single guide for all departments. Planning staff engaged immediately. Parks & Recreation staff found less immediate relevance. Subsequent rollout used department-specific training showing each team's highest-value queries and workflows, which improved adoption measurably in underperforming departments.

The Meridian Falls deployment validated a principle that applies uniquely to government: the AI's value is not measured only in efficiency and cost savings — it is measured in public trust. When citizens get accurate, cited, instant answers to their questions about their government, trust in that government increases. When staff can confidently navigate decades of institutional knowledge in seconds, service quality increases. AI in local government is not about replacing public servants — it is about giving them the tools to serve the public better.


Deploy AI in Your Municipality

Every city, county, and regional government has decades of institutional knowledge. We'll deploy AI trained on yours — securely, privately, and under your complete control.

Deployment and ongoing support by arvintech — Managed IT & AI Services Since 2000