
{build} Hackathon & Incubator



© 2026 Government Technology Agency of Singapore | GovTech

Confirm-Plus-Chop

Technical documentation copyediting assistant to expedite the Standards publication process, which currently involves a long, drawn-out, iterative review at the tail end.

Booth DA7


ConfirmPlusChop (CPC)

Improving the Enterprise Singapore Standards copyediting process with AI for faster, clearer, and more consistent Standards documents for publication

Summary

CPC leverages AI to streamline the final copyediting phase of Standards document publication, with the ultimate goal of reducing reliance on external copyeditors and accelerating the refinement process. By automatically identifying inconsistencies in structure, terminology, and formatting, CPC enhances clarity and ensures greater compliance with the authoring guidelines and rules while minimizing costs and turnaround times.

Background

Enterprise Singapore (EnterpriseSG) and the Singapore Standards Council (SSC) currently face a long gestation time in the publication of Standards, because refining Standards documents at the tail end of the publication workflow is time-consuming. The process relies heavily on external copyeditors to review and edit for inconsistencies in structure, terminology, and formatting. Because these edits involve multiple rounds of review with EnterpriseSG Standards Partners, they lead to increased costs, prolonged turnaround timelines, and risks of misinterpretation.

On average, EnterpriseSG reviews 85 drafts annually, with each draft spanning around 100 pages. At a copyediting rate of $15 per page, automating the copyediting process offers potential cost savings of approximately $127,450 per annum. Additionally, the SSC can reduce dependency on external editors and minimize procurement delays and administrative overhead.
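The savings estimate above follows from simple arithmetic on the quoted figures. A minimal sanity check, using the draft count, page count, and page rate stated in the text (the quoted $127,450 presumably reflects a slightly lower effective page count than the round 100-page average):

```python
# Sanity check of the quoted cost-savings estimate, using the
# figures stated in the text.
drafts_per_year = 85
pages_per_draft = 100   # "around 100 pages" per draft
rate_per_page = 15      # $15 per page

annual_copyediting_cost = drafts_per_year * pages_per_draft * rate_per_page
print(f"Estimated annual copyediting spend: ${annual_copyediting_cost:,}")
# Prints $127,500, close to the approximately $127,450 per annum quoted.
```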

Product Goals

  1. Efficiency Improvement - Automate repetitive copyediting tasks to reduce turnaround time and expedite the Standards publication process.

  2. Cost Reduction - Decrease reliance on external copyeditors and procurement processes.

  3. Consistency Assurance - Maintain uniformity in structure, terminology, and formatting.
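To make goal 1 concrete, many of the repetitive checks being automated can be expressed as simple rules over the draft text. The sketch below is an illustrative stand-in, not the product's actual logic (CPC applies its rules via LLM prompts, per the Approach section); the simplified "Note to entry" rule and sample draft here are assumptions for demonstration only:

```python
# Illustrative sketch of the kind of repetitive consistency check
# CPC automates. The rule below (flag unnumbered "Note to entry:"
# labels) is a simplified stand-in, not the tool's actual logic.
import re

NOTE_RULE = re.compile(r"Note to entry:", re.IGNORECASE)

def flag_note_lines(text):
    """Return (line_number, line) pairs that match the simplified rule."""
    return [(i, line) for i, line in enumerate(text.splitlines(), 1)
            if NOTE_RULE.search(line)]

draft = "3.1 term\nNote to entry: informal note\nNote 1 to entry: ok"
for lineno, line in flag_note_lines(draft):
    print(f"line {lineno}: check numbering in {line!r}")
# Flags only line 2; the numbered form on line 3 passes.
```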

Approach

  1. AI Tool Exploration
  • Evaluated various AI tools, both government and commercial, to identify the most suitable solution.
  • Also evaluated the option of a locally hosted LLM, which would allow greater control over model training and data privacy.
  • Shortlisted two candidates, OpenAI (commercial) and GovText (government), and will be using prompts for this POC.
  2. Prototype Development
  • Built a web app that processes Standards drafts and generates a comprehensive report highlighting inconsistencies and areas for improvement.
  3. User Testing & Feedback: We designed the report layout and conducted usability testing with EnterpriseSG to gather feedback on accuracy and user experience.

    • Grammar Suggestions: Of the 8 grammar suggestions made by the AI, 3 were accepted by EnterpriseSG, while 5 were rejected, primarily due to language preferences rather than errors or misinterpretation by the product.

    • “Note to entry” Rule: The AI flagged 9 instances of the “Note to entry” rule. Upon review with EnterpriseSG, only 2 of 9 (22.2%) were accepted as valid corrections. Further clarification with EnterpriseSG revealed that the specific rule used by the product applies only to certain sections of the Standards document, not the entire document.

  4. Improvements Made: Based on the feedback provided during a checkpoint session with EnterpriseSG, we refined the AI prompt for improved accuracy.

    With the second version:

    • The AI generated 10 recommendations based on the “Note to entry” rule, yielding a success rate of 50% (5 of 10 correctly flagged), roughly a 2x improvement in accuracy.
    • Grammar and content suggestions also saw a significant rise in acceptance rate, from 37.5% (3 of 8) to 73.3% (11 of 15), demonstrating roughly a twofold improvement.
    • While not perfect, these show improvement and indicate we are on the right track.
  5. Evaluation Metrics

    • Measured effectiveness by comparing AI-generated edits with those made by human copyeditors to gauge accuracy and precision.
    • Collected user feedback to assess whether the POC addresses key pain points.
    • Conducted a review of AI-generated changes with the officers to validate the improvements.
  6. Next Steps

    • AI model training: Investigate the feasibility of training a locally hosted LLM, instead of relying solely on prompts, to improve contextual understanding and accuracy.
    • Integration and deployment: Work with EnterpriseSG to integrate this tool into their online drafting software and evaluate performance.
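The acceptance-rate figures reported in the user-testing and evaluation steps above reduce to a simple ratio: suggestions accepted by the EnterpriseSG reviewers over suggestions made. A minimal sketch of that metric, where the `Suggestion` type and the sample data are illustrative assumptions and not taken from the actual CPC implementation:

```python
# Minimal sketch of the acceptance-rate metric reported above.
# The Suggestion type and sample data are illustrative assumptions,
# not taken from the actual CPC implementation.
from dataclasses import dataclass

@dataclass
class Suggestion:
    rule: str       # e.g. "grammar" or "note-to-entry"
    accepted: bool  # outcome of the EnterpriseSG review

def acceptance_rate(suggestions, rule=None):
    """Share of AI suggestions the reviewers accepted, optionally per rule."""
    pool = [s for s in suggestions if rule is None or s.rule == rule]
    return sum(s.accepted for s in pool) / len(pool) if pool else 0.0

# First round: 3 of 8 grammar suggestions accepted.
round_one = [Suggestion("grammar", i < 3) for i in range(8)]
print(f"{acceptance_rate(round_one, 'grammar'):.1%}")  # prints 37.5%
```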

Team

The team consists of four software engineers from GDP, and one UX designer and one product manager from GDT.