
A technical documentation copy-editing assistant that expedites the Standards publication process, which currently ends in a long, drawn-out, iterative review.

CPC leverages AI to streamline the final copyediting phase of Standards document publication, with the ultimate goal of reducing reliance on external copyeditors and accelerating the refinement process. By automatically identifying inconsistencies in structure, terminology, and formatting, CPC enhances clarity and ensures greater compliance with the authoring guidelines and rules while minimizing costs and turnaround times.
Enterprise Singapore (EnterpriseSG) and the Singapore Standards Council (SSC) currently face a long gestation time in the publication of Standards, because refining Standards documents at the tail end of the publication workflow is time-consuming. The process relies heavily on external copyeditors to review and correct inconsistencies in structure, terminology, and formatting. As the edits involve multiple rounds of review with EnterpriseSG Standards Partners, the result is increased costs, prolonged turnaround timelines, and risks of misinterpretation.
On average, EnterpriseSG reviews 85 drafts annually, each spanning around 100 pages. At a copyediting rate of $15 per page, automating the copyediting process offers potential cost savings of approximately $127,450 per annum. Additionally, the SSC can reduce its dependency on external editors and minimize procurement delays and administrative overhead.
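As a sanity check on the estimate above, the rounded figures (85 drafts, ~100 pages each, $15 per page) can be multiplied out directly; the quoted figure of approximately $127,450 differs slightly from the product of the rounded numbers because actual page counts vary per draft.

```python
# Back-of-envelope estimate of annual copyediting cost savings,
# using the rounded figures quoted above. Actual savings depend on
# real per-draft page counts, so this is an approximation.
DRAFTS_PER_YEAR = 85
PAGES_PER_DRAFT = 100   # average
RATE_PER_PAGE = 15      # dollars

annual_savings = DRAFTS_PER_YEAR * PAGES_PER_DRAFT * RATE_PER_PAGE
print(f"Estimated annual savings: ${annual_savings:,}")  # ~ $127,500
```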
Efficiency Improvement - Automate repetitive copyediting tasks to reduce turnaround time and expedite the Standards publication process.
Cost Reduction - Decrease reliance on external copyeditors and procurement processes.
Consistency Assurance - Maintain uniformity in structure, terminology, and formatting.
User Testing & Feedback
We designed the report layout and conducted usability testing with EnterpriseSG to gather feedback on accuracy and user experience.
Grammar Suggestions: Of the 8 grammar suggestions made by the AI, 3 were accepted by EnterpriseSG and 5 were rejected, primarily due to language preferences rather than errors or misinterpretation by the product.
“Note to entry” Rule: The AI flagged 9 instances of the “Note to entry” rule. Upon review with EnterpriseSG, only 2 of the 9 (22.2%) were accepted as valid corrections. Further clarification with EnterpriseSG revealed that this rule applies only to specific sections of a Standards document, not the entire document.
Improvements Made
Based on feedback from a checkpoint session with EnterpriseSG, we refined the AI prompt for improved accuracy.
With the second version:
Evaluation Metrics
Next Steps
The team consists of 4 software engineers from GDP, 1 UX designer, and 1 product manager from GDT.