Enhancing Business Value with Automation: Practical Team Practices

Learn how the QJIAYI Tech Quality Team increases the business value of automation through practical team practices: 10,000+ test cases in continuous regression, 80+ bugs detected per month, and a playbook for turning automation into a business-driven capability.

Introduction

Automation has become a core part of modern quality assurance and R&D efficiency. However, many teams struggle with low defect-detection rates, high maintenance costs, tools that don't fit their real business scenarios, and difficulty proving real business value.

After years of practice and iteration, the QJIAYI Quality Team has built a mature automation system that now runs 10,000+ UI and API test cases in continuous regression and detects more than 80 bugs per month. This article shares how we transformed automation from a costly experiment into a stable, business-driven capability.

Common Challenges in Test Automation

Many teams face similar pain points when building automation systems:

  • High volumes of automated test cases, but few real defects found.

  • Frequent business changes lead to heavy script maintenance.

  • New frameworks and tools often fail to adapt to real business scenarios.

  • Stakeholders doubt the value of automation and are reluctant to invest.

  • Long-term investment does not meet expected return on investment.

Teams often jump between new tools and frameworks without solving fundamental problems, creating a cycle of inefficiency. At QJIAYI, we broke this cycle by aligning automation closely with business goals and team workflows.

Align Automation Goals with Business Stages

Rather than chasing universal metrics such as code coverage, pass rate, or CI compliance, we believe automation goals must match the business lifecycle.

Lessons from Real-World Practice

  • In early-stage products, over-emphasizing high test coverage can slow iteration and hurt customer experience.

  • Neglecting automation during rapid growth leads to online failures, regressions, and damaged brand reputation.

We structured our automation roadmap into three progressive stages:

1. Tool-Driven Stage

We built foundational frameworks, including:

  • Apollo API Automation Framework

  • Hades UI Automation Framework

  • Data Regression Platform

The goal was to replace repetitive manual tests with stable script execution and improve case writing efficiency.
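To make "replacing repetitive manual tests with stable script execution" concrete, here is a minimal sketch of an API regression check in Python. The `/orders` payload shape, field names, and allowed statuses are invented for illustration; separating validation from the HTTP call keeps the check fast and deterministic.

```python
# Hypothetical regression check for an order API response.
# The payload schema below is an illustrative assumption, not a real API.

def validate_order_payload(payload: dict) -> list[str]:
    """Return a list of problems found in an order response; empty means OK."""
    problems = []
    # Required fields a manual tester would otherwise eyeball every release.
    for field in ("id", "status", "total"):
        if field not in payload:
            problems.append(f"missing field: {field}")
    # Status must be one of the known lifecycle states.
    if payload.get("status") not in {"created", "paid", "shipped"}:
        problems.append(f"unexpected status: {payload.get('status')}")
    # Totals must be non-negative numbers.
    total = payload.get("total")
    if not isinstance(total, (int, float)) or total < 0:
        problems.append("total must be a non-negative number")
    return problems
```

Checks like this run identically on every regression pass, which is exactly what made them a better fit than manual spot checks at this stage.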

2. Platform-Driven Stage

We developed a unified automation platform for:

  • Centralized test case management

  • Standardized execution scheduling

  • Automated reporting and metric analysis

This platform allowed more team members to participate and improved overall efficiency.
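As a rough sketch of the metric analysis a unified platform performs, the helper below aggregates run records into a pass rate and the most frequently failing cases. The record format (dicts with `case` and `passed` keys) is an illustrative assumption, not QJIAYI's actual schema.

```python
from collections import Counter

# Illustrative aggregation over run records shaped like
# {"case": "<name>", "passed": <bool>} -- an assumed schema.

def summarize_runs(runs: list[dict]) -> dict:
    """Compute pass rate and the top recurring failures across a set of runs."""
    total = len(runs)
    passed = sum(1 for r in runs if r["passed"])
    failures = Counter(r["case"] for r in runs if not r["passed"])
    return {
        "total": total,
        "pass_rate": round(passed / total, 3) if total else 0.0,
        "top_failures": failures.most_common(3),  # cases to investigate first
    }
```

Surfacing "top failures" alongside the pass rate is what lets a platform direct attention, rather than just report a number.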

3. Engineer-Driven Stage

Test engineers took ownership to optimize systems, expand scenarios, and innovate solutions:

  • Scenario-based and data-driven testing

  • Image comparison, JSON validation, and data consistency checks

  • Platform-based testing for internal packages

This stage turned automation from a tool into a team-driven capability.
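In the spirit of the JSON validation and data consistency checks above, here is a small deep-comparison helper that skips volatile keys such as timestamps and generated IDs. The ignore-list approach is an assumption for illustration, not QJIAYI's exact implementation.

```python
# Sketch: deep-compare two JSON-like values while ignoring volatile keys.
# The ignored-key set ("id", "ts", etc.) is caller-supplied and illustrative.

def json_equal(a, b, ignore=frozenset()) -> bool:
    """Return True if a and b match, skipping dict keys listed in `ignore`."""
    if isinstance(a, dict) and isinstance(b, dict):
        keys = (set(a) | set(b)) - set(ignore)
        return all(json_equal(a.get(k), b.get(k), ignore) for k in keys)
    if isinstance(a, list) and isinstance(b, list):
        return len(a) == len(b) and all(
            json_equal(x, y, ignore) for x, y in zip(a, b)
        )
    return a == b
```

Filtering volatile fields at comparison time is one way to cut the "comparison noise" discussed in the next section.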

Strategic Automation for Complex Business

One-size-fits-all automation rarely works in complex systems. At QJIAYI, our business spans design tools, merchant backends, open platforms, mini-programs, and international services. We use different automation strategies for different technical architectures, including backend services, open APIs, front-end components, and plugins.

Lessons from Long-Chain Automation Attempts

We once built an end-to-end automation system covering design, generation, and data production. While it detected real issues, we faced:

  • Unstable data and inconsistent IDs

  • High comparison noise and maintenance costs

  • Difficulty generalizing front-end interactions

  • High cross-team collaboration costs

This experience taught us to:

  • Strengthen front-end data validation

  • Conduct in-depth feasibility research before implementation

  • Consider both technical and organizational challenges

How We Implemented Automation Effectively

In the early stages, our metrics looked good (high coverage, high pass rate), but stakeholders still questioned the value of automation. We took targeted actions:

1. Work Backwards from Online Failures

We reviewed every missed bug to identify gaps in validation coverage and scenario design, and to eliminate false positives. This made automation more targeted.

2. Review Automation Code Regularly

Regular reviews improved stability, reduced redundancy, and standardized development practices.

3. Integrate Automation into Stability Projects

For high-risk businesses, automation became part of project goals from the start. We worked with developers to simplify data construction and improve testability.

4. Optimize CI Stability

We tracked frequent failures, fixed environmental issues, and reduced non-bug CI blockages. This made automation trusted and efficient.
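A minimal sketch of the failure tracking described above: flag cases whose recent CI history mixes passes and failures, since those are more likely environment or flakiness issues than product bugs. The history format (case name mapped to a list of booleans, newest last) is an assumption for the example.

```python
# Sketch: identify likely-flaky cases from recent CI history.
# History shape {case: [bool, ...]} (newest result last) is assumed.

def find_flaky(history: dict[str, list[bool]], window: int = 10) -> list[str]:
    """Return cases that both passed and failed within the last `window` runs."""
    flaky = []
    for case, results in history.items():
        recent = results[-window:]
        if True in recent and False in recent:
            flaky.append(case)
    return sorted(flaky)
```

Quarantining or stabilizing these cases first is what removes most non-bug CI blockages, and in turn restores the team's trust in red builds.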

Deepening the Business Value of Automation

Once automation was stable, we amplified its impact:

  • Recognized and rewarded teams for automation innovation

  • Used major tech projects to test and improve automation capabilities

  • Encouraged internal sharing of practical methods

  • Connected automation failures directly to bug tickets with logs and quick re-run functions

These steps turned automation into a trustworthy, indispensable part of the development pipeline.
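To illustrate connecting a failure to a bug ticket with logs and a re-run link, here is a hypothetical payload builder. The tracker fields and the re-run URL scheme are invented for the example and do not reflect a real tracker's API.

```python
# Sketch: assemble a bug-ticket payload from a failed automation run.
# All field names and the ci.example.com URL are hypothetical.

def build_ticket(case: str, error: str, log_tail: list[str], run_id: int) -> dict:
    """Build a ticket body carrying the failure context and a one-click re-run."""
    return {
        "title": f"[automation] {case} failed",
        "description": error,
        # Attach only the tail of the log to keep the ticket readable.
        "attachments": {"log.txt": "\n".join(log_tail[-50:])},
        "rerun_url": f"https://ci.example.com/rerun/{run_id}",
    }
```

Bundling the log tail and re-run link into the ticket is what makes triage cheap enough that failures actually get acted on.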

Conclusion

Building automation is easy. Building a sustainable, business-aligned automation system is difficult. Our key takeaways:

  • Match automation goals to business stages, not just KPIs.

  • Collaborate closely with product and R&D teams.

  • Build your automation platform like a real product—with tools, training, and operation mechanisms.

  • Learn from failures and iterate continuously.

When you focus on making automation stable, effective, and business-oriented, your team and partners will follow.
