AI Statistical Process Control (SPC) System Development

We design and deploy artificial intelligence systems, from prototype to production-ready solutions. Our team combines expertise in machine learning, data engineering, and MLOps to make AI work in real production, not just in the lab.

Development of an AI-based statistical quality control (SPC) system

Statistical Process Control (SPC) is a mathematical quality-monitoring toolkit developed by Walter Shewhart in the 1920s. An AI extension of SPC replaces manual control-chart interpretation with automatic detection of all eight Western Electric rules, adapts control limits to non-stationary processes, and builds multivariate charts for complex production environments.

Classic control charts

Shewhart charts for continuous data:

import numpy as np
import pandas as pd

def compute_xbar_r_chart(data, subgroup_size=5):
    """
    X-bar and R chart: subgroup means and ranges.
    The standard chart pair for manufacturing measurements.
    Supported subgroup sizes: 2-5 (limited by the constant tables below).
    """
    data = np.asarray(data, dtype=float)
    n_subgroups = len(data) // subgroup_size
    subgroups = data[:n_subgroups * subgroup_size].reshape(n_subgroups, subgroup_size)

    xbar = subgroups.mean(axis=1)
    R = subgroups.max(axis=1) - subgroups.min(axis=1)

    # ASTM constants (depend on the subgroup size)
    d2 = {2: 1.128, 3: 1.693, 4: 2.059, 5: 2.326}[subgroup_size]
    D3 = {2: 0, 3: 0, 4: 0, 5: 0}[subgroup_size]
    D4 = {2: 3.267, 3: 2.574, 4: 2.282, 5: 2.114}[subgroup_size]
    A2 = {2: 1.880, 3: 1.023, 4: 0.729, 5: 0.577}[subgroup_size]

    # Center line and control limits
    xbar_cl = xbar.mean()
    R_cl = R.mean()

    xbar_ucl = xbar_cl + A2 * R_cl
    xbar_lcl = xbar_cl - A2 * R_cl
    R_ucl = D4 * R_cl
    R_lcl = D3 * R_cl

    return {
        'xbar': xbar, 'R': R,
        'xbar_cl': xbar_cl, 'xbar_ucl': xbar_ucl, 'xbar_lcl': xbar_lcl,
        'R_cl': R_cl, 'R_ucl': R_ucl, 'R_lcl': R_lcl,
        'sigma_hat': R_cl / d2
    }

CUSUM and EWMA for small displacements:

def ewma_control_chart(data, lambda_param=0.2, L=3.0):
    """
    EWMA detects small (1-2σ) shifts better than an X-bar chart.
    lambda_param: forgetting rate (smaller = longer memory)
    L: control-limit width in sigma units (typically 2.7-3.0)
    """
    data = np.asarray(data, dtype=float)
    n = len(data)
    mean = data[:20].mean()  # baseline from the first 20 points
    std = data[:20].std()

    z = np.zeros(n)
    z[0] = lambda_param * data[0] + (1 - lambda_param) * mean

    for i in range(1, n):
        z[i] = lambda_param * data[i] + (1 - lambda_param) * z[i-1]

    # Control limits (asymptotic, steady-state)
    sigma_z = std * np.sqrt(lambda_param / (2 - lambda_param))
    ucl = mean + L * sigma_z
    lcl = mean - L * sigma_z

    out_of_control = (z > ucl) | (z < lcl)
    return z, ucl, lcl, out_of_control
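
The section title mentions CUSUM but only EWMA is shown above; a tabular CUSUM can be sketched in the same style. This is a minimal sketch, reusing the baseline-from-first-20-points convention of the EWMA example; the parameter defaults (k = 0.5, h = 5.0) are the textbook choices, not values from the original:

```python
import numpy as np

def cusum_chart(data, target=None, k=0.5, h=5.0):
    """
    Tabular CUSUM: accumulates standardized deviations from the target.
    k: reference value in sigma units (half the shift you want to detect)
    h: decision interval in sigma units (typically 4-5)
    """
    data = np.asarray(data, dtype=float)
    if target is None:
        target = data[:20].mean()  # baseline convention as in the EWMA example
    sigma = data[:20].std()

    c_plus = np.zeros(len(data))   # accumulates upward drift
    c_minus = np.zeros(len(data))  # accumulates downward drift
    for i in range(len(data)):
        z = (data[i] - target) / sigma
        prev_p = c_plus[i - 1] if i > 0 else 0.0
        prev_m = c_minus[i - 1] if i > 0 else 0.0
        c_plus[i] = max(0.0, prev_p + z - k)
        c_minus[i] = max(0.0, prev_m - z - k)

    out_of_control = (c_plus > h) | (c_minus > h)
    return c_plus, c_minus, out_of_control
```

With k = 0.5 and h = 5 the chart is tuned to flag a sustained 1σ shift after roughly ten points while keeping false alarms rare.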

Automatic Western Electric Rule Detection

The eight rules that signal an unstable process:

def check_western_electric_rules(data, control_chart):
    """
    Check the Western Electric (WECO) rules; rules 1-5 are implemented below.
    """
    cl = control_chart['cl']
    sigma = control_chart['sigma']
    ucl = cl + 3*sigma
    lcl = cl - 3*sigma

    violations = []

    # Rule 1: one point beyond 3σ
    r1 = np.where((data > ucl) | (data < lcl))[0]
    violations.extend([{'rule': 1, 'index': i, 'description': 'Point beyond 3σ'} for i in r1])

    # Rule 2: 9 consecutive points on the same side of the CL
    for i in range(8, len(data)):
        window = data[i-8:i+1]
        if all(window > cl) or all(window < cl):
            violations.append({'rule': 2, 'index': i, 'description': '9 points same side of CL'})

    # Rule 3: 6 consecutive points trending up or down (monotonic)
    for i in range(5, len(data)):
        window = data[i-5:i+1]
        diffs = np.diff(window)
        if all(diffs > 0) or all(diffs < 0):
            violations.append({'rule': 3, 'index': i, 'description': '6 points monotone trend'})

    # Rule 4: 14 consecutive alternating points (zigzag)
    for i in range(13, len(data)):
        window = data[i-13:i+1]
        alternating = all(
            (window[j] - window[j-1]) * (window[j+1] - window[j]) < 0
            for j in range(1, len(window)-1)
        )
        if alternating:
            violations.append({'rule': 4, 'index': i, 'description': '14 alternating points'})

    # Rule 5: 2 of 3 consecutive points beyond 2σ on the same side
    for i in range(2, len(data)):
        window = data[i-2:i+1]
        if sum(window > cl + 2*sigma) >= 2 or sum(window < cl - 2*sigma) >= 2:
            violations.append({'rule': 5, 'index': i, 'description': '2 of 3 beyond 2σ'})

    # Rules 6-8 follow the same sliding-window pattern...
    return violations
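
For completeness, here is a standalone sketch of the remaining rules under the standard WECO definitions (rule 6: 4 of 5 consecutive points beyond 1σ on the same side; rule 7: 15 in a row within 1σ of the CL, i.e. stratification; rule 8: 8 in a row beyond 1σ on either side, i.e. mixture). The function signature is an illustration, not part of the original code:

```python
import numpy as np

def check_weco_rules_6_8(data, cl, sigma):
    """Sliding-window checks for WECO rules 6-8."""
    data = np.asarray(data, dtype=float)
    violations = []

    # Rule 6: 4 of 5 consecutive points beyond 1σ, same side
    for i in range(4, len(data)):
        w = data[i-4:i+1]
        if np.sum(w > cl + sigma) >= 4 or np.sum(w < cl - sigma) >= 4:
            violations.append({'rule': 6, 'index': i, 'description': '4 of 5 beyond 1σ'})

    # Rule 7: 15 consecutive points within 1σ of the CL (stratification)
    for i in range(14, len(data)):
        w = data[i-14:i+1]
        if np.all(np.abs(w - cl) < sigma):
            violations.append({'rule': 7, 'index': i, 'description': '15 points within 1σ'})

    # Rule 8: 8 consecutive points beyond 1σ, either side (mixture)
    for i in range(7, len(data)):
        w = data[i-7:i+1]
        if np.all(np.abs(w - cl) > sigma):
            violations.append({'rule': 8, 'index': i, 'description': '8 points beyond 1σ'})

    return violations
```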

Multivariate SPC (Hotelling T²)

For correlated quality parameters:

from scipy.stats import chi2

def hotelling_t2_chart(X, phase1_data):
    """
    T² control chart for multivariate data.
    Accounts for correlations between quality parameters.
    """
    # Phase I: estimate parameters from in-control data
    mean = phase1_data.mean(axis=0)
    cov = np.cov(phase1_data.T)
    cov_inv = np.linalg.inv(cov)

    # T² statistic for each new point
    T2 = []
    for x in X:
        deviation = x - mean
        t2 = deviation @ cov_inv @ deviation
        T2.append(t2)

    T2 = np.array(T2)

    # Control limit: chi2 quantile (or the exact F-distribution limit)
    p = X.shape[1]  # number of variables
    alpha = 0.0027  # the 3σ-equivalent false-alarm rate
    ucl = chi2.ppf(1 - alpha, df=p)

    out_of_control = T2 > ucl
    return T2, ucl, out_of_control

T² violation decomposition: when the T² chart signals, you still need to determine which variable is responsible:

def decompose_t2_violation(x_new, mean, cov_inv, p):
    """
    Decomposition: each variable's contribution to the overall T²
    """
    cov = np.linalg.inv(cov_inv)  # recover the covariance matrix once
    total_t2 = (x_new - mean) @ cov_inv @ (x_new - mean)

    contributions = []
    for j in range(p):
        # T² with the j-th variable removed
        mask = [i for i in range(p) if i != j]
        x_reduced = x_new[mask]
        mean_reduced = mean[mask]
        cov_inv_reduced = np.linalg.inv(cov[np.ix_(mask, mask)])
        t2_reduced = (x_reduced - mean_reduced) @ cov_inv_reduced @ (x_reduced - mean_reduced)

        contribution = total_t2 - t2_reduced
        contributions.append({'variable': j, 'contribution': contribution})

    return sorted(contributions, key=lambda x: x['contribution'], reverse=True)

Adaptive Control Limits

Adaptation to non-stationary processes:

class AdaptiveSPCChart:
    """
    Dynamic control limits for slowly drifting processes
    (raw-material changes, tool wear)
    """
    def __init__(self, adaptation_rate=0.05, min_phase1_samples=50):
        self.adaptation_rate = adaptation_rate  # how quickly the limits adapt
        self.min_phase1_samples = min_phase1_samples
        self.phase1_complete = False
        self.history = []

    def update(self, new_value):
        self.history.append(new_value)

        if len(self.history) < self.min_phase1_samples:
            return None  # still accumulating Phase I data

        if not self.phase1_complete:
            n = self.min_phase1_samples
            self.mean = np.mean(self.history[-n:])
            self.std = np.std(self.history[-n:])
            self.phase1_complete = True
        else:
            # Slow EWMA adaptation of the mean (tracks normal drift)
            self.mean = (1 - self.adaptation_rate) * self.mean + self.adaptation_rate * new_value
            # Adapt std via an exponential moving average of the squared deviation
            self.std = np.sqrt(
                (1 - self.adaptation_rate) * self.std**2 +
                self.adaptation_rate * (new_value - self.mean)**2
            )

        ucl = self.mean + 3 * self.std
        lcl = self.mean - 3 * self.std

        return {
            'value': new_value,
            'cl': self.mean, 'ucl': ucl, 'lcl': lcl,
            'out_of_control': new_value > ucl or new_value < lcl
        }

Integration with production

MES integration: the SPC system receives measurements in real time from the MES or directly from measuring equipment (CMMs, spectrometers, test benches). When an alarm is triggered:

  • Automatic batch locking for inspection
  • Notification of the operator and technologist
  • Creating a Non-Conformance Report (NCR) in QMS
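
The three actions above can be sketched as a single alarm handler. This is purely illustrative: the client objects and their method names (hold_batch, notify, create_ncr) are hypothetical placeholders, not a real MES or QMS API.

```python
from dataclasses import dataclass

@dataclass
class SPCAlarm:
    batch_id: str
    rule: int      # which WECO rule fired
    value: float   # the offending measurement

def handle_alarm(alarm, mes_client, notifier, qms_client):
    """Dispatch the three reaction steps; all client APIs are hypothetical."""
    # 1. Lock the batch for inspection
    mes_client.hold_batch(alarm.batch_id)
    # 2. Notify the operator and the process engineer
    notifier.notify(f"SPC rule {alarm.rule} violated on batch {alarm.batch_id}: {alarm.value}")
    # 3. Open a Non-Conformance Report in the QMS
    qms_client.create_ncr(batch_id=alarm.batch_id, rule=alarm.rule, value=alarm.value)
```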

Process capability (Cp, Cpk):

def process_capability(data, lsl, usl):
    """
    Cp: potential capability (process spread vs. spec width)
    Cpk: actual capability (accounts for an off-center mean)
    """
    mean = np.mean(data)
    std = np.std(data, ddof=1)

    cp = (usl - lsl) / (6 * std)
    cpu = (usl - mean) / (3 * std)
    cpl = (mean - lsl) / (3 * std)
    cpk = min(cpu, cpl)

    return {'cp': cp, 'cpk': cpk, 'mean': mean, 'std': std}
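
To illustrate the Cp/Cpk distinction on simulated data (the spec limits and process parameters below are invented for the example): an off-center process keeps its Cp but loses Cpk.

```python
import numpy as np

# Simulated measurements from an off-center process: the true mean (11.0)
# sits closer to the upper spec limit (13.0), so Cpk drops below Cp.
rng = np.random.default_rng(42)
data = rng.normal(loc=11.0, scale=0.5, size=500)
lsl, usl = 7.0, 13.0

mean = np.mean(data)
std = np.std(data, ddof=1)

cp = (usl - lsl) / (6 * std)           # potential: looks at spread only
cpk = min((usl - mean) / (3 * std),    # actual: the worse of the two
          (mean - lsl) / (3 * std))    # one-sided indices
```

Here Cp comes out near 2.0 (the spread would fit the spec window twice over) while Cpk lands near 1.33, exposing the off-center mean; Cpk ≥ 1.33 is a common acceptance threshold.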

Timeline: X-bar/R charts, WECO rules, alerts, and an MES connector take 3-4 weeks. EWMA/CUSUM, multivariate T², adaptive limits, process capability, and QMS integration take 2-3 months.