A/B testing of website page elements

Our company develops, supports, and maintains websites of any complexity, from simple one-page sites to large-scale clustered systems built on microservices. Our developers' expertise is confirmed by vendor certificates.
Development and maintenance of all types of websites:
  • Informational websites and web applications: business card websites, landing pages, corporate websites, online catalogs, quizzes, promo websites, blogs, news resources, informational portals, forums, aggregators
  • E-commerce websites and web applications: online stores, B2B portals, marketplaces, online exchanges, cashback websites, dropshipping platforms, product parsers
  • Business process management web applications: CRM systems, ERP systems, corporate portals, production management systems, information parsers
  • Electronic service websites and web applications: classified ads platforms, online schools, online cinemas, website builders, portals for electronic services, video hosting platforms, thematic portals

These are only some of the types of websites we work with; each can have its own specific features and functionality and can be customized to the client's particular needs and goals.

A/B Testing Page Elements

A/B testing is a controlled experiment in which one group of users sees the original version (control) and another sees a modified version (variant). A statistically significant difference in conversion indicates that the change, rather than random variation, caused the observed effect.

What Can Be Tested

  • Headline and CTA button text
  • Button color, size, and placement
  • Form: number of fields, field order, labels
  • Hero images and videos
  • Social proof (testimonials, counters)
  • Trust elements (security badges, guarantees)
  • Pricing and price display

Minimum Sample Size

Before launching a test, calculate the required sample size:

from scipy import stats
import math

def calculate_sample_size(baseline_rate, min_detectable_effect, alpha=0.05, power=0.8):
    """
    baseline_rate: current conversion (0.05 = 5%)
    min_detectable_effect: minimum effect (0.10 = 10% relative improvement)
    """
    p1 = baseline_rate
    p2 = baseline_rate * (1 + min_detectable_effect)

    z_alpha = stats.norm.ppf(1 - alpha / 2)
    z_beta = stats.norm.ppf(power)

    p_avg = (p1 + p2) / 2

    n = (z_alpha * math.sqrt(2 * p_avg * (1 - p_avg)) +
         z_beta * math.sqrt(p1 * (1 - p1) + p2 * (1 - p2))) ** 2 / (p2 - p1) ** 2

    return math.ceil(n)

# Example: 3% conversion, want to detect 15%+ improvement
n = calculate_sample_size(0.03, 0.15)
print(f"Need {n} users per variant = {n*2} total")
# Need 24193 users per variant = 48386 total
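
Once the sample size is known, you can estimate how long the test has to run. A minimal sketch, assuming a hypothetical daily traffic figure and a 50/50 split (the function name and numbers are illustrative):

import math

def estimate_test_duration(sample_size_per_variant, daily_visitors, traffic_share=1.0):
    """Days needed to collect the sample, with traffic split 50/50 between variants."""
    visitors_per_variant_per_day = daily_visitors * traffic_share / 2
    return math.ceil(sample_size_per_variant / visitors_per_variant_per_day)

# Hypothetical example: 2,000 visitors/day, all of them included in the test
days = estimate_test_duration(24193, daily_visitors=2000)
print(f"Run the test for at least {days} days")  # 25 days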

Client-Side A/B Test Implementation

// Simple A/B test without external tools
function getVariant(testName) {
  const stored = localStorage.getItem(`ab_${testName}`)
  if (stored) return stored

  const variant = Math.random() < 0.5 ? 'control' : 'variant'
  localStorage.setItem(`ab_${testName}`, variant)

  // Send to analytics
  gtag('event', 'ab_test_assignment', {
    test_name: testName,
    variant: variant
  })

  return variant
}

// Apply test
const variant = getVariant('cta_button_color')
const ctaButton = document.getElementById('main-cta')

if (variant === 'variant') {
  ctaButton.style.backgroundColor = '#FF6B35'  // orange instead of blue
  ctaButton.textContent = 'Start for Free'
} else {
  // Keep default
}

// Track conversion
ctaButton.addEventListener('click', () => {
  gtag('event', 'cta_click', {
    test_name: 'cta_button_color',
    variant: variant
  })
})

Server-Side A/B Test (Preferred)

Client-side tests can cause a visible flicker: the page first renders the control and then repaints once the variant is applied. Server-side assignment avoids this:

// Laravel Middleware: assign variant before rendering
use Closure;
use Illuminate\Http\Request;
use Illuminate\Support\Facades\View;

class AbTestMiddleware
{
    public function handle(Request $request, Closure $next)
    {
        $testName = 'checkout_form_v2';
        $userId = auth()->id() ?? $request->session()->getId();

        // Deterministic assignment by user ID
        $variant = (crc32($userId . $testName) % 2 === 0) ? 'control' : 'variant';

        $request->merge(['ab_variants' => [$testName => $variant]]);
        View::share('ab_variants', [$testName => $variant]);

        $response = $next($request);
        $response->headers->set('X-AB-Variant', $variant);

        return $response;
    }
}
{{-- In template --}}
@if($ab_variants['checkout_form_v2'] === 'variant')
    @include('checkout.form-v2')
@else
    @include('checkout.form-v1')
@endif
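
The same deterministic bucketing works in any language and extends naturally to uneven traffic splits. A Python sketch of the idea (the function name and the 10% rollout figure are illustrative, not part of the Laravel code above):

import zlib

def assign_variant(user_id: str, test_name: str, variant_share: float = 0.5) -> str:
    """Deterministic assignment: the same user always lands in the same bucket."""
    bucket = zlib.crc32(f"{user_id}{test_name}".encode()) % 100
    return "variant" if bucket < variant_share * 100 else "control"

# Same input always gives the same answer, so no session storage is needed
print(assign_variant("session-abc123", "checkout_form_v2"))        # 50/50 split
print(assign_variant("session-abc123", "checkout_form_v2", 0.10))  # 10% rollout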

Results Analysis

from statsmodels.stats.proportion import proportions_ztest
import numpy as np

def analyze_ab_test(control_visitors, control_conversions,
                    variant_visitors, variant_conversions):
    # Conversion rates
    cr_control = control_conversions / control_visitors
    cr_variant = variant_conversions / variant_visitors
    relative_change = (cr_variant - cr_control) / cr_control * 100

    # Z-test for proportions
    count = np.array([variant_conversions, control_conversions])
    nobs = np.array([variant_visitors, control_visitors])
    z_stat, p_value = proportions_ztest(count, nobs)

    print(f"Control: {cr_control:.2%} ({control_conversions}/{control_visitors})")
    print(f"Variant: {cr_variant:.2%} ({variant_conversions}/{variant_visitors})")
    print(f"Relative change: {relative_change:+.1f}%")
    print(f"P-value: {p_value:.4f}")
    print(f"Statistically significant: {'YES' if p_value < 0.05 else 'NO'}")

analyze_ab_test(
    control_visitors=3842, control_conversions=115,
    variant_visitors=3891, variant_conversions=148
)
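
The p-value alone says nothing about the size of the uplift. A sketch that adds a 95% confidence interval for the absolute difference in conversion, using the standard normal approximation (the function name is illustrative):

import numpy as np
from scipy.stats import norm

def diff_confidence_interval(control_visitors, control_conversions,
                             variant_visitors, variant_conversions,
                             confidence=0.95):
    """Confidence interval for the absolute difference in conversion rates."""
    p1 = control_conversions / control_visitors
    p2 = variant_conversions / variant_visitors
    # Unpooled standard error of the difference of two proportions
    se = np.sqrt(p1 * (1 - p1) / control_visitors +
                 p2 * (1 - p2) / variant_visitors)
    z = norm.ppf(1 - (1 - confidence) / 2)
    diff = p2 - p1
    return diff - z * se, diff + z * se

low, high = diff_confidence_interval(3842, 115, 3891, 148)
print(f"95% CI for the absolute uplift: [{low:+.2%}, {high:+.2%}]")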

Common Mistakes

  • Stopping the test early — the p-value fluctuates wildly at first; run the test until the required sample size is reached (see the simulation after this list)
  • Testing multiple changes at once — it becomes unclear which change caused the effect; that calls for multivariate testing instead
  • Ignoring segments — a test can look neutral overall while, for example, improving mobile conversion by 25%
  • Not accounting for seasonality — run the test over a representative period
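
Why early stopping is dangerous can be shown with a small simulation: two identical variants (an A/A test) are compared every day, so any "significant" result is a false positive by construction. A sketch with hypothetical traffic numbers:

import numpy as np
from scipy.stats import norm

rng = np.random.default_rng(42)

def daily_peeking_fpr(n_experiments=1000, daily_visitors=200, days=30, p=0.03):
    """Share of A/A tests declared 'significant' when checked after every day."""
    false_positives = 0
    for _ in range(n_experiments):
        # Cumulative conversion counts per day for two identical variants
        a = rng.binomial(daily_visitors, p, size=days).cumsum()
        b = rng.binomial(daily_visitors, p, size=days).cumsum()
        n = daily_visitors * np.arange(1, days + 1)
        pooled = (a + b) / (2 * n)
        se = np.sqrt(pooled * (1 - pooled) * 2 / n)
        with np.errstate(divide="ignore", invalid="ignore"):
            z = ((b - a) / n) / se
        p_values = 2 * (1 - norm.cdf(np.abs(z)))
        # Stopping at the first 'significant' day inflates the error rate
        if np.nanmin(p_values) < 0.05:
            false_positives += 1
    return false_positives / n_experiments

print(f"False positive rate with daily peeking: {daily_peeking_fpr():.0%}")
# Far above the nominal 5% — do not stop at the first significant p-value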

Delivery Time

Setting up an A/B test with a server-side split, analytics integration, and results analysis takes 2–4 business days.