Setting Up Logging (Graylog) for Your Web Application

Graylog occupies a niche between ELK (powerful but complex) and Loki (simple but limited). It ships with a built-in web interface for search, alerting, and dashboards, so there is no separate Kibana-style component. It is a good fit for teams that need centralized log management without deep customization.

Architecture: Graylog ← MongoDB (configuration) + OpenSearch/Elasticsearch (data)

Deployment

# docker-compose.yml
version: '3.8'
services:
  mongodb:
    image: mongo:6.0
    volumes:
      - mongo_data:/data/db

  opensearch:
    image: opensearchproject/opensearch:2.12.0
    environment:
      - cluster.name=graylog
      - discovery.type=single-node
      - plugins.security.disabled=true
      - "OPENSEARCH_JAVA_OPTS=-Xms2g -Xmx2g"
      - bootstrap.memory_lock=true
    ulimits:
      memlock: { soft: -1, hard: -1 }
    volumes:
      - os_data:/usr/share/opensearch/data

  graylog:
    image: graylog/graylog:6.0
    environment:
      - GRAYLOG_PASSWORD_SECRET=your_random_64_char_secret
      # echo -n "admin_password" | sha256sum
      - GRAYLOG_ROOT_PASSWORD_SHA2=your_sha256_password_hash
      - GRAYLOG_HTTP_EXTERNAL_URI=http://graylog.example.com:9000/
      - GRAYLOG_MONGODB_URI=mongodb://mongodb:27017/graylog
      - GRAYLOG_ELASTICSEARCH_HOSTS=http://opensearch:9200
    ports:
      - "9000:9000"     # Web UI
      - "12201:12201"   # GELF UDP
      - "12201:12201/udp"
      - "5044:5044"     # Beats
      - "514:514/udp"   # Syslog UDP
    depends_on:
      - mongodb
      - opensearch

volumes:
  mongo_data:
  os_data:
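
Bring the stack up and check that the Graylog API answers before wiring anything else. A minimal smoke test, assuming the default port mapping above and the admin password you hashed:

docker compose up -d
# OpenSearch and Graylog need a minute or two to start
curl -s -u admin:your_admin_password http://localhost:9000/api/system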

Generate GRAYLOG_PASSWORD_SECRET:

pwgen -N 1 -s 96
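
If pwgen is not installed, openssl produces an equivalent random secret (96 hex characters):

openssl rand -hex 48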

Hash the password:

echo -n "your_admin_password" | sha256sum | awk '{print $1}'

Input Sources (Inputs)

Graylog receives logs through Inputs — configured in System → Inputs:

GELF UDP (recommended for applications):

  • Port: 12201
  • Possible loss at high load (UDP), but minimal overhead

GELF TCP (more reliable):

  • Port: 12201
  • Use if delivery guarantee is critical

Beats (for Filebeat):

  • Port: 5044

Syslog UDP/TCP:

  • For system logs and network equipment
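
Inputs can also be created without clicking through the UI. A sketch against the REST API, assuming the admin credentials from the compose file; the X-Requested-By header is required by Graylog for state-changing requests:

curl -s -u admin:your_admin_password \
  -H 'Content-Type: application/json' -H 'X-Requested-By: setup-script' \
  -X POST http://localhost:9000/api/system/inputs \
  -d '{
        "title": "GELF UDP",
        "type": "org.graylog2.inputs.gelf.udp.GELFUDPInput",
        "global": true,
        "configuration": {"bind_address": "0.0.0.0", "port": 12201, "recv_buffer_size": 262144}
      }'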

Sending Logs from Laravel

Via GELF (Graylog's native protocol):

composer require graylog2/gelf-php

// app/Logging/GraylogLogger.php

namespace App\Logging;

use Gelf\Publisher;
use Gelf\Transport\UdpTransport;
use Monolog\Handler\GelfHandler;
use Monolog\Logger;

class GraylogLogger
{
    public function __invoke(array $config): Logger
    {
        $transport = new UdpTransport(
            $config['host'],
            $config['port'] ?? 12201,
            UdpTransport::CHUNK_SIZE_LAN
        );
        $publisher = new Publisher($transport);
        $handler = new GelfHandler($publisher);

        return new Logger('app', [$handler]);
    }
}

// config/logging.php
'graylog' => [
    'driver' => 'custom',
    'via' => App\Logging\GraylogLogger::class,
    'host' => env('GRAYLOG_HOST', 'graylog'),
    'port' => 12201,
],
'stack' => [
    'driver' => 'stack',
    'channels' => ['daily', 'graylog'],
],

Context fields automatically become fields in Graylog:

Log::error('Payment failed', [
    'user_id' => $user->id,
    'order_id' => $order->id,
    'amount' => $order->amount,
    'provider_error' => $response->error,
]);
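
To verify delivery end to end without the application, you can hand-craft a GELF message and push it over UDP; the GELF input accepts plain uncompressed JSON (netcat and the hostname are assumptions):

echo -n '{"version":"1.1","host":"manual-test","short_message":"GELF smoke test","level":6}' \
  | nc -u -w1 graylog.example.com 12201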

Filebeat for Nginx Logs

# /etc/filebeat/filebeat.yml
filebeat.inputs:
  - type: filestream          # the older "log" input type is deprecated since Filebeat 7.16
    id: nginx-access
    paths: [/var/log/nginx/access.log]
    fields:
      source_type: nginx_access
    fields_under_root: true   # expose source_type as a top-level field, as the pipeline rule below expects

processors:
  - add_fields:
      target: ''
      fields:
        environment: production

output.logstash:
  hosts: ["graylog-server:5044"]

Extractors and Pipelines

Graylog allows parsing fields from messages via Extractors (for individual fields) or Processing Pipelines (for complex logic). Note that pipelines only run on messages in the streams they are connected to (System → Pipelines → manage connections); connecting a pipeline to the default "All messages" stream applies it to everything.

Pipeline for Nginx logs (System → Pipelines):

rule "parse nginx access log"
when
  has_field("source_type") AND to_string($message.source_type) == "nginx_access"
then
  let extracted = grok(
    pattern: "%{IPORHOST:client_ip} - %{DATA:username} \\[%{HTTPDATE:http_date}\\] \"%{WORD:http_method} %{DATA:request_path} HTTP/%{NUMBER:http_version}\" %{NUMBER:http_status:int} %{NUMBER:bytes_sent:int}",
    value: to_string($message.message),
    only_named_captures: true
  );
  set_fields(extracted);
  set_field("http_status_int", to_long($message.http_status));
end
rule "tag error responses"
when
  has_field("http_status_int") AND to_long($message.http_status_int) >= 500
then
  set_field("is_error", true);
  add_tag("http_error");
end
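
To exercise both rules end to end, inject a synthetic nginx line through the GELF input. In GELF, additional fields carry a leading underscore, which Graylog strips, so _source_type arrives as source_type (hostname assumed):

echo -n '{"version":"1.1","host":"test","short_message":"203.0.113.7 - - [12/May/2025:10:00:00 +0000] \"GET /health HTTP/1.1\" 500 42","_source_type":"nginx_access"}' \
  | nc -u -w1 graylog.example.com 12201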

Streams — Log Routing

Streams partition the log flow by categories with different retention policies:

  • All Nginx Access — source_type = nginx_access → retention 30 days
  • Application Errors — level = ERROR or CRITICAL → retention 90 days
  • Security Events — tags contain "security" → retention 180 days

Each stream can have its own Index Set with independent rotation settings.
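
Streams can be scripted as well. A sketch, assuming Graylog 6's REST endpoints and its numeric stream-rule types (here 4 = "smaller than", so level < 4 captures ERROR and CRITICAL in syslog numbering); the index set ID comes from System → Index Sets:

curl -s -u admin:your_admin_password \
  -H 'Content-Type: application/json' -H 'X-Requested-By: setup-script' \
  -X POST http://localhost:9000/api/streams \
  -d '{
        "title": "Application Errors",
        "index_set_id": "<INDEX_SET_ID>",
        "matching_type": "AND",
        "remove_matches_from_default_stream": false,
        "rules": [{"field": "level", "type": 4, "value": "4", "inverted": false}]
      }'
# streams start paused; resume with the ID from the response
curl -s -u admin:your_admin_password -H 'X-Requested-By: setup-script' \
  -X POST http://localhost:9000/api/streams/<STREAM_ID>/resume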

Index Sets — Storage Management

System → Index Sets → Create index set:

Index prefix: app-errors
Max number of indices: 90
Index rotation: Time-based, Daily
Index retention: Delete, max 90 indices
Shards: 2
Replicas: 0 (for single-node)

Alerts

Graylog supports Event Definitions — condition-based alerts:

Alerts → Event Definitions → Create:

Title: High 5xx error rate
Condition: Aggregation
  - Stream: All Nginx Access
  - Group by: (none)
  - Count messages
  - Filter: http_status >= 500
  - Execute every: 5 minutes
  - Condition: count > 50

Notification:
  Type: HTTP Notification
  URL: https://api.telegram.org/bot<TOKEN>/sendMessage
  Body: {"chat_id": "<ID>", "text": "High error rate: ${event.message}"}

Dashboard

In Graylog, dashboards are built from search widgets. Standard set for web applications:

  • Message count (all logs, 24h) — number
  • HTTP status codes (Pie chart, http_status field)
  • Error rate (Line chart, level:ERROR filter, group by time)
  • Top request paths (Table, Top values by request_path)
  • Geographic distribution (Map, if GeoIP enabled)

Timeline

Deploying Graylog + OpenSearch + MongoDB, configuring Inputs, Filebeat for Nginx, GELF logging from application, basic Pipeline rules, Index Sets with retention policy, initial alerts: 1-2 working days.