Implementation of Logging and Reporting for Product Import Results
Without logging, an import is a black box: something goes wrong overnight, the manager sees catalog discrepancies in the morning, and nobody can tell why. Good logging captures every step, and the report gives a definitive answer: what came in, what changed, and what broke.
What Needs to Be Logged
At the import run level:
- Start and completion time
- Data source, type, file URL/path
- Final counters: created / updated / skipped / errors
- Status: success / partial / failed
- User who initiated the import (if manual)
At the individual row level:
- Row number / SKU
- Operation type: create / update / skip / error
- Fields that changed (diff)
- Error message (if any)
Database Schema for Logs
CREATE TABLE import_runs (
id serial PRIMARY KEY,
source_id int REFERENCES import_sources(id),
status varchar(20) DEFAULT 'pending', -- pending | processing | success | partial | failed
trigger varchar(20) DEFAULT 'scheduled', -- scheduled | manual | webhook
triggered_by int REFERENCES users(id),
file_name varchar(500),
file_size bigint,
total_rows int DEFAULT 0,
created_count int DEFAULT 0,
updated_count int DEFAULT 0,
skipped_count int DEFAULT 0,
errors_count int DEFAULT 0,
started_at timestamptz,
completed_at timestamptz,
duration_ms int,
error_message text,
created_at timestamptz DEFAULT now()
);
CREATE TABLE import_row_logs (
id bigserial PRIMARY KEY,
import_id int REFERENCES import_runs(id) ON DELETE CASCADE,
line_number int,
sku varchar(100),
operation varchar(10), -- create | update | skip | error
changed_fields jsonb, -- {"price": {"old": 100, "new": 120}}
error_code varchar(50),
error_msg text,
created_at timestamptz DEFAULT now()
);
CREATE INDEX import_row_logs_import_id_idx ON import_row_logs (import_id);
CREATE INDEX import_row_logs_sku_idx ON import_row_logs (sku);
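With this schema in place, ad-hoc questions can be answered in plain SQL. A sketch of one such query against the tables defined above, listing the last ten runs with their error rate:

```sql
-- Last 10 runs with their error rate (uses the tables defined above)
SELECT id,
       status,
       total_rows,
       errors_count,
       CASE WHEN total_rows > 0
            THEN round(100.0 * errors_count / total_rows, 1)
            ELSE 0 END AS error_pct,
       completed_at
FROM import_runs
ORDER BY started_at DESC
LIMIT 10;
```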
Import Logger
class ImportLogger
{
private ImportRun $run;
private array $rowBuffer = [];
private int $bufferSize = 500;
public function start(int $sourceId, string $trigger, ?int $userId): void
{
$this->run = ImportRun::create([
'source_id' => $sourceId,
'status' => 'processing',
'trigger' => $trigger,
'triggered_by' => $userId,
'started_at' => now(),
]);
}
public function logRow(
int $line,
string $sku,
string $operation,
array $changedFields = [],
?string $errorMsg = null,
?string $errorCode = null
): void {
$this->rowBuffer[] = [
'import_id' => $this->run->id,
'line_number' => $line,
'sku' => $sku,
'operation' => $operation,
'changed_fields'=> $changedFields ? json_encode($changedFields) : null,
'error_code' => $errorCode,
'error_msg' => $errorMsg,
'created_at' => now()->toDateTimeString(),
];
if (count($this->rowBuffer) >= $this->bufferSize) {
$this->flush();
}
}
public function finish(string $status, ?string $errorMessage = null): void
{
$this->flush();
$counts = DB::table('import_row_logs')
->where('import_id', $this->run->id)
->selectRaw("
SUM(CASE WHEN operation = 'create' THEN 1 ELSE 0 END) AS created,
SUM(CASE WHEN operation = 'update' THEN 1 ELSE 0 END) AS updated,
SUM(CASE WHEN operation = 'skip' THEN 1 ELSE 0 END) AS skipped,
SUM(CASE WHEN operation = 'error' THEN 1 ELSE 0 END) AS errors
")
->first();
$this->run->update([
'status' => $status,
'total_rows' => (int) $counts->created + (int) $counts->updated + (int) $counts->skipped + (int) $counts->errors,
'created_count' => (int) $counts->created, // SUM() returns NULL for a run with no rows, hence the casts
'updated_count' => (int) $counts->updated,
'skipped_count' => (int) $counts->skipped,
'errors_count' => (int) $counts->errors,
'completed_at' => now(),
'duration_ms' => $this->run->started_at->diffInMilliseconds(now()),
'error_message' => $errorMessage,
]);
}
private function flush(): void
{
if (!empty($this->rowBuffer)) {
DB::table('import_row_logs')->insert($this->rowBuffer);
$this->rowBuffer = [];
}
}
}
Rows are buffered in batches of 500 and written with a single multi-row INSERT, instead of issuing one INSERT per row.
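Inside an import job's `handle()` method, the logger might be wired like this (a sketch: `$rows`, `$sourceId`, and `processRow()` are illustrative names, not part of the classes above):

```php
$logger = new ImportLogger();
$logger->start($sourceId, 'manual', $userId);
$hadErrors = false;

try {
    foreach ($rows as $i => $row) {
        try {
            // processRow() returns ['create'|'update'|'skip', diff] (illustrative)
            [$operation, $diff] = $this->processRow($row);
            $logger->logRow($i + 1, $row['sku'], $operation, $diff);
        } catch (Throwable $e) {
            $hadErrors = true;
            $logger->logRow($i + 1, $row['sku'] ?? '', 'error', [], $e->getMessage());
        }
    }
    $logger->finish($hadErrors ? 'partial' : 'success');
} catch (Throwable $e) {
    // Fatal failure (unreadable file, lost DB connection): mark the whole run failed
    $logger->finish('failed', $e->getMessage());
    throw $e;
}
```

Per-row errors are logged and the loop continues; only a failure outside the row loop marks the run as failed.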
Capturing diff of Changed Fields
private function buildDiff(Product $existing, array $newData): array
{
$trackFields = ['price', 'qty', 'name', 'description'];
$diff = [];
foreach ($trackFields as $field) {
$old = $existing->{$field};
$new = $newData[$field] ?? null;
// String-cast comparison avoids false diffs like 100 vs "100",
// but note it also treats null and '' as equal
if ((string) $old !== (string) $new) {
$diff[$field] = ['old' => $old, 'new' => $new];
}
}
return $diff;
}
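The diff doubles as the skip/update decision: an empty diff means nothing tracked has changed. A sketch of the update branch, assuming `$product`, `$row`, and `$line` come from the surrounding import loop:

```php
$diff = $this->buildDiff($product, $row);

if (empty($diff)) {
    // Nothing tracked changed: record a skip and avoid a pointless UPDATE
    $logger->logRow($line, $row['sku'], 'skip');
} else {
    // Write only the fields that actually changed
    $product->update(array_intersect_key($row, $diff));
    $logger->logRow($line, $row['sku'], 'update', $diff);
}
```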
Aggregated Report
class ImportReportBuilder
{
public function build(ImportRun $run): ImportReport
{
$topErrors = DB::table('import_row_logs')
->where('import_id', $run->id)
->where('operation', 'error')
->select('error_code', DB::raw('COUNT(*) as count'), DB::raw('MIN(sku) as example_sku'))
->groupBy('error_code')
->orderByDesc('count')
->limit(10)
->get();
$priceChanges = DB::table('import_row_logs')
->where('import_id', $run->id)
->where('operation', 'update')
->whereRaw("jsonb_exists(changed_fields, 'price')") // native jsonb ? operator collides with PDO placeholders
->count();
return new ImportReport(
run: $run,
topErrors: $topErrors,
priceChangesCount: $priceChanges,
);
}
}
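`ImportReport` itself can be a plain value object; one possible shape (an assumption, the class is not defined above; `Collection` is `Illuminate\Support\Collection`):

```php
final class ImportReport
{
    public function __construct(
        public readonly ImportRun $run,
        public readonly Collection $topErrors,  // rows of {error_code, count, example_sku}
        public readonly int $priceChangesCount,
    ) {}
}
```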
Notifications on Results
class ImportCompletedNotification extends Notification
{
public function __construct(private ImportRun $run) {}
public function toMail(mixed $notifiable): MailMessage
{
$run = $this->run;
return (new MailMessage)
->subject("Import #{$run->id}: {$run->status}")
->line("Source: {$run->source->name}")
->line("Created: {$run->created_count}, updated: {$run->updated_count}, errors: {$run->errors_count}")
->line("Duration: " . round($run->duration_ms / 1000, 1) . " sec")
->when($run->errors_count > 0, fn($m) => $m->action('View Errors', $this->reportUrl()));
}
}
The notification is sent only when the run status is not success or the error count exceeds a configured threshold.
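That condition can live right after `finish()`; a sketch, where the config key and the `source->owner` relation are assumptions:

```php
// Notify only on failures or when errors exceed a threshold
// ('import.error_notify_threshold' is an assumed config key)
$threshold = config('import.error_notify_threshold', 0);

if ($run->status !== 'success' || $run->errors_count > $threshold) {
    $run->source->owner->notify(new ImportCompletedNotification($run));
}
```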
Log Rotation
Row-by-row logs grow quickly. Retention policy:
// Artisan command in scheduler
$schedule->command('import:cleanup-logs --older-than=30')->weekly();
class CleanupImportLogsCommand extends Command
{
protected $signature = 'import:cleanup-logs {--older-than=30}';
public function handle(): void
{
$cutoff = now()->subDays((int) $this->option('older-than'));
// Delete row logs from old successful imports
ImportRowLog::whereHas('run', fn($q) =>
$q->where('status', 'success')->where('completed_at', '<', $cutoff)
)->delete();
// Keep summary import_runs records forever (they're small)
}
}
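`whereHas('run', …)` assumes a `run` relation on the row-log model; a minimal definition, if it is not already in place:

```php
class ImportRowLog extends Model
{
    public $timestamps = false; // created_at is set explicitly on insert

    public function run(): BelongsTo
    {
        return $this->belongsTo(ImportRun::class, 'import_id');
    }
}
```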
Timeline
- ImportLogger with buffering, database tables, final counters — 1 day
- Field diffs, aggregated report, notifications — +0.5 day
- UI for viewing logs in admin panel, rotation of old records — +0.5 day