Database Load Methods

This document describes the two database (re)load mechanisms used by GigHive and how they differ.

Overview

GigHive currently supports two practical ways to wipe and repopulate the MySQL database with media data:

  1. Admin-triggered runtime import (web admin UI: import_database.php, import_normalized.php)
  2. MySQL initialization load at container bootstrap via /docker-entrypoint-initdb.d/ (advanced)

Both approaches ultimately load per-table CSVs into MySQL, but they differ in when they run, where data lives during load, and how much infrastructure is involved.


Method 1: Admin-triggered runtime import (web admin, Option A)

Where it lives

When it runs

How preprocessing works


For Bands/Event Planners. Admin UI Section 3A (Legacy): Upload Single CSV and Reload Database (destructive)

Minimum required headers in the uploaded source CSV

The uploaded source CSV must include a header row. The admin UI performs a preflight check before starting an import.

Minimum required headers:

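The preflight check can be pictured as a simple header comparison. The sketch below is illustrative only: the header names used here are hypothetical placeholders, and the authoritative required-header list is the one enforced by the admin UI.

```python
import csv
import io

# Hypothetical required headers -- placeholders only; the admin UI's
# preflight check defines the real list.
REQUIRED_HEADERS = {"session_date", "file_name"}

def preflight_headers(csv_text: str, required: set[str]) -> list[str]:
    """Return the required headers missing from the CSV's first (header) row."""
    reader = csv.reader(io.StringIO(csv_text))
    try:
        header_row = next(reader)
    except StopIteration:
        return sorted(required)  # empty upload: everything is missing
    present = {h.strip() for h in header_row}
    return sorted(required - present)

# An empty result means the upload passes this check.
missing = preflight_headers("session_date,file_name,notes\n2025-12-24,a.mp3,\n",
                            REQUIRED_HEADERS)
```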

For Bands/Event Planners. Admin UI Section 3B (Normalized): Upload Sessions + Session Files and Reload Database (destructive)

Minimum required headers in sessions.csv

Minimum required headers in session_files.csv

Where uploads and job artifacts are stored

Each admin-triggered import creates a unique job directory under:

This includes Section 3B imports, which stage the uploaded sessions.csv and session_files.csv into the same job directory before loading.

The uploaded CSV is saved as:

The preprocessing script writes generated per-table CSVs to:

These paths are under /var/www/private/ and are not served publicly by Apache.

Example directory structure:

root@0f034bf13860:/var/www/private/import_jobs# ll
total 12
drwxr-xr-x 3 www-data www-data 4096 Dec 24 19:19 ./
drwxrwxr-x 1 www-data www-data 4096 Dec 24 19:19 ../
drwxr-xr-x 3 www-data www-data 4096 Dec 24 19:20 20251224-191954-377d73b02838/

root@0f034bf13860:/var/www/private/import_jobs# ll 20251224-191954-377d73b02838/
total 156
drwxr-xr-x 3 www-data www-data  4096 Dec 24 19:20 ./
drwxr-xr-x 3 www-data www-data  4096 Dec 24 19:19 ../
-rw-r--r-- 1 www-data www-data  4158 Dec 24 19:20 import_local.sql
drwxr-xr-x 2 www-data www-data  4096 Dec 24 19:19 prepped_csvs/
-rw-r--r-- 1 www-data www-data 92908 Dec 24 19:19 session_files.csv
-rw-r--r-- 1 www-data www-data 41127 Dec 24 19:19 sessions.csv

root@0f034bf13860:/var/www/private/import_jobs# ll 20251224-191954-377d73b02838/prepped_csvs/
total 212
drwxr-xr-x 2 www-data www-data   4096 Dec 24 19:19 ./
drwxr-xr-x 3 www-data www-data   4096 Dec 24 19:20 ../
-rw-r--r-- 1 www-data www-data 122888 Dec 24 19:19 files.csv
-rw-r--r-- 1 www-data www-data    437 Dec 24 19:19 musicians.csv
-rw-r--r-- 1 www-data www-data   4801 Dec 24 19:19 session_musicians.csv
-rw-r--r-- 1 www-data www-data   6170 Dec 24 19:19 session_songs.csv
-rw-r--r-- 1 www-data www-data  24818 Dec 24 19:19 sessions.csv
-rw-r--r-- 1 www-data www-data   5692 Dec 24 19:19 song_files.csv
-rw-r--r-- 1 www-data www-data  20756 Dec 24 19:19 songs.csv

files.source_relpath population behavior

GigHive stores an optional files.source_relpath value for traceability back to the original folder structure.
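One plausible way such a value is derived is as the file's path relative to the scan root. This is a sketch of that idea only, not the actual implementation:

```python
from pathlib import PurePosixPath

# Illustrative only: derive a source_relpath-style value from a scanned
# file path and the chosen scan root.
def source_relpath(file_path: str, scan_root: str) -> str:
    return str(PurePosixPath(file_path).relative_to(scan_root))

source_relpath("/media/library/2024/jam01/take1.wav", "/media/library")
# -> "2024/jam01/take1.wav"
```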

How data is loaded

This relies on two independent settings:

With LOCAL INFILE, the CSV files do not need to be mounted into the MySQL container. The mysql client streams the file contents over the connection.
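The mechanics can be sketched as follows. The statement and client invocation below are assumptions about the general LOAD DATA LOCAL INFILE pattern, not the exact SQL GigHive emits; field/line options and table names are illustrative.

```python
# LOCAL INFILE path: the mysql client reads the CSV from the Apache
# container's filesystem and streams it over the connection, so no
# shared mount into the MySQL container is needed.
def load_local_sql(table: str, csv_path: str) -> str:
    return (
        f"LOAD DATA LOCAL INFILE '{csv_path}' "
        f"INTO TABLE {table} "
        "FIELDS TERMINATED BY ',' OPTIONALLY ENCLOSED BY '\"' "
        "LINES TERMINATED BY '\\n' IGNORE 1 LINES;"
    )

def mysql_argv(database: str) -> list[str]:
    # --local-infile=1 enables the client side; the server must also
    # have local_infile=ON or the statement is rejected.
    return ["mysql", "--local-infile=1", database]
```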

Wipe + reseed behavior

The runtime import will:

Pros

Cons


Summary Table

Aspect                           | MySQL initialization load (advanced)                      | Admin-triggered runtime import (Option A)
Trigger                          | MySQL entrypoint on fresh datadir                         | Admin action in web UI
Best for                         | First install / provisioning                              | Ongoing admin reloads
Data file location               | Must be readable by MySQL server (/var/lib/mysql-files/)  | Can remain on Apache container; streamed via LOCAL
Load statement                   | LOAD DATA INFILE                                          | LOAD DATA LOCAL INFILE
Requires shared mount into MySQL | Typically yes                                             | No
Requires local_infile            | No                                                        | Yes (server ON + client --local-infile=1)

Operational Notes


For Media Librarians. Admin UI Section 4: Folder Scan → Import-Ready CSV (destructive)

What was added

admin.php now includes a folder scan/import section titled:

Choose a Folder to Scan & Update the Database

This section includes:

What it does

What Section 4 actually does (implementation details)

How to Scan your files and upload using Section 4

1) Prereqs (quick verification)

2) Run the UI flow

3) Execute the import

4) Validate results

5) If it fails (what to capture)
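The scan step of the flow above can be sketched as a recursive walk that emits import-ready rows. The column names and media extensions here are assumptions for illustration; the real import-ready CSV schema is defined by the admin UI.

```python
import csv
from pathlib import Path

# Hypothetical set of media extensions to include in the scan.
MEDIA_SUFFIXES = {".mp3", ".wav", ".mp4", ".flac"}

def scan_to_rows(root: Path) -> list[dict]:
    """Walk the chosen folder and collect one row per media file."""
    rows = []
    for path in sorted(root.rglob("*")):
        if path.is_file() and path.suffix.lower() in MEDIA_SUFFIXES:
            rows.append({
                "file_name": path.name,
                "source_relpath": str(path.relative_to(root)),
                "size_bytes": path.stat().st_size,
            })
    return rows

def write_import_csv(rows: list[dict], out_path: Path) -> None:
    """Write the scanned rows as an import-ready CSV with a header row."""
    with out_path.open("w", newline="") as fh:
        writer = csv.DictWriter(
            fh, fieldnames=["file_name", "source_relpath", "size_bytes"])
        writer.writeheader()
        writer.writerows(rows)
```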


For Media Librarians. Admin UI Section 5: Folder Scan → Add to the Database (non-destructive)

Section 5 is intended for building a long-term media library without wiping existing data.

Endpoint and payload

Each item includes:


For Advanced Users

MySQL initialization load (container bootstrap)

Where it lives

When it runs

If the MySQL container is restarted with an already-initialized datadir, the /docker-entrypoint-initdb.d/ scripts are typically not re-run.

How data is loaded
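As a hedged sketch of the init-time path: a seed script placed in /docker-entrypoint-initdb.d/ would use server-side LOAD DATA INFILE, which requires the CSVs to be readable by the MySQL server itself (e.g. under /var/lib/mysql-files/). The generator below is illustrative; its table list mirrors the per-table CSVs shown earlier, but the field options are assumptions.

```python
# Per-table CSVs as seen in prepped_csvs/ earlier in this document.
TABLES = ["musicians", "sessions", "songs", "files",
          "session_musicians", "session_songs", "song_files"]

def init_sql(csv_dir: str = "/var/lib/mysql-files") -> str:
    """Generate one server-side LOAD DATA INFILE statement per table."""
    stmts = []
    for table in TABLES:
        stmts.append(
            f"LOAD DATA INFILE '{csv_dir}/{table}.csv' INTO TABLE {table} "
            "FIELDS TERMINATED BY ',' OPTIONALLY ENCLOSED BY '\"' "
            "IGNORE 1 LINES;"
        )
    return "\n".join(stmts)
```

Note the contrast with the runtime import: no LOCAL keyword, so local_infile is not needed, but the files must live on the MySQL side of a shared mount.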

Pros

Cons