Using the standard upload process (the upload utility or the iPhone app), GigHive accumulates new media and stores each file's metadata in its database. You can also preload video and audio files into the database from a comma-separated values (CSV) file. This is useful if you have a preexisting media library, say for a band that has been around for a long time. This document explains how GigHive imports CSV data into the MySQL database, including the data transformation pipeline and configuration options, so that if you have such a cache of media files, you can upload them with ease.
GigHive uses a multi-step process to transform a single source CSV file into normalized database tables. This process handles session data, musicians, songs, and media files while maintaining referential integrity.
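As a miniature sketch of that normalization (the column names, delimiter, and splitting logic here are illustrative assumptions, not the real database.csv layout or the actual mysqlPrep scripts):

```python
import csv
import io

# Hypothetical miniature of the source CSV; the real database.csv has many
# more columns, and these names/delimiters are assumptions for illustration.
source = io.StringIO(
    "date,musicians,songs\n"
    "2002-10-24,Alice;Bob,Opener;Closer\n"
)

sessions = []
musicians = {}           # name -> musician_id
session_musicians = []   # (session_id, musician_id) join rows

for session_id, row in enumerate(csv.DictReader(source), start=1):
    sessions.append({"session_id": session_id, "date": row["date"]})
    for name in row["musicians"].split(";"):
        # Each unique musician gets one stable ID (normalization).
        musician_id = musicians.setdefault(name, len(musicians) + 1)
        # The join rows preserve the many-to-many link per session.
        session_musicians.append((session_id, musician_id))

print(sessions)           # one row per session
print(musicians)          # deduplicated musicians with IDs
print(session_musicians)  # join rows, as in session_musicians.csv
```

Songs and files would be handled analogously, each with their own lookup table and join rows, which is how one wide row fans out into several normalized tables while referential integrity is preserved.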
The source CSV file contains the following fields:
Fields marked with an asterisk (*) are mandatory for proper database import.

You have the option to load either the full database or just a sample database. While this option most likely won't be used, know that you have the ability. The import process is controlled by the database_full variable in your inventory group_vars file:
File: $GIGHIVE_HOME/ansible/inventories/group_vars/gighive.yml
# app flavor for build-time overlay; database_full kept as-is
app_flavor: gighive
database_full: false # Set to true for full dataset, false for sample
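The effect of this flag can be made concrete with a small dispatcher (illustrative only; GigHive's Ansible role performs this selection itself):

```python
# Illustrative sketch of the database_full -> script mapping; GigHive's
# Ansible role makes this choice, not this function.
def prep_script(database_full: bool) -> str:
    return "mysqlPrep_full.py" if database_full else "mysqlPrep_sample.py"

print(prep_script(True))   # mysqlPrep_full.py
print(prep_script(False))  # mysqlPrep_sample.py
```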
- database_full: true → uses mysqlPrep_full.py (processes all sessions)
- database_full: false → uses mysqlPrep_sample.py (processes only 2 sessions: 2002-10-24 and 2005-03-03)

Relevant files:

- $GIGHIVE_HOME/ansible/roles/docker/files/mysql/dbScripts/loadutilities/database.csv
- $GIGHIVE_HOME/ansible/roles/docker/files/mysql/dbScripts/loadutilities/mysqlPrep_full.py
- $GIGHIVE_HOME/ansible/roles/docker/files/mysql/dbScripts/loadutilities/mysqlPrep_sample.py
- $GIGHIVE_HOME/ansible/roles/docker/files/mysql/dbScripts/loadutilities/doAllFull.sh
- $GIGHIVE_HOME/ansible/roles/docker/files/mysql/externalConfigs/prepped_csvs/full/
- $GIGHIVE_HOME/ansible/roles/docker/templates/docker-compose.yml.j2
- $GIGHIVE_HOME/ansible/roles/docker/files/docker-compose.yml (on the VM)

The Python scripts transform the single source CSV into multiple normalized tables:
Input: database.csv, a single CSV with columns for sessions, musicians, songs, and files, all in one row per session.
Outputs:

- sessions.csv - session metadata (date, location, crew, etc.)
- musicians.csv - unique musician names with IDs
- session_musicians.csv - many-to-many relationship between sessions and musicians
- songs.csv - unique songs with IDs (⚠️ Critical Fix Applied)
- session_songs.csv - many-to-many relationship between sessions and songs
- files.csv - media files (audio/video) with metadata
- song_files.csv - many-to-many relationship between songs and files
- database_augmented.csv - the original CSV with added columns for reference

# Edit the source CSV
vim $GIGHIVE_HOME/ansible/roles/docker/files/mysql/dbScripts/loadutilities/database.csv
# Edit inventory file
vim $GIGHIVE_HOME/ansible/inventories/group_vars/gighive.yml
# Set database_full: true or false
cd $GIGHIVE_HOME/ansible/roles/docker/files/mysql/dbScripts/loadutilities
# Run the appropriate script based on database_full setting
python3 mysqlPrep_full.py # if database_full: true
# OR
python3 mysqlPrep_sample.py # if database_full: false
# Ansible copies files and restarts containers
ansible-playbook -i inventories/inventory_azure.yml playbooks/site.yml
# On the VM host
cd $GIGHIVE_HOME/ansible/roles/docker/files
./rebuildForDb.sh
The MySQL startup process will automatically consume the import files.
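This works because the official MySQL image executes any *.sql or *.sh files found in /docker-entrypoint-initdb.d when it initializes an empty data directory. A sketch of how the prepped files might be wired in (the service name, image tag, and mount paths are illustrative assumptions, not GigHive's actual docker-compose.yml.j2):

```yaml
# Illustrative compose fragment only; paths and names are assumptions.
services:
  mysql:
    image: mysql:8.0
    volumes:
      - ./mysql/dbScripts:/docker-entrypoint-initdb.d:ro
```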
The entire process is automated through Ansible, which runs the appropriate prep script based on the database_full setting.

The import process creates the following table relationships:

- sessions ↔ musicians (via session_musicians)
- sessions ↔ songs (via session_songs)
- songs ↔ files (via song_files)
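The critical fix flagged for songs.csv, unique song IDs per session, can be sketched in miniature; keying songs by (session, title) rather than by title alone is an assumption about how the scripts implement it:

```python
# Assumption: before the fix, songs were keyed by title alone, so the same
# title played in two sessions shared one song_id and pulled in both
# sessions' files. Keying by (session, title) keeps the IDs distinct.
rows = [
    {"session": "1997-12-30", "title": "Jam"},
    {"session": "2002-10-24", "title": "Jam"},  # same title, later session
]

songs = {}
for r in rows:
    songs.setdefault((r["session"], r["title"]), len(songs) + 1)

print(songs)  # two distinct song_ids despite the shared title
```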

File names are taken from the f_singles column of the source CSV.

After import, verify data integrity by executing the following SQL on the MySQL container:
-- Check for duplicate song associations (should be none after fix)
SELECT f.file_name, s.title, sess.date, sess.title as session_title
FROM files f
JOIN song_files sf ON f.file_id = sf.file_id
JOIN songs s ON sf.song_id = s.song_id
JOIN session_songs ss ON s.song_id = ss.song_id
JOIN sessions sess ON ss.session_id = sess.session_id
WHERE f.file_name LIKE '%19971230_2%'
ORDER BY sess.date;
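The same duplicate check can also be run against the prepped CSVs before they ever reach MySQL; a minimal sketch (the column layout of song_files.csv is an assumption):

```python
import csv
import io
from collections import Counter

# Hypothetical song_files.csv content; real column names may differ.
song_files = io.StringIO("song_id,file_id\n1,10\n2,11\n2,11\n")

# Any (song_id, file_id) pair appearing more than once is a duplicate
# association that the fix should have eliminated.
pairs = Counter(tuple(row) for row in list(csv.reader(song_files))[1:])
dupes = [pair for pair, count in pairs.items() if count > 1]
print(dupes)  # [('2', '11')] for this deliberately broken sample
```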
- Database backups are stored in $GIGHIVE_HOME/ansible/roles/docker/files/mysql/dbScripts/backups
- Test with the sample dataset (database_full: false) before a full import
- Media files live under $GIGHIVE_HOME/assets/video and $GIGHIVE_HOME/assets/audio

This documentation reflects the current state after applying the critical fix for unique song IDs per session.