Ticket #12312: Segmentation fault migrating very large table

Reporter: Ryan J Ollos
Owner: Jun Omae
Type: defect
Status: closed
Priority: normal
Component: TracMigratePlugin
Severity: normal
Resolution: fixed

Description:

I was attempting to migrate an SQLite database to PostgreSQL last evening (in-place migration). The SQLite database is 4 GB on disk. The migration halts on the `bitten_report_item` table a little under a minute after it finishes migrating the previous table (`bitten_report`), but before reporting that any rows have been migrated. `bitten_report_item` is the 6th or 7th table to be migrated. The only reported error is //Segmentation Fault// (I don't have the exact output at the moment, but I'll try to post it later). I guess it halts somewhere around: [browser:/tracmigrateplugin/0.12/tracmigrate/admin.py@14462:155-163#L123].

I don't have direct access to the database. I've been advising someone on the work remotely, writing the migration scripts and observing execution over video conference. The data in the database is proprietary, so I can't have the entire database transferred to me, and I don't have direct remote access due to the firewall. That's just a lame explanation of why I don't have a better bug report, and it might limit my ability to reproduce things. I'm going to try to reproduce the situation by creating a database of similar size with contrived data.

I imagine more information is needed, but I wanted to open the ticket ahead of time to see if you had an idea of what the failure mode might be with a very large table, assuming that is the cause. Should the migration be significantly affected by the amount of memory in the system? The server has 8 GB, but we could try the migration on a workstation with more memory.
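On the memory question raised in the description: whether a large table can exhaust RAM during migration usually comes down to whether rows are fetched all at once or in bounded batches. Below is a minimal sketch of batched table copying using `cursor.fetchmany()`, which keeps at most one batch of rows in memory at a time. This is not the plugin's actual code; the schema and the `copy_table_in_batches` helper are illustrative, and both connections here are SQLite purely for a self-contained demo (the real target would be PostgreSQL).

```python
import sqlite3

def copy_table_in_batches(src, dst, table, columns, batch_size=1000):
    """Copy rows from src to dst in fixed-size batches.

    fetchmany(batch_size) bounds memory use: at most batch_size rows
    are held in Python at once, instead of materializing the table.
    """
    col_list = ", ".join(columns)
    placeholders = ", ".join("?" for _ in columns)
    cur = src.execute("SELECT %s FROM %s" % (col_list, table))
    while True:
        rows = cur.fetchmany(batch_size)
        if not rows:
            break
        dst.executemany(
            "INSERT INTO %s (%s) VALUES (%s)" % (table, col_list, placeholders),
            rows,
        )
    dst.commit()

# Demo with two in-memory SQLite databases and a contrived schema
# loosely named after the table from the report.
src = sqlite3.connect(":memory:")
dst = sqlite3.connect(":memory:")
ddl = ("CREATE TABLE bitten_report_item "
       "(report INTEGER, item INTEGER, name TEXT, value TEXT)")
src.execute(ddl)
dst.execute(ddl)
src.executemany(
    "INSERT INTO bitten_report_item VALUES (?, ?, ?, ?)",
    [(i, i, "name", "value") for i in range(5000)],
)
copy_table_in_batches(
    src, dst, "bitten_report_item",
    ["report", "item", "name", "value"], batch_size=500,
)
print(dst.execute("SELECT COUNT(*) FROM bitten_report_item").fetchone()[0])
# → 5000
```

If the crash is memory-related, a pattern like this (smaller fetch windows) would make the migration's footprint independent of table size; if the segfault persists even with small batches, the cause is more likely in the database driver itself.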