It's been years since I last updated Qwtly, and I was given the opportunity to play around with and modernize the database for my application. I wanted to try MongoDB's cloud offering, Atlas, since it's free for a small database.
With this in mind, and considering that Qwtly gets no traffic now that I have disabled the add, edit, and delete quote functions along with the login, I don't see the application reaching the free tier's storage limit anyway. Well, that is assuming I can even get this to work.
The first order of business was to see if I could import the MySQL export into MongoDB Atlas painlessly. I searched for MongoDB tools, external tools, and scripts, only to find old, abandoned projects, which would not be ideal in my situation. I considered writing a PHP script to do it, but that too would cost time. I was looking for something built on existing tools or features I am familiar with, plus some manual eyeballing and checking. Luckily, I came across this post from more than a decade ago!
The basic concept is to export your MySQL data to CSV and then feed it to MongoDB via the mongoimport command.
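On the MySQL side, the CSV export can be done with SELECT ... INTO OUTFILE. Here is a minimal sketch for the quotes table; the column names are illustrative and the MySQL user needs the FILE privilege. The UNION prepends a header row, since INTO OUTFILE does not write one and mongoimport's --headerline expects it:

-- Sketch: export tbl_quotes as comma-delimited CSV with a header row.
-- Column names are illustrative; adjust to your schema. Embedded
-- quotes and newlines in the data still need sanitizing first.
SELECT 'id', 'user_id', 'quote_body'
UNION ALL
SELECT id, user_id, quote_body
FROM tbl_quotes
INTO OUTFILE '/tmp/quotes.csv'
FIELDS TERMINATED BY ',' OPTIONALLY ENCLOSED BY '"'
LINES TERMINATED BY '\n';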
After my first attempt, I was getting trash data. It was getting imported, but for some reason the JSON did not even resemble anything close to the original schema. Apparently, I was exporting as semicolon-delimited. After fixing that, it was importing with some minor issues: newlines and quotation marks were messing up the import.
SELECT * FROM tbl_quotes WHERE quote_body REGEXP "\n";
Lo and behold: "7 rows"
I cleaned the quote bodies by hand and replaced the newlines with \n in the hopes that the import would keep the \n (update: it did not).
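In hindsight, the hand-editing could be replaced with a one-shot UPDATE along these lines, a sketch reusing the tbl_quotes and quote_body names from the query above (run it on a copy of the table first):

-- Strip carriage returns and replace literal newlines with a
-- visible \n marker before exporting. Work on a copy/backup first.
UPDATE tbl_quotes
SET quote_body = REPLACE(REPLACE(quote_body, '\r', ''), '\n', '\\n')
WHERE quote_body REGEXP "\n";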
Another issue was the user website URL and user description columns, which were riddled with newlines and rogue quotation marks. I just NULL-ed them all out by hand, since the table only had 13 entries.
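The same could be done in one statement; in this sketch, tbl_users and the column names are assumptions based on the naming above:

-- Sketch: NULL out the messy columns in one pass.
-- tbl_users, website, and description are assumed names.
UPDATE tbl_users
SET website = NULL,
    description = NULL;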
I exported the tables and imported them via these commands:
mongoimport --uri mongodb+srv://<user>:<password>@<endpoint>.mongodb.net/qwtly --collection=<user_collection_name> --headerline --ignoreBlanks --type=csv --file=users.csv
mongoimport --uri mongodb+srv://<user>:<password>@<endpoint>.mongodb.net/qwtly --collection=<quotes_collection_name> --headerline --ignoreBlanks --type=csv --file=quotes.csv
I opted to use --ignoreBlanks since it really defeats the purpose of using a document store if you just fill it up with NULLs. Avoiding NULL hell is part of why we are moving to MongoDB in the first place.
After importing, I guess I am happy with the result. The number of records imported and converted to documents matches the original number on the website.
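A quick way to double-check is comparing the counts mongoimport reports against plain row counts on the MySQL side:

-- Compare these against the document counts reported by mongoimport
-- (or shown in the Atlas collections view).
SELECT COUNT(*) FROM tbl_quotes;
SELECT COUNT(*) FROM tbl_users;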
While my old MySQL schema wasn't really that complex, and since I keep users in a separate table, it was easy for the export/import to correlate the information. I assume more complex databases would need better field association, but then again, there are paid tools for that if it's needed at the enterprise level.
Data looks nice.
TLDR;
If you have a simple enough database, you might be able to get away with a simple CSV export and a mongoimport to convert your database. The generic steps include:
- Sanitize your data: clean, convert, or edit anything that would mess up the import, like rogue newlines and quotation marks
- Export to CSV
- Import via mongoimport
Up next in part 2: sample PHP code for create, read, update, and delete (CRUD) operations. I am also checking whether there are CodeIgniter libraries I can use to interface with my data in Atlas.