
Modernizing Qwtly's Database, Part 1

It's been years since I last updated Qwtly, and I was given the opportunity to play around and modernize the database for my application. I wanted to try MongoDB's cloud offering, Atlas, since it's free for a small database.

With this in mind, and considering that Qwtly doesn't get traffic now that I have disabled the add, edit, and delete quote functions along with the login, I don't see the application ever reaching that 5GB limit anyway. Well, that is assuming I can even get this to work.

The first order of business was to see if I could import the MySQL export into MongoDB Atlas painlessly. I searched for MongoDB tools, external tools, and scripts, only to find old, abandoned projects, which would not be ideal given my situation. I considered writing a PHP script to do it, but that too would cost time. I was looking for something built on existing tools or features I am already familiar with, plus some manual eyeballing and checking. Luckily, I came across this post from more than a decade ago!

The basic concept is to export your MySQL data to CSV and then feed it to MongoDB via the mongoimport command.
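The export itself can be done with whatever tool you already use; as a minimal sketch, assuming the quotes live in a tbl_quotes table, the mysql client can write the CSV directly (note that INTO OUTFILE does not write the header row that mongoimport's --headerline expects, so that line has to be added by hand or the export done through a GUI tool instead):

-- the output path is just an example; adjust to your setup
SELECT * FROM tbl_quotes
INTO OUTFILE '/tmp/quotes.csv'
FIELDS TERMINATED BY ',' OPTIONALLY ENCLOSED BY '"'
LINES TERMINATED BY '\n';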

After my first attempt, I was getting trash data. It was getting imported, but for some reason the JSON did not resemble anything close to the original schema. Apparently, I was exporting as semicolon-delimited. After fixing that, the import mostly worked, with some minor issues: newlines and quotation marks were messing up the imports.


A little Googling and a nice little query gave me an idea of whether I could clean these up by hand:

SELECT * FROM tbl_quotes WHERE quote_body REGEXP "\n";

Lo and behold: "7 rows"

I cleaned the quote bodies by hand and replaced the newlines with \n in the hope that the import would keep the \n (update: it did not).
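Seven rows were easy enough to fix by hand, but for a bigger table something like this REPLACE-based UPDATE could do the same cleanup in one pass (just a sketch reusing the table and column names from the query above; it strips carriage returns and turns newlines into spaces):

-- strip \r and swap \n for a space in the affected rows
UPDATE tbl_quotes
SET quote_body = REPLACE(REPLACE(quote_body, '\r', ''), '\n', ' ')
WHERE quote_body REGEXP "\n";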

Another issue was the user website URL column and the user description column, which were riddled with newlines and rogue quotation marks. I just NULL-ed them all out by hand since the table only had 13 entries.
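Doing the same thing in one statement would look something like this (the table and column names here are stand-ins, since I haven't listed the real users schema in this post):

-- tbl_users, user_website, and user_description are placeholder names
UPDATE tbl_users SET user_website = NULL, user_description = NULL;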

I exported the tables and imported them via these commands:

mongoimport --uri mongodb+srv://<user>:<password>@<endpoint>.mongodb.net/qwtly --collection=<user_collection_name> --headerline --ignoreBlanks --type=csv --file=users.csv

mongoimport --uri mongodb+srv://<user>:<password>@<endpoint>.mongodb.net/qwtly --collection=<quotes_collection_name> --headerline --ignoreBlanks --type=csv --file=quotes.csv

I opted to use --ignoreBlanks since it really defeats the purpose of using a document store if you just fill it up with NULLs. We are trying to avoid NULL hell in the first place; that's why we are moving to MongoDB.
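With --ignoreBlanks, a blank CSV field simply never becomes a field on the resulting document, which is easy to verify afterwards in mongosh (the collection and field names below are assumptions based on my setup):

// users and user_website are assumed names; blank CSV cells end up as missing fields
db.users.find({ user_website: { $exists: false } })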

After importing, I guess I am happy with the result. The number of records imported and converted into documents matches the original count on the website.
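A quick sanity check is to compare the counts on both sides (the collection name below is whatever was passed to --collection; quotes is just an example):

-- on the MySQL side
SELECT COUNT(*) FROM tbl_quotes;

// on the Atlas side, in mongosh
db.quotes.countDocuments()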



While my old MySQL schema wasn't really that complex, and since the users live in their own table, the export-and-import carried the information over easily. I assume that more complex databases would need better field association, but then again, there are paid tools for that if it's needed at the enterprise level.
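Since the foreign key column comes along as a plain field on each document, the two collections can still be correlated with an aggregation, roughly like this (users as the collection name and user_id as the key are assumptions carried over from the MySQL schema):

// user_id is assumed to be the foreign key column imported from MySQL
db.quotes.aggregate([
  { $lookup: { from: "users", localField: "user_id", foreignField: "user_id", as: "author" } }
])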

Data looks nice. 


TL;DR

If you have a simple enough database, you might be able to get away with a CSV export and a mongoimport run to convert your database. The generic steps are:
  1. Sanitize your data: clean, convert, or edit anything that would break the import, such as rogue newlines and quotation marks
  2. Export to CSV
  3. Import via mongoimport

Up next in part 2 will be creating sample PHP code for create, read, update, and delete (CRUD) operations. I am also in the process of checking whether there are CodeIgniter libraries I can use to interface with my data in Atlas.
