MySQL Database Shadowing

Hi all,

Soon to be PythonAnywhere customer here, I have a question about the database options.

I am running a MySQL database to power my website, and I saw that you can create multiple databases. Would it be possible to set up an additional database and have it "shadow" the main one?

So every time a change is made in the main database, that change would also happen in the "shadow" one. However, any changes to the shadow database wouldn't be carried back to the main one.

The reason for this is that I would like to connect my dev environment to the PA database so that I always have the most up-to-date version; however, I think connecting directly to the production database is a bad idea, for obvious reasons.

Do any of you know if something like this is possible, or if there is a viable alternative?

Many thanks!

We don't have anything like that built in, but it's an interesting idea!

One thing you could do to get the effect you want would simply be to copy the production database to the shadow one manually. Which web framework are you using? I ask because Django has some great tools to do that.

@giles Thanks! Looks like I'm in luck: I'm using Django with Wagtail CMS. Would you be able to link me to some resources on those tools you mentioned so that I can look into it? Hopefully it's not too complicated, I know close to nothing when it comes to databases.

Great! So the tools you need are dumpdata and loaddata, which respectively export your data as JSON and load a JSON file back into a database. Both are run using Django's manage.py.

The most important thing to make sure of -- and I can't emphasize this enough -- is that you should use dumpdata when running manage.py in a code tree whose settings point to your live database, and loaddata in a different tree pointing to your dev database. It's the kind of thing that's easy to get the wrong way round, and of course the results of doing it the other way could be disastrous...
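To make the direction of the copy concrete, here's a minimal sketch. The tree paths and the helper name are hypothetical, not anything PythonAnywhere-specific -- adjust them to your own layout:

```python
import subprocess
import sys
from pathlib import Path

def copy_prod_to_dev(prod_tree, dev_tree, dump_file="/tmp/prod_dump.json"):
    """Dump the LIVE database as JSON, then load it into the dev one.

    prod_tree: code tree whose settings point at the live database.
    dev_tree:  code tree whose settings point at the dev database.
    """
    # dumpdata reads from whatever database prod_tree's settings name
    with open(dump_file, "w") as out:
        subprocess.run(
            [sys.executable, str(Path(prod_tree) / "manage.py"),
             "dumpdata", "--indent", "2"],
            stdout=out, check=True,  # check=True: raise if the dump fails
        )
    # loaddata writes into dev_tree's database, so production is never touched
    subprocess.run(
        [sys.executable, str(Path(dev_tree) / "manage.py"),
         "loaddata", dump_file],
        check=True,
    )
```

Getting the two arguments the wrong way round here is exactly the disaster scenario described above, so it's worth double-checking which settings file each tree uses before running it.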

Do you think I could write and schedule a script to automate the dumping and loading of this data?

Sure! Of course, you'd need to be careful -- but you could write a simple bash script to do it and then set that up on the "Schedule" page.
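As a sketch of what such a scheduled task might look like -- all paths are hypothetical, and I've written it in Python here, though a two-line bash script running the same manage.py commands would do equally well:

```python
#!/usr/bin/env python3
"""Copy the production database into the dev one, via a scheduled task.

All paths below are hypothetical; adjust them to your own project layout.
"""
import datetime
import subprocess
import sys
from pathlib import Path

PROD_TREE = Path.home() / "production" / "mysite"  # settings -> LIVE database
DEV_TREE = Path.home() / "dev" / "mysite"          # settings -> dev database

def main():
    # Date-stamped dump file, so one bad run can't clobber yesterday's copy
    dump_file = Path("/tmp") / f"prod_dump_{datetime.date.today():%Y-%m-%d}.json"

    # Step 1: dump the live data (the direction of the copy matters!)
    with open(dump_file, "w") as out:
        subprocess.run(
            [sys.executable, str(PROD_TREE / "manage.py"),
             "dumpdata", "--indent", "2"],
            stdout=out, check=True,
        )

    # Step 2: refuse to load an obviously broken (empty) dump into dev
    if dump_file.stat().st_size == 0:
        sys.exit("dump file is empty -- not loading it")

    subprocess.run(
        [sys.executable, str(DEV_TREE / "manage.py"),
         "loaddata", str(dump_file)],
        check=True,
    )

if __name__ == "__main__":
    # only attempt the sync when both project trees actually exist
    if PROD_TREE.exists() and DEV_TREE.exists():
        main()
```

Save it somewhere like ~/sync_db.py and point a daily task at it on the "Schedule" page.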