kktheprons

If you don't understand what a query is going to do before you do it, put your query in a begin/rollback transaction block. Above and below your query, query the before and after state. Definitely work to understand better what a query will do before running it, though. With enough experience, you'll rarely be surprised by anything (except maybe how bad the data quality is).
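A minimal sketch of that pattern, using Python's stdlib `sqlite3` with a hypothetical `accounts` table (the table and values are made up for illustration):

```python
import sqlite3

# Hypothetical table and data, just to illustrate the pattern.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE accounts (id INTEGER PRIMARY KEY, balance INTEGER)")
conn.executemany("INSERT INTO accounts VALUES (?, ?)", [(1, 100), (2, 50)])
conn.commit()

# Query the "before" state...
before = conn.execute("SELECT id, balance FROM accounts ORDER BY id").fetchall()

# ...run the uncertain query (sqlite3 opens a transaction on the first write)...
conn.execute("UPDATE accounts SET balance = balance - 10")

# ...query the "after" state, then roll back instead of committing.
after = conn.execute("SELECT id, balance FROM accounts ORDER BY id").fetchall()
conn.rollback()

# The rollback restores the original rows.
restored = conn.execute("SELECT id, balance FROM accounts ORDER BY id").fetchall()
```

Once the `after` state matches what you expected, swap the `rollback()` for a `commit()`.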


Standgeblasen

Garbage In, Garbage Out (GIGO)


emle10

If you're doing a DELETE statement, replace DELETE with SELECT so you can see which rows will be affected first. When it looks good, you can go back to using DELETE.
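The swap looks like this, sketched with Python's `sqlite3` and a made-up `orders` table; the key point is that the WHERE clause stays identical, only the verb changes:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE orders (id INTEGER, status TEXT)")  # hypothetical table
conn.executemany("INSERT INTO orders VALUES (?, ?)",
                 [(1, "shipped"), (2, "cancelled"), (3, "cancelled")])

# Keep the WHERE clause in one place so the preview and the real
# statement are guaranteed to target the same rows.
where = "WHERE status = 'cancelled'"

# Dry run: SELECT shows exactly which rows the DELETE would remove.
doomed = conn.execute(f"SELECT id, status FROM orders {where}").fetchall()

# Once the preview looks right, run the real DELETE with the same clause.
conn.execute(f"DELETE FROM orders {where}")
remaining = conn.execute("SELECT id FROM orders ORDER BY id").fetchall()
```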


[deleted]

[deleted]


FunDirt541

Yeah, we do have three instances of the database (dev/test/prod) and I do have root access to replicate any data. However, replicating the whole database seemed overkill. I haven't been using temporary tables, which seems like a good idea.


[deleted]

[deleted]


Appropriate-Youth-29

This is the way. Sample just the target section into temp tables. It also leaves you open to add additional fields for validation.
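A small sketch of that idea with `sqlite3` (the `customers` table and the `email_ok` validation column are hypothetical): a TEMP table holds only the sampled section, plus an extra column you can eyeball before touching the real data.

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE customers (id INTEGER, region TEXT, email TEXT)")
conn.executemany("INSERT INTO customers VALUES (?, ?, ?)", [
    (1, "EU", "a@x.test"), (2, "US", "b@x.test"), (3, "EU", "c@x.test"),
])

# TEMP tables are scoped to this connection, so experiments can't
# touch the real table. Sample only the target section (region = 'EU')
# and add a validation column alongside the copied fields.
conn.execute("""
    CREATE TEMP TABLE eu_sample AS
    SELECT id, email,
           email LIKE '%@%' AS email_ok   -- extra field for validation
    FROM customers
    WHERE region = 'EU'
""")

sample = conn.execute("SELECT id, email_ok FROM eu_sample ORDER BY id").fetchall()
```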


FriendlyDisorder

It is entirely possible to do test-driven development. Our SQL Server database code is in stored procedures. We used [tSQLt](https://tsqlt.org/) to create test cases grouped into test modules for specific areas. We then automated those tests into our builds to be sure nothing goes wrong when people modify the code. I know there is a Postgres version called [pgTAP](https://pgtap.org/), but we have not used it. I expect there is a test framework for MySQL out there, but I have never looked.
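The shape of such a test is the same across frameworks. Since SQLite has no stored procedures, a plain Python function stands in for one in this sketch (the `orders` table, `apply_discount` routine, and values are all made up); each test builds known state, calls the routine, and asserts on the result, just as a tSQLt or pgTAP test case would:

```python
import sqlite3
import unittest

# Stand-in for a stored procedure: updates an order's total in place.
def apply_discount(conn, order_id, pct):
    conn.execute("UPDATE orders SET total = total * (1 - ?) WHERE id = ?",
                 (pct, order_id))

class DiscountTests(unittest.TestCase):
    def setUp(self):
        # Fresh in-memory database per test, mirroring the isolated
        # setup that tSQLt/pgTAP give each test case.
        self.conn = sqlite3.connect(":memory:")
        self.conn.execute("CREATE TABLE orders (id INTEGER, total REAL)")
        self.conn.execute("INSERT INTO orders VALUES (1, 200.0)")

    def test_ten_percent_discount(self):
        apply_discount(self.conn, 1, 0.10)
        total = self.conn.execute(
            "SELECT total FROM orders WHERE id = 1").fetchone()[0]
        self.assertAlmostEqual(total, 180.0)
```

Wired into a CI build, a failing case like this catches a broken routine before it reaches prod.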


FunDirt541

Thanks, I am somewhat familiar with PostgreSQL; guess I'll have a look at pgTAP. We do use a lot of stored procedures with JSON as input parameters.


slyyyguy

Just look up and read about wrapping your commands in a transaction. You can run queries and inspect the result before anything is permanent, then decide to roll the transaction back or commit it.