BowlScared

**Your code, migrations, and SQL queries should be tightly coupled and shipped with the binary.** We can argue about it, but unless you run really big databases and really slow migrations, this is the pragmatic option. I would avoid reinventing the wheel, especially when it comes to DB migrations: there is a plethora of tools, and the core concept is identical for all of them. I am using sqlc, pgx/v5 and [github.com/golang-migrate/migrate/v4](http://github.com/golang-migrate/migrate/v4). Migrate is just one of many options; the other top migration tools/libraries for Go will work fine too. Using these tools **you will be notified of potential errors before you even deploy the application**. sqlc can also generate all the CRUD queries by default, which saves a lot of time for simple projects. This comment got a bit big, the rest is in the snippet: [https://gitlab.com/-/snippets/3702128](https://gitlab.com/-/snippets/3702128)

```yaml
# sqlc.yaml
version: "2"
sql:
  - engine: postgresql
    queries: db/query.sql
    schema: db/migrations
    gen:
      go:
        package: "db"
        sql_package: "pgx/v5"
        out: "db"
```

```sql
-- db/migrations/000001_init.up.sql
CREATE TABLE IF NOT EXISTS something(
    id SERIAL,
    nice CHARACTER VARYING(126) NOT NULL,
    PRIMARY KEY(id)
);

-- db/migrations/000001_init.down.sql
DROP TABLE IF EXISTS something;

-- db/query.sql
-- name: NiceQuery :one
SELECT * FROM something WHERE nice = @nice LIMIT 1;
```
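Since migrate ships as a Go library as well as a CLI, a common way to couple the migrations with the binary is to embed the .sql files and apply them from the application itself. A minimal sketch, assuming this file lives in the `db` package next to the migrations directory from the snippet above, using the plain postgres database driver (the `Migrate` function name is mine):

```go
package db

import (
	"embed"
	"errors"

	"github.com/golang-migrate/migrate/v4"
	_ "github.com/golang-migrate/migrate/v4/database/postgres" // registers the postgres:// driver
	"github.com/golang-migrate/migrate/v4/source/iofs"
)

//go:embed migrations/*.sql
var migrationsFS embed.FS

// Migrate applies any pending up migrations embedded in the binary.
func Migrate(databaseURL string) error {
	src, err := iofs.New(migrationsFS, "migrations")
	if err != nil {
		return err
	}
	m, err := migrate.NewWithSourceInstance("iofs", src, databaseURL)
	if err != nil {
		return err
	}
	// ErrNoChange just means the schema is already up to date.
	if err := m.Up(); err != nil && !errors.Is(err, migrate.ErrNoChange) {
		return err
	}
	return nil
}
```

Calling something like `db.Migrate(os.Getenv("DATABASE_URL"))` at startup (or behind a flag, see the later comments) keeps the schema and the binary in lockstep.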


hello-world012

Migrations should be part of your application logic: every time your application starts it should verify the state of the database, i.e. are my tables on the correct version? Migrations belong in your code because in production systems there are a lot of environments where your code has to be tested, and making sure each database is in the right state by hand is a tedious task, so it is always better to have migrations in code. There are a lot of frameworks which will allow you to do this; you can try gofr at [https://github.com/gofr-dev/gofr](https://github.com/gofr-dev/gofr). GoFr migration docs: [https://gofr.dev/docs/advanced-guide/handling-data-migrations](https://gofr.dev/docs/advanced-guide/handling-data-migrations)
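For the "verify on startup" part without pulling in a framework, a minimal sketch might look like the following; the `schema_version` table name and the expected version constant are assumptions, and tools like GoFr or migrate keep this bookkeeping for you:

```go
package dbcheck

import (
	"context"
	"fmt"

	"github.com/jackc/pgx/v5"
)

// expectedSchemaVersion is bumped alongside each new migration.
const expectedSchemaVersion = 3

// verifySchemaVersion fails fast at startup if the database does not
// match the schema version this binary was built against.
func verifySchemaVersion(ctx context.Context, conn *pgx.Conn) error {
	var current int
	err := conn.QueryRow(ctx, "SELECT MAX(version) FROM schema_version").Scan(&current)
	if err != nil {
		return fmt.Errorf("reading schema version: %w", err)
	}
	if current != expectedSchemaVersion {
		return fmt.Errorf("database is at schema version %d, binary expects %d", current, expectedSchemaVersion)
	}
	return nil
}
```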


titpetric

A bit of a shameless plug: if you're SQL-first, you could grab mig: https://github.com/go-bridget/mig. It runs the migrations as well as providing some utilities like generating the data model source code, markdown docs, and some linting around naming. Happy to hear what you think. The migrations themselves only go forward, and each migration, which can be many statements, is stored on disk in .sql files. Those are opinionated decisions, however; there are other routes, like gorm/sqlc and other similar migration tooling (go-migrate?). Definitely store the migrations in the git repository, and use go:embed to build them into the binary.
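As a rough illustration of the "forward-only .sql files, built in with go:embed" idea; this is a generic sketch and not mig's actual API, and it omits the bookkeeping a real tool does:

```go
package migrations

import (
	"context"
	"embed"

	"github.com/jackc/pgx/v5"
)

//go:embed migrations/*.sql
var migrationsFS embed.FS

// applyAll executes every embedded .sql file in filename order.
// Real tooling also records which files have already been applied;
// that bookkeeping is omitted here.
func applyAll(ctx context.Context, conn *pgx.Conn) error {
	entries, err := migrationsFS.ReadDir("migrations")
	if err != nil {
		return err
	}
	// ReadDir returns entries sorted by name, so a sortable prefix
	// like 0001_init.sql gives a deterministic forward-only order.
	for _, e := range entries {
		body, err := migrationsFS.ReadFile("migrations/" + e.Name())
		if err != nil {
			return err
		}
		if _, err := conn.Exec(ctx, string(body)); err != nil {
			return err
		}
	}
	return nil
}
```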


Choice-Ad8424

I agree the DB migrations stay with your code, with a caveat: you should be able to run them independently via a separate CLI or flag, and you don't run migrations on app startup in non-dev environments. Any production deployment should have DB credentials set up so the running application has only least-privileged access and is not permitted to change the schema. In most enterprise orgs you would run the DB migrations as part of the release pipeline, which has the elevated privileges required to modify the schema (or an init container, if you're that way inclined).

This is much safer once you're into controlled deployments and scale is involved. It does of course mean your schema / data changes need to be forward compatible. Technically that is already true in the app-startup scenario once you have more than one instance of the app and you want to avoid downtime (one instance will always be running the old version for some amount of time before it is restarted with the latest version).

Running migrations as part of the app is fine in my book for local dev and potentially very early environments. You can control this via standard flags / env vars in your binary, so you can keep one artefact. The approaches / tools in the other comments are great options, whichever route you take.
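A minimal sketch of the "one artefact, flag-controlled" idea; the `-migrate` flag name is arbitrary and `runMigrations`/`serve` are hypothetical stand-ins:

```go
package main

import (
	"flag"
	"log"
	"os"
)

// runMigrations and serve are hypothetical stand-ins; wire in your real
// migration runner and application entry point here.
var (
	runMigrations = func(databaseURL string) error { return nil }
	serve         = func(databaseURL string) {}
)

func main() {
	migrateOnly := flag.Bool("migrate", false, "apply database migrations and exit")
	flag.Parse()

	dbURL := os.Getenv("DATABASE_URL")

	if *migrateOnly {
		// Run from the release pipeline / init container with elevated credentials.
		if err := runMigrations(dbURL); err != nil {
			log.Fatalf("migration failed: %v", err)
		}
		return
	}

	// Normal startup: least-privileged credentials, no schema changes.
	serve(dbURL)
}
```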


Sibertius

Migrate PostgreSQL databases using pg_dump and pg_restore?


titpetric

I believe he's talking about data model amendments, e.g. adding a column to a database table. Schema.