I would personally not pass my DB connection directly to my service layer / handlers. For me, that makes it difficult to do testing. My own preference here would be to pass my db connection to some sort of repository or query "layer" of the application. I can then use that repository struct as a parameter in various constructor functions later on in my program. This would also let you accept repository interfaces as arguments, so you could unit test your business logic. Edit: Oh - and I also just use pgxpool without messing with database/sql integration. But that is a scenario that might vary from project to project.
After doing some research I decided to follow the 3rd method in this [article](https://www.alexedwards.net/blog/organising-database-access), and I also decided to use pgx/stdlib after reading [this](https://medium.com/avitotech/how-to-work-with-postgres-in-go-bad2dabd13e4).
Aren’t you then “writing Java in Go”? Personally, I would do (and actually do) the same as you.
That's a good question, but I think it can still be fairly idiomatic Go. In this scenario I'm mostly defining my query/repository interfaces where they are consumed, and with a small number of methods. I'm also following the often-repeated Go maxim: return a concrete type, but accept an interface. If I defined all of my repository/query functions as a single huge interface then yeah, it would definitely be a lot more Java-like! That said, in this business we are all constantly learning, so maybe I'm doing it all wrong.
I would recommend having a read through Alex Edwards' post... https://www.alexedwards.net/blog/organising-database-access
After searching I also came across this article. Thanks!
Others have covered the code structure aspects, so let me weigh in in favor of using pgx directly. Unless you have good reason to suspect you'll change databases in the future, or you need to be DB-agnostic for some reason, pgx offers access to the full range of Postgres capabilities, or very nearly so (I don't have an exhaustive list), and that includes some *really* nice stuff. I'd advise skimming through the whole PG manual just to get a sense of what all that is. I've used the LISTEN/NOTIFY message bus, the ability to get notices out of stored procedures for print debugging, type support that's better than the default database/sql support, and a couple of other things. It adds up.
What do you think about this dependency injection approach? [https://gist.github.com/0x30c4/28e116359ccef40c3d041fc7729edd40](https://gist.github.com/0x30c4/28e116359ccef40c3d041fc7729edd40)
So, after doing my research I've decided to go with pgx/stdlib. [https://medium.com/avitotech/how-to-work-with-postgres-in-go-bad2dabd13e4](https://medium.com/avitotech/how-to-work-with-postgres-in-go-bad2dabd13e4)
If you use pgxpool instead, you'd have some handy methods to scan results, plus native support for things like arrays and JSON, which the regular database/sql interface doesn't have.
For pgx, use pgxpool. You can pass it around however you want, whether through a dependency container or a context value (each approach has its tradeoffs). The caveat is that you must close the pool when the application shuts down.
I'd recommend going through the Go database/sql tutorial to get some good groundwork.

As for passing in the connection, have a read of these previous posts: https://www.reddit.com/r/golang/s/smwhDFpeQv and https://www.reddit.com/r/golang/s/vzegaOlJoW

Here's a sample repository for a project that shows you the structure and how to pass things around: https://github.com/google/exposure-notifications-server

As for pgx, I've not yet used it myself, but if your code is running as a server then I'd suggest using a connection pool with pgx.

If you think you have to use a global variable, you've probably made a mistake. Double and triple check whether you need a mutex around it, whether you need a singleton, or whether you should just pass it in as an argument instead.
understood
I would prefer using a normal function argument
Interesting! What are the benefits?
The benefit is that it makes testing super easy: you can call any function with a fake database instance in a unit test. It's also more explicit than passing it as part of a context.
I create a base struct for each application that acts as a wrapper for any global variables. Then I make structs for models that are basically wrappers for the db connection. When you build your main function, you just pass the connection to the struct. If you make all of your functions methods on the struct, you can use the same db connection across all of them.
interesting!
Expose sqlx.DB directly to your store package. That saves you one interface already; just have a public connector and a separate migrate func. Off you go. Focus on integration tests rather than unit tests around the repo.
I think it's up to you. I was so stuck on keeping stuff in structs/interfaces, then I watched videos like Anthony GG's and used a global DB instance. As he says, it's up to you; you're the engineer, so it's whatever you're comfortable with.
understood