REST API with 1000 query templates

How do I manage maybe 1000 SQL-queries in a Golang REST API?

My SQL experience with PostgreSQL is at the upper basic level. Today I am using a tool where you can write plain SQL as well as use a sort of ORM for direct access to the database. The aim of using an ORM is to make things simpler, but with complex queries it really makes them harder. So I prefer using plain SQL and have no wish to learn GORM or another such tool.

When it comes to Golang, my experience is at the early beginner level. I have done a simple REST API with about 5 queries, and it is manageable.

All Golang REST API examples are at the Hello World level, but I need a way to handle several hundred structs and queries that is easy to code and easy to understand and maintain. So far I have found 3 ways to do this.

  1. Store the queries in templates or similar, e.g. https://github.com/gchaincl/dotsql (see the sketch after this list).

  2. Use 1000 packages. But this seems unmanageable to me.

  3. Store the queries together with the struct in a lookup PostgreSQL database, fetch the desired query and fire it. This would be easy to maintain, but adds another layer that may impact speed.
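
For option 1, a minimal sketch of how dotsql is typically used, assuming a queries.sql file where each statement is labelled with a -- name: comment. The file name, table and connection string below are placeholders:

    -- queries.sql
    -- name: find-user-by-id
    SELECT id, name, email FROM users WHERE id = $1

    package main

    import (
        "database/sql"
        "log"

        "github.com/gchaincl/dotsql"
        _ "github.com/lib/pq"
    )

    func main() {
        db, err := sql.Open("postgres", "postgres://user:pass@localhost/app?sslmode=disable")
        if err != nil {
            log.Fatal(err)
        }

        // Load every named query once at startup.
        dot, err := dotsql.LoadFromFile("queries.sql")
        if err != nil {
            log.Fatal(err)
        }

        // Look a query up by name and run it with parameters.
        rows, err := dot.Query(db, "find-user-by-id", 42)
        if err != nil {
            log.Fatal(err)
        }
        defer rows.Close()
    }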

Besides these thoughts, I have tried a dynamic approach with no luck.

I wonder if anyone who has run into this can share some thoughts and advice?

TIA!

1 Like

Some quick thoughts… If you have X queries, I suspect you have X REST API endpoints. So the client (JavaScript) code running in the browser will invoke one of the X endpoints on some user action on the page. Now you have to take the data from the user (embedded in the URL or the POST body - unless you are using something like GraphQL), combine it with the SQL to complete the intended query/update, and send something back to the user.

So only a given endpoint knows what to do with the data from the user, and it is the only place that needs to know the SQL, plus the logic of how to generate the full SQL statement with the user data. At this point both the SQL fragments and the user data are treated as data by the endpoint… so I would keep the SQL in the endpoint itself. Only if the processing logic were extremely regular would I consider storing it outside the endpoint logic; “extremely regular” means that a given endpoint serves multiple forms/pages on the client and the combining rules are simple and regular.

You must take care to sanitize the input, since this approach is otherwise susceptible to SQL injection attacks.
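
For illustration, a minimal sketch of an endpoint that keeps its SQL next to the handler logic and passes user input only as a query parameter, so the driver does the quoting. The package, table, columns and handler name are made up:

    package api

    import (
        "database/sql"
        "encoding/json"
        "net/http"
        "strconv"
    )

    type Customer struct {
        ID   int    `json:"id"`
        Name string `json:"name"`
    }

    // GetCustomer keeps its SQL right next to the handler logic. The
    // user-supplied id is never concatenated into the statement; it is
    // passed as the $1 parameter, so the driver handles the quoting.
    func GetCustomer(db *sql.DB) http.HandlerFunc {
        const query = `SELECT id, name FROM customers WHERE id = $1`
        return func(w http.ResponseWriter, r *http.Request) {
            id, err := strconv.Atoi(r.URL.Query().Get("id"))
            if err != nil {
                http.Error(w, "bad id", http.StatusBadRequest)
                return
            }

            var c Customer
            if err := db.QueryRow(query, id).Scan(&c.ID, &c.Name); err != nil {
                http.Error(w, "not found", http.StatusNotFound)
                return
            }
            json.NewEncoder(w).Encode(c)
        }
    }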

Hope this helps.

2 Likes

Should I interpret your answer as: “Use 1000 packages”?

Your SQL queries probably share some parts, so use an SQL builder package (like go-sqlbuilder) and reuse code where possible.
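
For instance, a minimal sketch along the lines of the go-sqlbuilder README; the table and columns here are made up:

    package main

    import (
        "fmt"

        "github.com/huandu/go-sqlbuilder"
    )

    func main() {
        // The default flavor emits ? placeholders; the package also has a
        // PostgreSQL flavor that emits $1, $2, ...
        sb := sqlbuilder.NewSelectBuilder()
        sb.Select("id", "name", sb.As("COUNT(*)", "c"))
        sb.From("users")
        sb.Where(sb.In("status", 1, 2, 5))
        sb.GroupBy("id", "name")

        sql, args := sb.Build()
        fmt.Println(sql)  // SELECT id, name, COUNT(*) AS c FROM users WHERE status IN (?, ?, ?) GROUP BY id, name
        fmt.Println(args) // [1 2 5]
    }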

Since you say that you are a beginner with both Go and SQL, I’m wondering if what you say you want to do and what you actually need to do are vastly different things.

Some more details would be helpful. Are you trying to RESTify an existing system? Do you have any examples of your API endpoints and associated queries?

1 Like

The general wisdom is to start by keeping everything in one package. An “endpoint” will resolve to a function or method in a package. As you develop to support all 1000 endpoints it should become clear how to reorganize and break it into multiple packages. Perhaps one package for all customer endpoints, for example. If you have nginx or similar to direct the traffic, then it is simpler to break apart the packages without breaking your RESTful API. Even without nginx, it isn’t that hard.
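
As a rough sketch of where that can end up (the package layout, import path and routes below are made up), each domain area gets its own package of handlers while the route wiring stays in one place:

    // customers/customers.go - one package per domain area,
    // e.g. all customer endpoints and their SQL.
    package customers

    import (
        "database/sql"
        "net/http"
    )

    type Handler struct {
        DB *sql.DB
    }

    func (h *Handler) List(w http.ResponseWriter, r *http.Request) { /* SELECT ... */ }
    func (h *Handler) Get(w http.ResponseWriter, r *http.Request)  { /* SELECT ... WHERE id = $1 */ }

    // main.go - the URL layout is defined here, so moving code between
    // packages later does not break the RESTful API.
    package main

    import (
        "database/sql"
        "log"
        "net/http"

        "example.com/app/customers"
        _ "github.com/lib/pq"
    )

    func main() {
        db, err := sql.Open("postgres", "postgres://user:pass@localhost/app?sslmode=disable")
        if err != nil {
            log.Fatal(err)
        }

        c := &customers.Handler{DB: db}
        mux := http.NewServeMux()
        mux.HandleFunc("/customers", c.List)
        mux.HandleFunc("/customers/get", c.Get)

        log.Fatal(http.ListenAndServe(":8080", mux))
    }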

Hope this helps!

I do not consider myself a beginner with SQL, only with Golang. I want to RESTify an existing system and have no examples of the 1000 query templates to share - yet. I am in the phase of choosing a method to handle tons of queries. I both need and want to do this in a maintainable way.

A wild guess: every query with its structs may take 20 lines of code or more - maybe 20K lines of code in total, or even more. Searching, maintaining and deploying this for decades to come is a challenge, and I want to choose as correct a way as possible.

So I am thankful for any input that helps me choose a path.

sb.Select("id", "name", sb.As("COUNT(*)", "c"))

Is this not a sort of ORM?

So you prefer using packages?

Yes. Some queries are reused in many pages but presented in different ways in the front end. What are the benefits of storing the SQL externally in this case?

I didn’t mean to offend you. The very first line of your original post says you are at a “basic” level of proficiency with SQL. “Basic” or “beginner” is the same to me. No matter though.

You’re trying to RESTify an existing system, so have you even developed your API yet? Developing/organizing your API will likely go a long way toward organizing not only your structs and queries, but also the number of queries you require; perhaps far fewer than 1000. You might even decide that REST isn’t the way to go, and a GraphQL approach would be more suitable.

Unfortunately, you provide little “real” information other than the notion that you must somehow organize 1000 distinct queries. Would I use packages to do this? Yes. Would I use one package per query? No.

Look at it this way: how likely are you to change the SQL but not the logic?

  • If the normal case is to change both, then you might as well keep them together.
  • If you think you’ll make a lot of changes to the SQL without having to change the logic, then storing the SQL externally has some appeal.

Hope that makes sense,
Cecil

I was not quite clear. “upper” basic level :slight_smile:

No. Before starting, I must decide how to start.

Isn’t GraphQL just another ORM?

Clear answer. Thank you!

Hi Sibert,

I would create a catalog of queries in the database. I mean… a table in the database that stores the SQL, with a unique ID to identify each statement, instead of putting 1000 SQL statements in the code.
The ID could be a parameter of the endpoint or even the resource URI (it depends on the business rules). You would then develop in Go the mechanism to select the appropriate record.
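
A minimal sketch of that idea (the query_catalog table, its columns and the example query ID are made up): the SQL lives in a table, and the Go side first looks the statement up by its ID, then executes it with the request parameters.

    package main

    import (
        "database/sql"
        "fmt"
        "log"

        _ "github.com/lib/pq"
    )

    // Hypothetical catalog table:
    //
    //   CREATE TABLE query_catalog (
    //       id       text PRIMARY KEY,  -- e.g. "find-user-by-id"
    //       sql_text text NOT NULL
    //   );

    // runCatalogQuery fetches the SQL text by its ID and then executes it
    // with the given arguments. The looked-up statements could be cached
    // in memory to avoid the extra round trip on every request.
    func runCatalogQuery(db *sql.DB, queryID string, args ...interface{}) (*sql.Rows, error) {
        var sqlText string
        err := db.QueryRow(`SELECT sql_text FROM query_catalog WHERE id = $1`, queryID).Scan(&sqlText)
        if err != nil {
            return nil, fmt.Errorf("unknown query %q: %w", queryID, err)
        }
        return db.Query(sqlText, args...)
    }

    func main() {
        db, err := sql.Open("postgres", "postgres://user:pass@localhost/app?sslmode=disable")
        if err != nil {
            log.Fatal(err)
        }

        rows, err := runCatalogQuery(db, "find-user-by-id", 42)
        if err != nil {
            log.Fatal(err)
        }
        defer rows.Close()
    }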

Please let me know if this solution makes sense.

Cheers!
Marco.

Storing the queries in a lookup database is one of the choices I am considering. The advantage is that queries could be added and updated without deploying and restarting the API, and this should really simplify maintaining it.

The disadvantages are that you add an extra (delaying?) layer AND the API cannot easily be moved to another server.

Does anyone have an opinion about this, or has anyone already done something like storing queries in a database?

Updating and adding queries is done on a daily basis (bugs, new stuff etc). The REST logic is basic; it may be changed monthly or even just a few times a year.

But then you have 2 choices left: database or templates. Which one is the best approach in terms of speed and ease of maintenance?

Yes, I did!

It was about four hundred SQL statements, called formulas. A formula might be changed at any time, so it was very useful to take advantage of that database approach.

Happy coding!

Marco.

Nice to hear that my idea works in reality. No delays caused by the database?

Yes, it works!
We haven’t detected any delays so far.

This topic was automatically closed 90 days after the last reply. New replies are no longer allowed.