
Full-Stack Web Development with Go

By Nanik Tolaram, Nick Glynn

About this book
Go is a modern programming language with capabilities to enable high-performance app development. With its growing web framework ecosystem, Go is a preferred choice for building complete web apps. This practical guide will enable you to take your Go skills to the next level by building full-stack apps. The book walks you through creating and developing a complete modern web service, covering auth, middleware, server-side rendering, databases, modern frontend frameworks, and Go-powered APIs. You’ll start by structuring the app and handling important aspects such as networking, before integrating all the different parts together to build a complete web product. Next, you’ll learn how to build and ship a complete product, starting with the fundamental building blocks of creating a Go backend. You’ll apply best practices for cookies, APIs, and security, and level up your skills with Vue, one of the fastest-growing frontend frameworks. Once your full-stack application is ready, you’ll learn how to push the app to production and be prepared to serve customers and share it with the world. By the end of this book, you’ll have learned how to build and ship secure, scalable, and complete products, and how to combine Go with existing products using best practices.
Publication date: February 2023
Publisher: Packt
Pages: 302
ISBN: 9781803234199

 

Building the Database and Model

In this chapter, we will design the database that our sample application will use. We will walk through the design of the database and look at some of the tools that we are going to use to help us on our database design journey. We will be using the Postgres database and will look at how to run it locally using Docker. What is Docker? In simple terms, Docker is a tool that allows developers to run a variety of applications – such as databases, HTTP servers, and system tools – locally or in the cloud. Docker removes the need to install all the different dependencies required to use a particular application such as a database, and it makes applications easier to manage and maintain than installing them on bare metal, in both local and cloud environments. This is possible because Docker packages everything into a single image, much like a compressed file that contains many different files internally.

We will learn how to design a database that supports the features that we want to build, such as the following:

  • Creating an exercise
  • Creating a workout plan
  • Logging in to the system

We will also explore tools that help with automatic code generation based on SQL queries, which greatly reduces the amount of database-related code that needs to be written. You will also learn to use the tool to auto-generate all the relevant CRUD operations without writing a single line of Go code.

In this chapter, we’ll be covering the following:

  • Installing Docker
  • Setting up Postgres
  • Designing the database
  • Installing sqlc
  • Using sqlc
  • Setting up the database
  • Generating CRUD with sqlc
  • Building the makefile
 

Technical requirements

In this book, we will be using version 1.16 of the Go programming language, but you are free to use later versions of Go, as the code will work without any changes. To make it easy, all the relevant files explained in this chapter can be checked out at https://github.com/PacktPublishing/Full-Stack-Web-Development-with-Go/tree/main/Chapter01. To work on the sample code in this chapter, make sure you change the directory to Full-Stack-Web-Development-with-Go/Chapter01. If you are using Windows as a development machine, use WSL2 to perform all the different operations explained in this chapter.

 

Installing Docker

In this book, we will be using Docker to do things such as running databases and executing database tools, among others. You can install either Docker Desktop or Docker Engine. To understand more about the differences between Docker Desktop and Docker Engine, visit the following link: https://docs.docker.com/desktop/linux/install/#differences-between-docker-desktop-for-linux-and-docker-engine. The authors use Docker Engine on Linux and Docker Desktop on macOS.

If you are installing Docker Desktop on your local machine, installation guides for the different operating systems are available in the Docker documentation at https://docs.docker.com/desktop/.

If you want to install the Docker binaries instead, you can follow this guide: https://docs.docker.com/engine/install/binaries/.

 

Setting up Postgres

The database we chose for the sample application is Postgres; we chose Postgres over other databases because of the wide variety of open source tools available for building, configuring, and maintaining it. Postgres has been open source since version 1 was released in 1989, and it is used by startups and big tech companies worldwide. The project has a lot of community support in terms of tools and utilities, which makes it easier to manage and maintain. The database is suitable for everything from small projects all the way up to large, replicated data stores.

The easiest way to run it locally is to run it as a Docker container. First, use the following command to run Postgres:

docker run --name test-postgres \
-e POSTGRES_PASSWORD=mysecretpassword -p 5432:5432 -d postgres

The command will run Postgres on port 5432; if you have another application or another Postgres instance already listening on this port, the command will fail. If you need to run Postgres on a different port, change the host port in the -p parameter (for example, -p 5555:5432).

If successful, you will see the container ID printed out. The ID will differ from what is shown here:

f7bdfb7d2c10c5f0c9227c9b0a720f21d3c7fa65907eb0c546b8f20f12621102

Check whether Postgres is up and running by using docker ps. The next thing to do is use a psql client tool to connect to Postgres to test it out. A list of the different Postgres client tools available on different platforms can be found here: https://wiki.postgresql.org/wiki/PostgreSQL_Clients.

We will use the standard psql tool via Docker. Open another terminal and use the following Docker command to run psql:

docker exec -it test-postgres psql -h localhost -p 5432 -U postgres -d postgres

What we are doing is executing the psql command inside the running Postgres container. On a successful connection, you will see output such as the following:

psql (12.3, server 14.5 (Debian 14.5-1.pgdg110+1))
WARNING: psql major version 12, server major version 14.
         Some psql features might not work.
Type "help" for help.
postgres=#

Note that the warning message mentions server major version 14 – this indicates that the server version is newer than the psql client version, as per the documentation (https://www.postgresql.org/docs/12/app-psql.html). The psql client will still work without any problem with the Postgres server.

Exit psql to go back to the command prompt by typing exit.

The following is some guidance on common errors when trying to connect to the database:

  • psql: error: could not connect to server: FATAL: password authentication failed for user “postgres” – the password specified when running Postgres does not match the password passed in using psql. Check the password.
  • psql: error: could not connect to server: could not connect to server: Host is unreachable – the IP address that you used to connect to Postgres is wrong.
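
Before moving on, you can also confirm connectivity from Go itself. The following is a minimal sketch and not part of the book's sample code; the driver (github.com/lib/pq) and the connection string are assumptions based on the Docker command shown earlier, so adjust them to match your setup:

package main

import (
	"database/sql"
	"log"

	// Assumed Postgres driver; it registers itself under the "postgres" name.
	_ "github.com/lib/pq"
)

func main() {
	// Assumed connection string matching the Docker command above:
	// user postgres, password mysecretpassword, port 5432, database postgres.
	dbURI := "postgres://postgres:mysecretpassword@localhost:5432/postgres?sslmode=disable"

	db, err := sql.Open("postgres", dbURI)
	if err != nil {
		log.Fatalln("open:", err)
	}
	defer db.Close()

	// Ping forces a real connection to be established.
	if err := db.Ping(); err != nil {
		log.Fatalln("ping:", err)
	}
	log.Println("connected to Postgres")
}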

With this, you have completed the local setup of Postgres and are now ready to start looking into designing the database.

 

Designing the database

In this section, we will look at how to design the database to allow us to store information for the fitness tracking application. The following screenshot shows a mockup of the application:

Figure 1.1 – Screenshot of the sample application


Looking at these functionalities, we will design a database structure that looks like the following entity relationship diagram:

Entity relationship diagram

An entity relationship diagram shows the relationships of entity sets stored in a database.

Figure 1.2 – Entity relationship diagram of our fitness application


Let’s drill further into each table to understand the data it contains:

  • Users – this table contains user information for login purposes. Passwords will be stored as a hash, not plaintext.
  • Images – this table contains images of exercises that users want to do. It will store all the exercise images that the user uploads.
  • Exercises – this table contains the names of the exercises that the user wants to do. Users will define what kind of exercise they want to do.
  • Sets – this table contains the number of sets of each exercise that the user wants to do.
  • Workouts – this table contains the workouts that the user wants to do. Users define a workout as a combination of exercises with the number of sets that they want to do.

The trade-off we are making by storing images in the database is done to simplify the design; in reality, this might not be suitable for larger images or for production use. Now that we have defined the database structure and understand what kind of data it will store, we need to look at how to implement it. One of the major criteria we want to focus on is to completely separate the SQL from the Go code; this way, we have a clear separation between the two, which allows for higher maintainability.

 

Installing sqlc

We have defined the database structure, so now it’s time to talk a bit more about the tool we are going to use, called sqlc. sqlc is an open source tool that generates type-safe Go code from SQL; this allows developers to focus on writing SQL and leave the Go code to sqlc. This reduces development time, as sqlc takes care of the mundane coding of queries and types.

The tool is available at https://github.com/kyleconroy/sqlc. It helps developers focus on writing the SQL that the application needs, while it generates all the relevant Go code for the database operations. This way, developers use pure Go code for database operations, and the separation is clean and easily trackable.

The following diagram shows the flow that developers normally adopt when using the tool at a high level.

Figure 1.3 – Flow to use sqlc to generate Go code


All SQL code will be written in .sql files, which the sqlc tool will read and convert into Go code.

Download and install the sqlc binary by using the following command:

go install github.com/kyleconroy/sqlc/cmd/sqlc@latest

Make sure your PATH includes the $GOPATH/bin directory – for example, in our case, our PATH looks like the following:

…:/snap/bin:/home/nanik/goroot/go1.16.15/go/bin:/home/nanik/go/bin

If you don’t have GOPATH as part of the PATH environment variable, then you can use the following command to run sqlc:

$GOPATH/bin/sqlc
Usage:
  sqlc [command]
Available Commands:
  compile     Statically check SQL for syntax and type errors
  completion  Generate the autocompletion script for the specified shell
  generate    Generate Go code from SQL
  help        Help about any command
  init        Create an empty sqlc.yaml settings file
  upload      Upload the schema, queries, and configuration for this project
  version     Print the sqlc version number
Flags:
  -x, --experimental   enable experimental features (default: false)
  -f, --file string    specify an alternate config file (default: sqlc.yaml)
  -h, --help           help for sqlc

Use "sqlc [command] --help" for more information about a command.

At the time of writing, the latest version of sqlc is v1.13.0.

Now that we have installed the tool and understand the development workflow that we will be following when using the tool, we will look at how to use the tool for our application.

 

Using sqlc

First, let’s take a look at the different commands provided by sqlc and how they work.

  • compile – this command helps check SQL syntax and reports any typing errors.
  • completion – this command generates an auto-completion script for your environment. The following are the supported environments: Bash, Fish, PowerShell, and zsh.
  • generate – a command to generate the .go files based on the provided SQL statements. This will be the command that we will be using the most for the application.
  • init – this command is the first one used to initialize your application to start using the tool.

The following shows how to get started with using sqlc to set up a project. Create a directory inside chapter1 – for example, dbtest – and change into the new directory (dbtest). Next, we will run sqlc with the init command:

sqlc init

This will automatically generate a file called sqlc.yaml, which contains a blank configuration as shown here:

version: "1"
project:
    id: ""
packages: []

The sqlc.yaml file contains the configuration information that sqlc will use to generate all the relevant .go code for our SQL statements.

Let’s take a look at the structure of the .yaml file to understand the different properties. The following shows an example of a completed structure:

version: "1"
packages:
 - name: "db"
   path: "db"
   queries: "./sqlquery"
   schema: "./sqlquery/schema/"
   engine: "postgresql"
   sql_engine: "database/sql"
   emit_db_tags: "true"
   emit_prepared_queries: true
   emit_interface: false
   emit_exact_table_names: false
   emit_empty_slices: false
   emit_exported_queries: false
   emit_json_tags: true
   json_tags_case_style: "snake"
   output_db_file_name: "db.go"
   output_models_file_name: "dbmodels.go"
   output_querier_file_name: "dbquerier.go"
   output_files_suffix: "_gen"

The following explains the different fields:

  • name – any string to be used as the package name.
  • path – specifies the name of the directory that will host the generated .go code.
  • queries – specifies the directory containing the SQL queries that sqlc will use to generate the .go code.
  • schema – a directory containing SQL files that will be used to generate all the relevant .go files.
  • engine – specifies the database engine that will be used: sqlc supports either MySQL or Postgres.
  • emit_db_tags – setting this to true will generate the structs with db tags – for example:

type ExerciseTable struct {
  ExerciseID   int64  `db:"exercise_id"`
  ExerciseName string `db:"exercise_name"`
}

  • emit_prepared_queries – setting this to true instructs sqlc to support prepared queries in the generated code.
  • emit_interface – setting this to true will instruct sqlc to generate the querier interface.
  • emit_exact_table_names – setting this to true will instruct sqlc to mirror the struct name to the table name.
  • emit_empty_slices – setting this to true will instruct sqlc to return an empty slice (instead of nil) for queries that return a list of rows when no rows are found.
  • emit_exported_queries – setting this to true will instruct sqlc to export the SQL statements used in the auto-generated code so that they can be accessed by an outside package.
  • emit_json_tags – setting this to true will generate the structs with JSON tags.
  • json_tags_case_style – this setting can accept the following values: camel, pascal, snake, and none. The case style is used for the JSON tags in the structs. Normally, this is used with emit_json_tags.
  • output_db_file_name – name used as the filename for the auto-generated database file.
  • output_models_file_name – name used as the filename for the auto-generated model file.
  • output_querier_file_name – name used as the filename for the auto-generated querier file.
  • output_files_suffix – suffix to be used as part of the auto-generated query files.
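
To make the tag-related options more concrete, the following is a small sketch of how emit_db_tags, emit_json_tags, and json_tags_case_style shape the generated structs. This is not generated output; the struct and field names here are purely illustrative:

package gendemo

// With emit_db_tags: true, emit_json_tags: true, and
// json_tags_case_style: "snake", a generated struct carries tags like this:
type UserSnake struct {
	UserID   int64  `db:"user_id" json:"user_id"`
	UserName string `db:"user_name" json:"user_name"`
}

// With json_tags_case_style: "camel" (the style we use later in this
// chapter), the same fields get camel-cased JSON tags instead:
type UserCamel struct {
	UserID   int64  `db:"user_id" json:"userID"`
	UserName string `db:"user_name" json:"userName"`
}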

We have looked at the different parameters available in the tool, along with how to use the .yaml file to specify the different properties used to generate the relevant Go files. In the next section, we will set up our sample app database.

 

Setting up the database

We need to prepare and create the database using the psql client tool. The SQL database script can be found inside schema.sql under the db folder in the GitHub repository, and we are going to use this to create all the relevant tables inside Postgres.

Change the directory to chapter1 and run the Postgres database using the following Docker command:

docker run --name test-postgres -e POSTGRES_PASSWORD=mysecretpassword -v $(pwd):/usr/share/chapter1 -p 5432:5432 postgres

Once Postgres is running, use the following command to enter into psql:

docker exec -it test-postgres psql -h localhost -p 5432 -U postgres -d postgres

Once inside psql, run the following:

\i /usr/share/chapter1/db/schema.sql

This will instruct psql to execute the commands inside schema.sql, and on completion, you will see the following output:

postgres=# \i /usr/share/chapter1/db/schema.sql
CREATE SCHEMA
CREATE TABLE
CREATE TABLE
CREATE TABLE
CREATE TABLE
CREATE TABLE

To reconfirm that everything is set up correctly, use the following command (do not forget to include the dot after gowebapp):

\dt gowebapp.* 

You should see the following output:

postgres=# \dt gowebapp.*
            List of relations
  Schema  |   Name    | Type  |  Owner
----------+-----------+-------+----------
 gowebapp | exercises | table | postgres
 gowebapp | images    | table | postgres
 gowebapp | sets      | table | postgres
 gowebapp | users     | table | postgres
 gowebapp | workouts  | table | postgres
(5 rows)

Now that we have completed setting up our database, we are ready to move to the next section, where we will be setting up sqlc to generate the Go files.

 

Generating CRUD with sqlc

CRUD stands for Create, Read, Update, and Delete, which refers to all the major functions that are inherent to relational databases. In this section, we will do the following for the application:

  • Complete the sqlc configuration file
  • Create SQL query files

Once done, we will be able to autogenerate the different files required to allow us to perform CRUD operations to the database from the application. First, open sqlc.yaml and enter the following configuration:

---
version: '1'
packages:
 - name: chapter1
   path: gen
   schema: db/
   queries: queries/
   engine: postgresql
   emit_db_tags: true
   emit_interface: false
   emit_exact_table_names: false
   emit_empty_slices: false
   emit_exported_queries: false
   emit_json_tags: true
   json_tags_case_style: camel
   output_files_suffix: _gen
   emit_prepared_queries: false

Our application directory now contains everything we need for the database, and sqlc will autogenerate the .go files. The following is how the application directory and files will look:

.
├── db
│   └── schema.sql
├── go.mod
├── queries
│   └── query.sql
└── sqlc.yaml

We can run sqlc to generate the .go files using the following command:

sqlc generate 

By default, sqlc will look for the sqlc.yaml file. If the filename is different, you can specify it using the -f flag as follows:

sqlc generate -f sqlc.yaml 

Once the operation completes, there will be no output; however, a new directory called gen will be generated as shown here:

./gen/
├── db.go
├── models.go
└── query.sql_gen.go

We have completed the auto-generation process using sqlc; now, let’s take a look at the schema and queries that sqlc uses to generate the code.

The following is a snippet of the schema.sql file that is used by sqlc to understand the structure of the database:

CREATE SCHEMA IF NOT EXISTS gowebapp;
CREATE TABLE gowebapp.users (
    User_ID        BIGSERIAL PRIMARY KEY,
    User_Name      text NOT NULL,
....
);
....
CREATE TABLE gowebapp.sets (
    Set_ID      BIGSERIAL PRIMARY KEY,
    Exercise_ID BIGINT NOT NULL,
    Weight      INT NOT NULL DEFAULT 0
);

The other file sqlc uses is the query file. The query file contains all the relevant queries that will perform CRUD operations based on the database structure given here. The following is a snippet of the query.sql file:

-- name: ListUsers :many
-- get all users ordered by the username
SELECT *
FROM gowebapp.users
ORDER BY user_name;
...
-- name: DeleteUserImage :exec
-- delete a particular user's image
DELETE
FROM gowebapp.images i
WHERE i.user_id = $1;
...
-- name: UpsertExercise :one
-- insert or update exercise of a particular id
INSERT INTO gowebapp.exercises (Exercise_Name)
VALUES ($1) ON CONFLICT (Exercise_ID) DO
UPDATE
    SET Exercise_Name = EXCLUDED.Exercise_Name
    RETURNING Exercise_ID;
-- name: CreateUserImage :one
-- insert a new image
INSERT INTO gowebapp.images (User_ID, Content_Type,
                             Image_Data)
values ($1,
        $2,
        $3) RETURNING *;
...

Using query.sql and schema.sql, sqlc will automatically generate all the relevant .go files, combining the information from these two files and allowing the application to perform CRUD operations on the database through ordinary Go structs and methods. The annotation after each query name tells sqlc what kind of method to generate: :one returns a single row, :many returns a slice of rows, and :exec returns only an error.

The last piece that we want to take a look at is the generated Go files. As shown previously, there are three auto-generated files inside the gen folder: db.go, models.go, and query.sql_gen.go.

Let’s take a look at each one of them to understand what they contain and how they will be used in our application:

  • db.go:

This file contains an interface that will be used by the other auto-generated files to make SQL calls to the database. It also contains functions to create the Go struct that is used to perform CRUD operations.

The New function is used to create a Queries struct, passing in a DBTX value. The DBTX interface is satisfied by both sql.DB and sql.Conn.

The WithTx function is used to wrap the Queries object in a database transaction; this is useful in situations where an update operation spans multiple tables that need to be committed in a single database transaction (a usage sketch follows this list):

func New(db DBTX) *Queries {
  return &Queries{db: db}
}
func (q *Queries) WithTx(tx *sql.Tx) *Queries {
  return &Queries{
    db: tx,
  }
}
  • models.go:

This file contains the struct of the tables in the database:

type GowebappExercise struct {
  ExerciseID   int64  `db:"exercise_id" json:"exerciseID"`
  ExerciseName string `db:"exercise_name" json:"exerciseName"`
}
...
type GowebappWorkout struct {
  WorkoutID int64     `db:"workout_id" json:"workoutID"`
  UserID    int64     `db:"user_id" json:"userID"`
  SetID     int64     `db:"set_id" json:"setID"`
  StartDate time.Time `db:"start_date" json:"startDate"`
}
  • query.sql_gen.go:

This file contains the CRUD functions for the database, along with the different parameter structs that can be used to perform the operations:

const deleteUsers = `-- name: DeleteUsers :exec
DELETE FROM gowebapp.users
WHERE user_id = $1
`
func (q *Queries) DeleteUsers(ctx context.Context,
userID int64) error {
  _, err := q.db.ExecContext(ctx, deleteUsers, userID)
  return err
}
...
const getUsers = `-- name: GetUsers :one
SELECT user_id, user_name, pass_word_hash, name, config, created_at, is_enabled FROM gowebapp.users
WHERE user_id = $1 LIMIT 1
`
func (q *Queries) GetUsers(ctx context.Context, userID int64) (GowebappUser, error) {
  row := q.db.QueryRowContext(ctx, getUsers, userID)
  var i GowebappUser
  err := row.Scan(
           &i.UserID,
           &i.UserName,
           &i.PassWordHash,
           &i.Name,
           &i.Config,
           &i.CreatedAt,
           &i.IsEnabled,
  )
  return i, err
}
...
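
To see how WithTx fits together with the generated methods, the following is a minimal sketch of running two of the generated queries inside one transaction. The import path of the generated package is an assumption (match it to your go.mod module); DeleteUsers is shown above, while DeleteUserImage's signature is inferred from its query and may differ slightly:

package store

import (
	"context"
	"database/sql"

	// Assumed import path for the sqlc-generated package; adjust to your module.
	chapter1 "github.com/yourmodule/chapter1/gen"
)

// deleteUserAndImages removes a user's images and then the user itself in a
// single transaction: either both deletes are committed or both are rolled back.
func deleteUserAndImages(ctx context.Context, db *sql.DB, q *chapter1.Queries, userID int64) error {
	tx, err := db.BeginTx(ctx, nil)
	if err != nil {
		return err
	}
	defer tx.Rollback() // becomes a no-op once Commit succeeds

	qtx := q.WithTx(tx)
	if err := qtx.DeleteUserImage(ctx, userID); err != nil {
		return err
	}
	if err := qtx.DeleteUsers(ctx, userID); err != nil {
		return err
	}
	return tx.Commit()
}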

Now that the database is ready and the CRUD code has been auto-generated, let's try all of this out by doing a simple insert operation into the users table.

The following is a snippet of main.go:

package main
import (
  ...
)
func main() {
  ...
  // Open the database
  db, err := sql.Open("postgres", dbURI)
  if err != nil {
    panic(err)
  }
  // Connectivity check
  if err := db.Ping(); err != nil {
    log.Fatalln("Error from database ping:", err)
  }
  // Create the store
  st := chapter1.New(db)
  st.CreateUsers(context.Background(),
  chapter1.CreateUsersParams{
    UserName:     "testuser",
    PassWordHash: "hash",
    Name:         "test",
  })
}

The app is doing the following:

  1. Initializing the URL and opening the database
  2. Pinging the database
  3. Creating a new user using the CreateUsers(..) function

Make sure you are in the chapter1 directory and build the application by running the following command:

go build -o chapter1

The compiler will generate a new executable called chapter1. Execute the file, and on a successful run, you will see the data inserted successfully into the users table:

2022/05/15 16:10:49 Done!
Name : test, ID : 1
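
If you would like to read the row back from Go rather than psql, a minimal sketch using the generated GetUsers method might look like the following. The module import path, driver, and connection string are assumptions; GetUsers and GowebappUser come from the generated code shown earlier:

package main

import (
	"context"
	"database/sql"
	"log"

	// Assumed Postgres driver registering the "postgres" driver name.
	_ "github.com/lib/pq"

	// Assumed import path for the sqlc-generated package; adjust to your module.
	chapter1 "github.com/yourmodule/chapter1/gen"
)

func main() {
	// Assumed connection string matching the Docker setup used in this chapter.
	dbURI := "postgres://postgres:mysecretpassword@localhost:5432/postgres?sslmode=disable"

	db, err := sql.Open("postgres", dbURI)
	if err != nil {
		log.Fatalln("open:", err)
	}
	defer db.Close()

	st := chapter1.New(db)

	// Fetch the user with ID 1 that was inserted by the sample main.go.
	u, err := st.GetUsers(context.Background(), 1)
	if err != nil {
		log.Fatalln("get user:", err)
	}
	log.Printf("Name : %v, ID : %d", u.Name, u.UserID)
}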

We have completed setting up the database and used sqlc to generate the relevant Go code. In the next section, we are going to put everything together for ease of development.

 

Building the makefile

A makefile is a file used by the make utility; it contains a set of tasks consisting of different shell commands. Makefiles are most commonly used to perform operations such as compiling source code, installing executables, performing checks, and many more. The make utility is available for both macOS and Linux, while on Windows, you need to use Cygwin (https://www.cygwin.com/) or NMake (https://docs.microsoft.com/en-us/cpp/build/reference/nmake-reference).

We will create a makefile to automate the steps that we have performed in this chapter. This makes it easy to repeat the process when required without typing the commands manually. We are going to create a makefile that will perform tasks such as the following:

  • Bringing up/down Postgres
  • Generating code using sqlc

The makefile can be seen in the chapter1 directory; the following shows a snippet of the script:

..
.PHONY : postgresup postgresdown psql createdb teardown_recreate generate
postgresup:
    docker run --name test-postgres -v $(PWD):/usr/share/chapter1 -e POSTGRES_PASSWORD=$(DB_PWD) -p 5432:5432 -d $(DB_NAME)
...
# task to create database without typing it manually
createdb:
    docker exec -it test-postgres psql $(PSQLURL) -c "\i /usr/share/chapter1/db/schema.sql"
...

With the makefile, you can now bring up the database easily using this command:

make postgresup

The following is used to bring down the database:

make postgresdown

sqlc will need to be invoked to regenerate the auto-generated code whenever changes are made to the schema and SQL queries. You can use the following command to regenerate the files:

make generate 
 

Summary

In this chapter, we have covered the different stages that we need to go through to set up the database for our fitness application. We have also written a makefile to save us time by automating different database-related tasks that will be needed for the development process.

In the next chapter, we will look at logging for our sample application. Logging is a simple, yet crucial component. Applications use logging to provide visibility into the running state of an application.

About the Authors
  • Nanik Tolaram

    Nanik is based in Sydney and is a big fan of open source technologies and projects. He has developed software using many different programming languages, including Assembler, C, C++, Java, Go, and many others. He is passionate about building products and teams from the ground up, and loves to share his knowledge through meetup presentations and tutorials.

    Browse publications by this author
  • Nick Glynn

    Nick is a Sydney-based tech and product-focused entrepreneur and CTO with long experience building software projects used by thousands of people daily. Formerly, Nick was a technical consultant and trainer on Android, embedded Linux, security, C, and C++; he now works to help teams create and ship meaningful products, having exited his previous company, QuickaPay, which his current employer acquired.

    Browse publications by this author