Garbage value from JSON?

This is the desired result: DB Fiddle - SQL Database Playground

pgAdmin shows about the same result:
[screenshot]

But from Go it puts garbage instead of the JSON value:

[{"status_id":0,"val":"IkFrdGl2Ig=="},{"status_id":1,"val":"IkluYWt0aXYi"}]

The query:

WITH list AS 
(SELECT status_id, json_array_elements(lang) as row FROM status)
SELECT status_id, row -> 'val' as val FROM list
WHERE row ->> 'key' = $1

The Go code:

// query to return a list with status languages
func getlang(query string, val string) interface{} {
	if len(query) == 0 {
		return nil
	}

	var list []map[string]interface{}
	rows, err := db.Queryx(query, val)
	if err != nil {
		log(err.Error())
		return nil // rows is nil on error; using it would panic
	}
	defer rows.Close()

	for rows.Next() {
		row := make(map[string]interface{})
		if err := rows.MapScan(row); err != nil {
			log(err.Error())
			continue
		}
		list = append(list, row)
	}
	if err := rows.Err(); err != nil {
		log(err.Error())
	}

	if len(list) == 0 {
		return "norec"
	}
	return list
}

The query is OK outside Go, but the value is replaced with garbage,
no matter whether I use a placeholder or not.
What am I doing wrong?

This is not "garbage":

$ base64 --decode <<< IkFrdGl2Ig==
"Aktiv"                                                          
$ base64 --decode <<< IkluYWt0aXYi
"Inaktiv"

Why do I have to decode only the "val", and not the rest of the query result?

And why do all other queries work without decoding, but not this one?

You haven't shown any other queries, nor have you shown any details about the tables and columns in question.

Though from the pgAdmin screenshot it looks as if the val column is a JSON column. I'm not sure what type the other things are which "work".

The other queries are not using JSON.

[screenshot]

But the question remains: where and how to decode, and why only the ONE JSON column? All other columns are correct UTF-8.

Is nothing else using JSON, or is all other JSON "fine"?

This is my first JSON column. The others use "normal" columns. And it is just the JSON column that is not UTF-8.

Which DB driver do you use? Does it support JSON columns natively? Why do you use a JSON column at all for just plain strings?

How do you convert the result of your DB query into the JSON you shared? What does it look like if you inspect the query result directly using fmt.Printf("%#v", list)?

I found one solution:

instead of using the "JSON object"
row -> 'val' (meaning "Active")

I used the "value"
row ->> 'val' (meaning Active)

The question remains: why can't Go handle the JSON object as UTF-8 like DB Fiddle and pgAdmin do?

That's why I asked which driver you use and whether it supports JSON columns.

From other languages I know that there is often only partial support for JSON columns, e.g. Elixir's Ecto only supports maps and arrays in JSON columns, and the values have to be homogeneously typed.

Got your point pretty clearly here. You have made it a lot easier for me.

How do I know if these drivers support JSON?

The latter mentions in the README that it is in maintenance mode, though its description reads more like a self-deprecation in favour of GitHub - jackc/pgx: PostgreSQL driver and toolkit for Go, which in turn explicitly mentions JSON and JSONB support in its README.


@Sibert The driver question and answer from @niamul21 is the right answer for this, but it might miss a nice recommendation if you want the struct scanning from sqlx but with the native-level pgx support.

In this case, lib/pq with sqlx means you aren't using native Postgres types, and everything goes over to the db as text. The pgx library supports all the Postgres types, can have better performance, and gets security updates (since it isn't archived like the repo you're using). Finally, if you want the features of sqlx (a fine lib, but it uses just the database/sql interface and can't use native interfaces), you can probably move to using scany: https://github.com/georgysavva/scany

