Dynamic SQL create, import

We are getting an Excel file from a customer.
We don't know the column names of the Excel file in advance.

We have to create the table at runtime based on the Excel column names,
then import the Excel data into MySQL.

In the SELECT command we need dynamic binding of the columns.

Hi @packs,

At which step of this scenario do you have problems?
What did you try that did not work?
Can you share a short version of the code that fails?

We are getting an Excel file from a customer.
We don't know how many columns are in the Excel file (we have to read the Excel columns).
We have to store the Excel file data in a table (the columns should be created dynamically from the Excel columns; all are text fields).
We have to show the table data to the user (we need a dynamic structure to Scan into).

Hi, Prakash,

I recommend you look into the Excelize package to handle processing the Excel file: excelize package - github.com/xuri/excelize/v2 - Go Packages.

You should be able to generate the CREATE TABLE command pretty easily here. I haven’t tested this, but it should be a good starting point for generating the DDL to create the table (written for Microsoft SQL Server, but it should be easy to tweak to work with other RDBMSs):

type textColumn struct {
    name   string
    length int
}

// Requires the "strings" and "strconv" imports.
func createCreateTableSQLForTextColumnTable(tableName string, textColumns []textColumn) string {
    sb := strings.Builder{}
    sb.WriteString("CREATE TABLE ")
    sb.WriteString(tableName)
    sb.WriteString(" (\n")
    for i, textCol := range textColumns {
        sb.WriteByte('\t')
        sb.WriteString(textCol.name)
        sb.WriteString(" VARCHAR(")
        sb.WriteString(strconv.Itoa(textCol.length))
        sb.WriteString(")")
        if i < len(textColumns)-1 {
            sb.WriteString(",")
        }
        sb.WriteString("\n")
    }
    sb.WriteString(");")
    return sb.String()
}

Why do you need a dynamic structure? Because the columns are all strings, can’t you just call (*sql.Rows).Scan/(*sql.Row).Scan and scan into a []string slice?

func scanTextColumnTable(scanner interface{ Scan(...interface{}) error }, fields []string) error {
    vs := make([]interface{}, len(fields))
    for i := range fields {
        vs[i] = &fields[i]
    }
    return scanner.Scan(vs...)
}

func scanTextColumnTable(scanner interface{ Scan(...interface{}) error }, fields []string) error {
    vs := make([]interface{}, len(fields))
    for i := range fields {
        vs[i] = &fields[i]
    }
    return scanner.Scan(vs...)
}

This function is not clear to me.
How will I get the field names?

`var result [][]string
var _Columnname []string

	allrows, err := tx.Queryx(appquery.Select_ccsall)
	if err != nil {
		return err
	}
	defer allrows.Close()

	_Columnname, err = allrows.Columns()
	if err != nil {
		return err
	}
	length := len(_Columnname)

	for allrows.Next() {
		// Allocate a fresh container for every row; otherwise each
		// entry appended to result points at the same backing slice
		// and every row ends up holding the last row's values.
		pointers := make([]interface{}, length)
		container := make([]string, length)
		for i := range pointers {
			pointers[i] = &container[i]
		}
		if err := allrows.Scan(pointers...); err != nil {
			return err
		}
		result = append(result, container)
	}
`

This method gives the output in array format. We need JSON format.

`func queryToJson(db *sql.DB, query string, args ...interface{}) ([]byte, error) {
// an array of JSON objects
// the map key is the field name
var objects []map[string]interface{}

rows, err := db.Query(query, args...)
if err != nil {
	return nil, err
}
for rows.Next() {
	// figure out what columns were returned
	// the column names will be the JSON object field keys
	columns, err := rows.ColumnTypes()
	if err != nil {
		return nil, err
	}

	// Scan needs an array of pointers to the values it is setting
	// This creates the object and sets the values correctly
	values := make([]interface{}, len(columns))
	object := map[string]interface{}{}
	for i, column := range columns {
		object[column.Name()] = reflect.New(column.ScanType()).Interface()
		values[i] = object[column.Name()]
	}

	err = rows.Scan(values...)
	if err != nil {
		return nil, err
	}

	objects = append(objects, object)
}

// indent because I want to read the output
return json.MarshalIndent(objects, "", "\t")

}
`

This code is working for me, but I want the columns in the same order as in the database table.

You could:

  1. Use your database’s INFORMATION_SCHEMA (or whatever else your particular RDBMS uses to describe its schema).
  2. Or you could store the information yourself (you could create a table to keep track of your dynamic tables’ column names).
  3. Or you could use the (*sql.Rows).Columns function to get the column names.

I assume you mean that you want the output to look like {"columnName0": "columnValue0", "columnName1": "columnValue1"}. In that case, you could combine the contents of _Columnname and container into a map before you return it:

m := make(map[string]string, len(_Columnname))
for i, k := range _Columnname {
    m[k] = container[i]
}
return m  // If you pass this to json.Marshal, you will get a JSON object.

That is hard! In Go, maps are unordered and JSON objects are unordered per the JSON specification, so you’ll have to find some 3rd party JSON library to handle it. Maybe someone here will have a suggestion.

Do you know of any library for ordering the columns from a database table (field order)?

Does anybody have any options? My customer is not accepting alphabetical order. He wants the data visualized in the same order as the fields in the table.

We are ready to pay for this development.

I searched for “golang ordered json” and found this package that seems like it will do what you want: GitHub - virtuald/go-ordered-json: A fork of the golang encoding/json package to support ordered JSON decoding/encoding

That being said, why does the customer care about what the JSON looks like? The web application (or whatever user interface) should be able to show the data in any order regardless of the JSON layout.
