Is there an easy way to get the visible length of a string? My program returns the following string:
\e[0;97;44;1m per \e[0;34;104m\e[0m\e[0m\e[0;97;104;1m 2021-11-15 13:06 \e[0;94;107m\e[0m\e[0m\e[0;94;107;1m ~/code/goprompt \e[0;97;104m\e[0m\e[0m\e[0;97;104;1m go 1.17 \e[0;94;44m\e[0m\e[0m\e[0;97;44;1m master|~3 \e[0;34m\e[0m\e[0m
This string is over 200 bytes long, but most of the characters in it are non-printable, such as color escape codes. The output in the terminal is actually this:
These terminal escape sequences are specific to Unix shells (and there, probably even to a particular shell such as bash or zsh). They predate the Unicode era, so Unicode functions like RuneCountInString do not apply here.
You would need a custom length function that is aware of these shell escape sequences and the size of their visual representations in the terminal. Not sure if something like that exists in the wild.
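One way to sketch such a custom length function: strip anything that looks like a CSI color sequence with a regular expression, then count the remaining runes. This is an assumption-laden sketch, not a general-purpose solution: it only handles `ESC [ … letter` sequences (which covers the color codes above), and the name `visibleLength` is hypothetical.

```go
package main

import (
	"fmt"
	"regexp"
	"unicode/utf8"
)

// ansiEscape matches CSI sequences such as "\x1b[0;97;44;1m".
// Other escape types (OSC titles, cursor movement, etc.) are not handled.
var ansiEscape = regexp.MustCompile(`\x1b\[[0-9;]*[a-zA-Z]`)

// visibleLength strips ANSI escape sequences and counts the
// remaining runes, i.e. the characters the terminal would display.
func visibleLength(s string) int {
	return utf8.RuneCountInString(ansiEscape.ReplaceAllString(s, ""))
}

func main() {
	colored := "\x1b[0;97;44;1m per \x1b[0m"
	fmt.Println(len(colored))           // byte length, escape bytes included
	fmt.Println(visibleLength(colored)) // visible length only
}
```

Note that this still equates "runes" with "terminal columns", which breaks for double-width characters (CJK, some emoji); for those you would need a wcwidth-style lookup on top.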
Aha, I understand what you are saying. I will solve it manually then, by counting the length before the coloring is applied. Sounds doable… thanks!