How to check/look up UTF-8/Unicode characters on the command-line?

So far I've only found this superuser.com question: the top answer suggests installing uniutils, which doesn't appear to be packaged for Fedora. Another answer uses xxd (nothing seems to provide that command on Fedora; dnf only finds the perl-Data-HexDump-XXD package), and one user has written a Perl script.

Some files and folders (inside archives) have contained unknown and/or blank characters, causing various issues. The same has happened with certain text copied from the web. I'd love to be able to easily identify such characters and act accordingly (file an issue or otherwise contact the source).

Being able to supply a string directly, when needed, would be particularly useful.


For cleaning (i.e. renaming) local files and paths, I'm currently using detox.
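
In case it helps anyone, a dry run along these lines previews what detox would rename without changing anything (the path is only a placeholder):

```bash
# Preview detox renames recursively without modifying anything
# (-n/--dry-run, -r/--recursive, -v/--verbose)
detox -n -r -v ~/Downloads/extracted-archive/
```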

I normally use iconv for handling this from the command line:
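
The exact invocation isn't quoted here, but a sketch of the idea, assuming GNU iconv and a placeholder file name:

```bash
# Validate a file as UTF-8; iconv stops and reports the first
# invalid byte sequence it encounters
iconv -f UTF-8 -t UTF-8 suspicious.txt -o /dev/null

# Convert to ASCII to expose non-ASCII characters:
# //TRANSLIT approximates them, //IGNORE would simply drop them
iconv -f UTF-8 -t ASCII//TRANSLIT suspicious.txt
```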

If you want to work with strings instead of files, there’s piconv:
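
Again only a sketch, assuming piconv's -p/--perlqq fallback (piconv ships with perl-Encode, if I'm not mistaken); characters with no ASCII mapping should come out as \x{...} escapes:

```bash
# Pipe a string through piconv; unmappable characters are emitted
# as \x{...} escapes by the --perlqq fallback mode
echo 'naïve résumé' | piconv -f UTF-8 -t ASCII --perlqq
```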
