A Comprehensive Guide to Formatting JSON Data as Terminal Tables Using jq and Bash Tools

Dec 06, 2025 · Programming

Keywords: jq | JSON formatting | Bash scripting

Abstract: This article explores how to leverage jq's @tsv filter and Bash tools like column and awk to transform JSON arrays into structured terminal table outputs. By analyzing best practices, it explains data filtering, header generation, automatic separator line creation, and column alignment techniques to help developers efficiently handle JSON data visualization needs.

In Bash scripting, processing JSON data often requires converting it into a readable table format for terminal output. jq, as a powerful JSON processing tool, combined with other Bash commands, can flexibly achieve this goal. Based on a real-world Q&A scenario, this article systematically introduces how to convert a JSON array containing user information into a neat table with headers.

Core Problem and Data Example

Assume we have a JSON array containing multiple user objects, each with name, id, and email attributes. The goal is to output a table in the terminal with only ID and Name columns, adding headers and separator lines. The original data example is as follows:

[{
    "name": "George",
    "id": 12,
    "email": "george@domain.example"
}, {
    "name": "Jack",
    "id": 18,
    "email": "jack@domain.example"
}, {
    "name": "Joe",
    "id": 19,
    "email": "joe@domain.example"
}]

The desired output format is:

ID        Name
=================
12        George
18        Jack
19        Joe

Basic Solution: Using jq's @tsv Filter

jq's @tsv (Tab-Separated Values) filter can convert arrays into tab-separated text, which is the foundation for generating tables. First, we need to extract the id and name properties from each object, ignoring email. The basic command is:

jq -r '.[] | [.id, .name] | @tsv'

The output is:

12	George
18	Jack
19	Joe

Here, the -r option ensures raw string output (without quotes), the pipe | connects filters, [.id, .name] creates an array with the desired properties, and @tsv converts it to tab-separated format.
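Putting the pieces together, here is a minimal self-contained sketch that feeds the sample array to jq on standard input via a heredoc (in a real script you would pass a file name or pipe the data in):

```shell
#!/bin/sh
# Extract id and name from each object and emit tab-separated rows.
# The heredoc stands in for a real data source such as a file or an API call.
jq -r '.[] | [.id, .name] | @tsv' <<'JSON'
[{"name": "George", "id": 12, "email": "george@domain.example"},
 {"name": "Jack",   "id": 18, "email": "jack@domain.example"},
 {"name": "Joe",    "id": 19, "email": "joe@domain.example"}]
JSON
```

Quoting the heredoc delimiter (`<<'JSON'`) keeps the shell from expanding anything inside the JSON body.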

Adding Headers and Separator Lines

To create a complete table, header rows and separator lines need to be added manually. This can be achieved by including static arrays in the jq expression:

jq -r '["ID","NAME"], ["--","------"], (.[] | [.id, .name]) | @tsv'

Output:

ID	NAME
--	------
12	George
18	Jack
19	Joe

This method is straightforward, but separator line lengths need manual adjustment to match content. For automation, the map function can generate dashes based on header lengths:

jq -r '(["ID","NAME"] | (., map(length*"-"))), (.[] | [.id, .name]) | @tsv'

Here, map(length*"-") generates a run of hyphens matching each header's length, so the separator line adjusts automatically when headers change.
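Run against the sample data, the generated separators track the header widths exactly, so "NAME" yields four dashes rather than the six typed by hand above:

```shell
# The separator row is computed from the headers themselves:
# "ID" (length 2) yields "--" and "NAME" (length 4) yields "----".
printf '%s' '[{"name":"George","id":12},{"name":"Jack","id":18},{"name":"Joe","id":19}]' |
jq -r '(["ID","NAME"] | (., map(length*"-"))), (.[] | [.id, .name]) | @tsv'
```

Multiplying a string by a number in jq repeats the string, which is what makes length*"-" work.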

Beautifying Tables with the column Command

Although @tsv produces tab-separated text, it may not align properly in the terminal. Combining with the column command enables automatic column alignment:

jq -r '(["ID","NAME"] | (., map(length*"-"))), (.[] | [.id, .name]) | @tsv' | column -ts $'\t'

The output is neater:

ID  NAME
--  ------
12  George
18  Jack
19  Joe

Here, -t enables table mode and -s $'\t' sets tab as the input separator (written combined as -ts above). Note that the $'\t' quoting is a Bash/zsh feature; in plain POSIX sh, pass a literal tab character instead.
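The column utility is not part of POSIX, and its options differ between the util-linux and BSD variants. As a fallback sketch, awk's printf can pad the columns itself; the 10-character width and the 17 "=" signs below are arbitrary choices that mirror the desired output shown earlier:

```shell
# Fallback alignment without column(1): pad the first field to a fixed
# width with awk's printf. Widen the %-10s field if IDs can grow longer.
printf '%s' '[{"name":"George","id":12},{"name":"Jack","id":18},{"name":"Joe","id":19}]' |
jq -r '.[] | [.id, .name] | @tsv' |
awk -F '\t' 'BEGIN { printf "%-10s%s\n", "ID", "Name"
                     print  "=================" }
                   { printf "%-10s%s\n", $1, $2 }'
```

Unlike column, this does not compute widths from the data, so it only suits columns whose maximum width is known in advance.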

Advanced Techniques: Fine Formatting with awk

For more complex formatting needs, awk can be integrated. For example, using the @csv filter to generate comma-separated values, then adding headers and separators with awk:

jq -r '.[] | [.id, .name] | @csv' | awk -v FS="," 'BEGIN{print "ID\tName"; print "============"}{printf "%s\t%s%s", $1, $2, ORS}'

Output:

ID	Name
============
12	"George"
18	"Jack"
19	"Joe"

This method allows custom delimiters and formats, but note that @csv retains string quotes, which may require additional handling.
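One way to deal with those quotes, an addition here rather than part of the original Q&A, is to strip them with gsub in awk before printing:

```shell
# Remove the double quotes that @csv wraps around string values.
printf '%s' '[{"name":"George","id":12},{"name":"Jack","id":18}]' |
jq -r '.[] | [.id, .name] | @csv' |
awk -F ',' 'BEGIN{print "ID\tName"; print "============"}
            {gsub(/"/, ""); printf "%s\t%s\n", $1, $2}'
```

Of course, switching back to @tsv sidesteps the problem entirely, since @tsv does not quote strings.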

Handling Complex Data Structures

In practical applications, JSON data is often more complex than this flat example. A single property can also be extracted on its own:

jq -r '.[] | .email'

This prints each user's email address on its own line. When a property holds an array rather than a scalar value, for instance several addresses per user, it must be flattened before it fits into a single table cell.
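A minimal sketch of that flattening, assuming a hypothetical variant of the data in which email is an array of addresses, uses jq's join to collapse the array into one cell:

```shell
# Hypothetical data shape: "email" is an array of addresses per user.
# join(", ") collapses each array into a single cell before @tsv.
printf '%s' '[{"id": 12, "email": ["g@x.example", "gw@x.example"]},
              {"id": 18, "email": ["j@x.example"]}]' |
jq -r '.[] | [.id, (.email | join(", "))] | @tsv'
```

The same pattern applies to any array-valued property that should occupy one column.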

Summary and Best Practices

Based on the above analysis, the @tsv filter combined with the column command is the recommended standard approach: it is simple, and @tsv escapes embedded tabs, newlines, and backslashes in field values, so awkward data cannot break the table layout. The key steps are: filter the desired properties, build an array per row, apply @tsv, prepend headers and a separator line, and align with column. For dynamic headers, map(length*"-") auto-generates separators of matching width. When finer control is needed, reach for awk, but weigh the added complexity.

In conclusion, through the synergy of jq and Bash toolchains, JSON data can be efficiently converted into terminal-friendly tables, enhancing data readability and script utility.

Copyright Notice: All rights in this article are reserved by the operators of DevGex. Reasonable sharing and citation are welcome; any reproduction, excerpting, or re-publication without prior permission is prohibited.