Table JSON to Records Converter (Free Online Tool)

Convert table json to records online with a free tool for columns-and-rows exports, spreadsheet-style JSON, and API table responses. Paste the table payload, run the converter, and copy normal JSON objects.

Paste your JSON → Get results instantly (no signup)

⚡ Instant results · No signup · Runs in your browser


Works with:

  • Columns and rows JSON exports
  • Spreadsheet-style JSON format
  • API responses with separate headers and row arrays
  • Table envelopes from reporting tools


What is table json to records?

Table json to records is the process of converting a table-shaped JSON export into normal JSON objects. The awkward input usually has one array for column names and another array for row values, often shaped like { "columns": [...], "rows": [...] }. That format is common in reporting APIs, spreadsheet-style JSON exports, low-code tools, and dashboard downloads, but it is hard to use directly in application code.

This free online converter turns rows and columns JSON into an array of records where every row becomes an object and every column label becomes a key. Instead of reading cell 0, cell 1, and cell 2, downstream code can read projectName, owner, or completedTasks. That makes the data easier to validate, clean, flatten, export, diff, or send into a ForgeJSON pipeline.

How to convert table json to records

Paste the table JSON into the input panel and run the converter. The default settings support the nested table envelope used by many exports: column labels at columns[0].values[0].data, rows at rows, and row values at values[0].data.

If your API response uses a simpler spreadsheet JSON format, set the headers path to the array of column names and the rows path to the array of row arrays. For example, { "headers": ["Name", "Score"], "data": [["Ada", "10"]] } uses headers as the headers path, data as the rows path, and an empty row values path because each row is already an array.

Use camelCase keys when the output should be JavaScript-friendly. Preserve keys as-is when exact column names matter for audits or handoff. If rows have too many or too few cells, choose whether to ignore extras, include extra values, or fail fast before the data reaches Normalize JSON, JSON to CSV, or another pipeline step.

Table json to records example

Input
{
  "columns": [
    {
      "values": [
        {
          "data": ["Project Name", "Owner", "Creation Date", "Completed Tasks"]
        }
      ]
    }
  ],
  "rows": [
    {
      "values": [
        {
          "data": ["My Project 1", "Franklin", "7/1/2015", "387"]
        }
      ]
    },
    {
      "values": [
        {
          "data": ["My Project 2", "Beth", "7/12/2015", "402"]
        }
      ]
    }
  ]
}

Output
[
  {
    "projectName": "My Project 1",
    "owner": "Franklin",
    "creationDate": "7/1/2015",
    "completedTasks": "387"
  },
  {
    "projectName": "My Project 2",
    "owner": "Beth",
    "creationDate": "7/12/2015",
    "completedTasks": "402"
  }
]

This example shows the core conversion: the column array supplies field names, each row supplies cell values, and the converter zips them together by index. The result is normal JSON records that are easier to inspect and automate.
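In Python, that zip-by-index step can be sketched in a few lines (a minimal illustration assuming the default envelope paths and camelCase keys, not the tool's actual implementation):

```python
import re

def to_camel_case(label):
    # "Project Name" -> "projectName"
    words = [w for w in re.split(r"[^A-Za-z0-9]+", label.strip()) if w]
    return words[0].lower() + "".join(w.capitalize() for w in words[1:])

def table_to_records(table):
    # Default envelope paths: headers at columns[0].values[0].data,
    # rows at rows, cell values at values[0].data
    headers = [to_camel_case(h) for h in table["columns"][0]["values"][0]["data"]]
    records = []
    for row in table["rows"]:
        cells = row["values"][0]["data"]
        # dict(zip(...)) drops unmatched trailing cells, like the ignoreExtra default
        records.append(dict(zip(headers, cells)))
    return records
```

Mismatch handling, duplicate-header rules, and type inference would layer on top of this core loop.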

Common use cases of table json to records

Use table json to records when a service returns an API response rows columns shape instead of a normal array of objects. This happens in analytics products, spreadsheet integrations, database admin tools, reporting endpoints, and business intelligence exports that need to preserve table metadata.

Developers often use this converter before importing data into tests, scripts, dashboards, or LLM workflows. Once the table envelope becomes records, you can use Extract Fields from JSON to keep only important columns, Clean JSON to remove empty values, or JSON to CSV to hand the cleaned result back to spreadsheet users.

It is also useful when you need to normalize table JSON from several vendors. One API may call the headers columns, another may call them fields, and another may wrap each row in a values object. Configurable paths let you convert those variants without writing a one-off script for every export.

Common errors when using table json to records

The most common error is pointing the headers path or rows path at the wrong level. If the headers path resolves to an object instead of an array, the converter cannot build record keys. If the rows path resolves to a wrapper object instead of the row list, the tool cannot iterate through records.

Length mismatches are another frequent issue. A row with fewer cells than headers creates missing values. A row with extra cells has no obvious column name. Use strict mismatch handling when the source data should be complete, or ignore extras when the export includes trailing metadata you do not need.
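A rough sketch of the three mismatch strategies (the `zip_row` helper and the `extra0`-style keys are hypothetical, for illustration only; the tool's actual key naming may differ):

```python
def zip_row(headers, cells, mode="ignoreExtra"):
    if mode == "error" and len(cells) != len(headers):
        raise ValueError(f"expected {len(headers)} cells, got {len(cells)}")
    record = dict(zip(headers, cells))  # zip truncates to the shorter list
    if mode == "includeExtra":
        # Hypothetical naming: unmatched trailing cells get synthetic keys
        for i, cell in enumerate(cells[len(headers):]):
            record[f"extra{i}"] = cell
    return record
```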

Duplicate headers can also cause confusing output. Two columns named Status cannot both become status without a rule. The suffix option keeps both values by creating unique keys. If duplicate columns indicate a bad export, switch duplicate handling to error and validate the source first with the JSON Validator.
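One common suffix scheme looks like this (a sketch; the converter's exact suffix format is a configuration detail and may differ):

```python
def dedupe_headers(headers):
    # "status", "status" -> "status", "status2" (illustrative suffix scheme)
    seen = {}
    out = []
    for h in headers:
        if h in seen:
            seen[h] += 1
            out.append(f"{h}{seen[h]}")
        else:
            seen[h] = 1
            out.append(h)
    return out
```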

Why use this table json to records tool

This tool exists because the input shape is ugly and surprisingly common. A table envelope is compact for the system that produced it, but inconvenient for the people who need to work with it. Normal records are easier for humans to read and much easier for code to consume.

Using the browser tool is faster than writing a custom mapper when you are debugging a payload, testing an integration, or preparing a sample for another team. The output is visible immediately, and the same conversion can become part of a reusable pipeline when the workflow grows beyond a one-time paste.

The converter also keeps the transformation explicit. Paths, key casing, type inference, mismatch behavior, and duplicate header behavior are visible choices instead of hidden assumptions in a script.

Table json to records vs other approaches and APIs

Table JSON to records is different from CSV to JSON. CSV to JSON starts with text rows and delimiters. This tool starts with JSON that already represents a table but separates headers from row values. If your input is a .csv string, use CSV to JSON. If your input is { "columns": [...], "rows": [...] }, use this converter.

It is also different from Normalize JSON. Normalization makes existing records share the same shape. Table JSON conversion creates the records first. In many API workflows, you convert columns and rows to JSON records, then normalize the result so missing fields, extra fields, and key order are consistent.

For APIs, this conversion is often the first cleanup step after receiving a reporting response. Convert the table envelope, inspect the record keys, then continue with validation, field extraction, flattening, or CSV export depending on the downstream contract.

Example pipeline on GitHub

The ForgeJSON Pipeline Spec repo includes a portable Table JSON to Records pipeline you can inspect, copy, or version with your own tooling:

Table JSON to Records pipeline example

The linked file is a plain JSON pipeline document using the ForgeJSON Pipeline Spec.

Frequently asked questions

What is table JSON?

Table JSON is a JSON payload that stores column names separately from row values, often as `{ "columns": [...], "rows": [...] }`. It represents a table, but it is not yet a normal array of JSON records.

How does the tool turn rows and columns into objects?

The converter reads the header array, reads each row's cell array, and matches them by index. The first header becomes the key for the first cell, the second header becomes the key for the second cell, and so on.

What happens when a row has more or fewer cells than the columns?

You can choose the mismatch behavior. Ignore extra cells for forgiving exports, include extra cells when you need to preserve everything, or throw an error when the table shape must be exact.

Is this the same as CSV to JSON?

No. CSV to JSON parses text with delimiters such as commas or tabs. Table JSON to Records starts with JSON that already contains separate header and row arrays, then converts that table envelope into normal JSON objects.

Can I use this for API responses?

Yes. It is designed for API responses that return columns and rows instead of record objects. After conversion, the output can move into validation, cleanup, field extraction, CSV export, or another ForgeJSON pipeline step.


Advanced usage (optional)

Table JSON to Records

v1.0.0
Convert
object · destructive

Description

Convert table-like JSON exports into normal JSON records.

This utility reads one array of headers and one array of rows, then zips each
row's cells into an object by index.

Default Shape

The defaults support table envelopes like:

{
  "columns": [{ "values": [{ "data": ["Project Name", "Owner"] }] }],
  "rows": [{ "values": [{ "data": ["My Project 1", "Franklin"] }] }]
}

Default output is an array:

[
  {
    "projectName": "My Project 1",
    "owner": "Franklin"
  }
]

Set outputKey to wrap the result:

{
  "projects": [
    {
      "projectName": "My Project 1"
    }
  ]
}

Row Values Path

rowValuesPath is resolved from each row object. Use an empty string when each
row is already the cell array:

{
  "headers": ["Name", "Score"],
  "data": [
    ["Ada", "10"],
    ["Linus", "11"]
  ]
}

Config:

{
  "headersPath": "headers",
  "rowsPath": "data",
  "rowValuesPath": ""
}
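For illustration, a minimal resolver for this path grammar (dot segments plus bracketed indices) might look like the following sketch; `get_path` is a hypothetical helper, not part of the tool:

```python
import re

def get_path(obj, path):
    # An empty path means "use the value as-is" (each row is already the cell array)
    if not path:
        return obj
    # "columns[0].values[0].data" -> ["columns", "[0]", "values", "[0]", "data"]
    for part in re.findall(r"[^.\[\]]+|\[\d+\]", path):
        obj = obj[int(part[1:-1])] if part.startswith("[") else obj[part]
    return obj
```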

Configuration

  • Headers Path (string, default columns[0].values[0].data): path to the array of column/header labels
  • Rows Path (string, default rows): path to the array of row entries
  • Row Values Path (string, default values[0].data): path from each row to its cell array; use an empty value when each row is already an array
  • Output Key (string, default empty): optional wrapper key; leave empty to return the records array directly
  • Key Case (enum, default camelCase; options camelCase, none): convert header labels to camelCase or preserve them as-is
  • Infer Types (boolean, default false): convert numeric and boolean-like strings to JSON primitives
  • Length Mismatch (enum, default ignoreExtra; options ignoreExtra, includeExtra, error): how to handle rows with more or fewer cells than headers
  • Duplicate Headers (enum, default suffix; options suffix, error): suffix duplicate header keys or fail on duplicates
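As an illustration of the Infer Types option, a coercion function might behave like this sketch (assuming plain true/false spellings and standard int/float parsing; the tool's exact rules may differ):

```python
def infer_type(cell):
    # Only strings are candidates; other JSON values pass through unchanged
    if not isinstance(cell, str):
        return cell
    lowered = cell.strip().lower()
    if lowered in ("true", "false"):
        return lowered == "true"
    try:
        return int(cell)        # "387" -> 387
    except ValueError:
        pass
    try:
        return float(cell)      # "3.5" -> 3.5
    except ValueError:
        return cell             # "7/1/2015" stays a string
```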

Examples

AI Prompt
Convert this table JSON export into normal JSON records.
{
  "columns": [
    {
      "values": [
        {
          "data": ["Project Name", "Owner", "Creation Date", "Completed Tasks"]
        }
      ]
    }
  ],
  "rows": [
    {
      "values": [
        {
          "data": ["My Project 1", "Franklin", "7/1/2015", "387"]
        }
      ]
    },
    {
      "values": [
        {
          "data": ["My Project 2", "Beth", "7/12/2015", "402"]
        }
      ]
    }
  ]
}
Output
[
  {
    "projectName": "My Project 1",
    "owner": "Franklin",
    "creationDate": "7/1/2015",
    "completedTasks": "387"
  },
  {
    "projectName": "My Project 2",
    "owner": "Beth",
    "creationDate": "7/12/2015",
    "completedTasks": "402"
  }
]
Config

  Headers Path: columns[0].values[0].data
  Rows Path: rows
  Row Values Path: values[0].data
  Output Key: (empty)
  Key Case: camelCase
  Infer Types: OFF
  Length Mismatch: ignoreExtra
  Duplicate Headers: suffix

API Usage

POST /api/v1/utilities/convert.table-json-to-records
Example:
curl -X POST https://your-domain.com/api/v1/utilities/convert.table-json-to-records \
  -H "Authorization: Bearer YOUR_API_KEY" \
  -H "Content-Type: application/json" \
  -d '{"inputs":{"primary":{"columns":[{"values":[{"data":["Project Name","Owner","Creation Date","Completed Tasks"]}]}],"rows":[{"values":[{"data":["My Project 1","Franklin","7/1/2015","387"]}]},{"values":[{"data":["My Project 2","Beth","7/12/2015","402"]}]}]}},"config":{"headersPath":"columns[0].values[0].data","rowsPath":"rows","rowValuesPath":"values[0].data","outputKey":"","keyCase":"camelCase","inferTypes":false,"onLengthMismatch":"ignoreExtra","duplicateHeaderMode":"suffix"}}'
Response
[
  {
    "projectName": "My Project 1",
    "owner": "Franklin",
    "creationDate": "7/1/2015",
    "completedTasks": "387"
  },
  {
    "projectName": "My Project 2",
    "owner": "Beth",
    "creationDate": "7/12/2015",
    "completedTasks": "402"
  }
]
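The same request can be made from Python with only the standard library (a sketch; the domain and API key are placeholders carried over from the curl example, and only a subset of config options is shown):

```python
import json
import urllib.request

# Placeholder endpoint, matching the curl example above
API_URL = "https://your-domain.com/api/v1/utilities/convert.table-json-to-records"

def build_payload(table):
    # Defaults spelled out explicitly, as in the curl example
    return {
        "inputs": {"primary": table},
        "config": {
            "headersPath": "columns[0].values[0].data",
            "rowsPath": "rows",
            "rowValuesPath": "values[0].data",
            "keyCase": "camelCase",
        },
    }

def convert(table, api_key):
    req = urllib.request.Request(
        API_URL,
        data=json.dumps(build_payload(table)).encode("utf-8"),
        headers={
            "Authorization": f"Bearer {api_key}",  # YOUR_API_KEY placeholder
            "Content-Type": "application/json",
        },
    )
    with urllib.request.urlopen(req) as resp:
        return json.load(resp)
```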