โ† Back to blog
Validation

How to Validate JSON Against a Schema

Catch missing fields, wrong types, and invalid API payloads before they break your workflow.

2026-04-01 · 4 min read · Updated Apr 29, 2026

Valid JSON can still be wrong JSON.

A payload may parse correctly but still be missing required fields, using the wrong data types, or sending values your workflow does not expect.

This guide shows how to validate JSON against a schema so you can catch bad payloads before they move into storage, automation, analytics, or another API.

In Forge Json, a schema acts as a lightweight contract: a set of rules your JSON must follow.

This is a common pattern when you need to validate API responses, check webhook payloads, or make sure AI-generated JSON follows the structure your app expects.

💡 Tip

Paste your JSON into Forge Json and use Schema Validation to check whether the data matches the shape you expect.

This guide is for developers, data analysts, and operators working with API payloads, webhooks, integrations, or AI-generated JSON.

When to Use This

This approach is a good fit if:

  • your JSON is valid but not always reliable
  • you need required fields to be present before sending data downstream
  • you want to catch wrong types, missing fields, or unexpected values
  • you need a reusable validation report before a transform pipeline

Before and After

Before: the JSON is syntactically valid, but it contains null values and wrong types.

{
  "order_id": "ORD_1001",
  "customer_email": null,
  "total_amount": "42.50",
  "currency": "USD",
  "items": 2
}

After: the validation step reports the problems before this payload breaks the next workflow.

{
  "valid": false,
  "errors": [
    {
      "field": "customer_email",
      "message": "Expected string, received null"
    },
    {
      "field": "total_amount",
      "message": "Expected number, received string"
    },
    {
      "field": "items",
      "message": "Expected array, received number"
    }
  ]
}

If you also need to clean messy API responses or normalize e-commerce orders with AI Draft, validation is a useful guardrail before or after the transformation step.

Why JSON Validation Matters

JSON syntax only tells you whether the data is formatted correctly. It does not tell you whether the data is useful, complete, or safe to pass into the next system.

That becomes painful when you need:

  • required fields like order_id, customer_email, or created_at
  • numbers to stay numbers instead of strings
  • arrays to contain the expected item shape
  • predictable payloads before storing, comparing, or exporting data
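The gap between "parses" and "usable" is easy to demonstrate. In the sketch below (plain Python, not a Forge Json feature), json.loads accepts the problem payload without complaint, yet every expectation in the list above fails:

```python
import json

# The payload from the "Before" example: syntactically valid JSON.
raw = ('{"order_id": "ORD_1001", "customer_email": null, '
       '"total_amount": "42.50", "currency": "USD", "items": 2}')

payload = json.loads(raw)  # parses with no exception

# ...but the content still violates the workflow's expectations:
assert payload["customer_email"] is None         # required value arrived as null
assert isinstance(payload["total_amount"], str)  # a number arrived as a string
assert isinstance(payload["items"], int)         # an array arrived as a count
```

Parsing only proved the payload is well-formed JSON; it said nothing about the shape the next system needs.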

Without a repeatable validation workflow, these checks often end up scattered across one-off scripts, API handlers, or spreadsheet cleanup steps.

The raw JSON example above shows a common order payload that looks usable at first but fails the schema rules your downstream system expects.

How to Validate It with Forge Json

The validation step produces a contract-style report you can use before any pipeline step.

You can copy this setup directly:

{
  "schema": {
    "type": "object",
    "required": ["order_id", "customer_email", "total_amount", "currency", "items"],
    "properties": {
      "order_id": {
        "type": "string"
      },
      "customer_email": {
        "type": "string"
      },
      "total_amount": {
        "type": "number"
      },
      "currency": {
        "type": "string",
        "enum": ["USD", "EUR", "GBP"]
      },
      "items": {
        "type": "array",
        "items": {
          "type": "object",
          "required": ["sku", "qty"],
          "properties": {
            "sku": {
              "type": "string"
            },
            "qty": {
              "type": "number"
            }
          }
        }
      }
    }
  }
}

For this example, the workflow produces a short set of useful checks:

  1. Confirm the payload is an object with the required top-level fields.
  2. Check that each field has the expected type.
  3. Report invalid fields before the payload moves into the next pipeline step.
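As an illustration of those three steps (and not Forge Json's actual implementation), a minimal schema-driven checker can be sketched in a few dozen lines of Python. It handles only the keywords this guide uses (type, required, properties, enum, items) and produces a report in the same shape as the output above:

```python
# Minimal sketch of a schema-driven validator. Illustration only:
# it covers just the keywords used in this guide, not full JSON Schema.

PY_TYPES = {"object": dict, "array": list, "string": str, "number": (int, float)}
JSON_NAMES = {dict: "object", list: "array", str: "string", int: "number",
              float: "number", bool: "boolean", type(None): "null"}

def validate(value, schema, field="$"):
    errors = []
    expected = schema.get("type")
    # Step 2: check that the value has the expected type.
    if expected and not isinstance(value, PY_TYPES[expected]):
        received = JSON_NAMES.get(type(value), type(value).__name__)
        errors.append({"field": field,
                       "message": f"Expected {expected}, received {received}"})
        return errors  # wrong type: deeper checks would not make sense
    if expected == "object":
        # Step 1: confirm the required fields are present.
        for name in schema.get("required", []):
            if name not in value:
                errors.append({"field": name, "message": "Required field missing"})
        for name, sub in schema.get("properties", {}).items():
            if name in value:
                errors.extend(validate(value[name], sub, name))
    elif expected == "array":
        for i, item in enumerate(value):
            errors.extend(validate(item, schema.get("items", {}), f"{field}[{i}]"))
    if "enum" in schema and value not in schema["enum"]:
        errors.append({"field": field,
                       "message": f"Value must be one of {schema['enum']}"})
    return errors

schema = {
    "type": "object",
    "required": ["order_id", "customer_email", "total_amount", "currency", "items"],
    "properties": {
        "order_id": {"type": "string"},
        "customer_email": {"type": "string"},
        "total_amount": {"type": "number"},
        "currency": {"type": "string", "enum": ["USD", "EUR", "GBP"]},
        "items": {"type": "array",
                  "items": {"type": "object", "required": ["sku", "qty"],
                            "properties": {"sku": {"type": "string"},
                                           "qty": {"type": "number"}}}},
    },
}

payload = {"order_id": "ORD_1001", "customer_email": None,
           "total_amount": "42.50", "currency": "USD", "items": 2}

# Step 3: report invalid fields before the payload moves on.
errors = validate(payload, schema)
report = {"valid": not errors, "errors": errors}
```

Run against the "Before" payload, this sketch reports the same three field-level errors shown earlier: a null email, a stringified total, and an item count where an array was expected.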

Result

You end up with:

  • a clear valid result
  • field-level errors that explain what failed
  • a reusable validation step you can run before cleanup, transformation, or export

If the validation rules are too strict, start smaller. First require the fields your workflow truly needs, then add enum checks, nested array checks, and stricter field rules once the core schema is stable.
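For example, a first-pass version of the schema above might require only the two fields the workflow truly needs, leaving the enum and nested item checks for later:

{
  "schema": {
    "type": "object",
    "required": ["order_id", "total_amount"],
    "properties": {
      "order_id": { "type": "string" },
      "total_amount": { "type": "number" }
    }
  }
}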

The support material below shows the sample input, schema config, and validation result.

This pattern works well for API payloads, webhook events, AI-generated JSON, and any workflow where the next step expects a predictable structure.


Use the example panel below to open this sample input and validation workflow directly in the editor.

FAQ

What does it mean to validate JSON against a schema?

It means checking valid JSON against rules for required fields, expected types, allowed values, and nested structures.

Is JSON Schema validation the same as checking valid JSON?

No. Valid JSON only checks syntax. Schema validation checks whether the data follows the structure your workflow expects.

When should I validate JSON in a pipeline?

Validate JSON before a cleanup, transform, storage, or API step when the next system depends on a predictable payload shape.
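As a sketch of that placement, a pipeline step can gate on the validation report before anything moves downstream. The names here (process_order, the validator, and the forwarding step) are hypothetical, not part of Forge Json:

```python
# Hypothetical pipeline gate: nothing moves downstream unless the
# validation report says the payload is safe to pass on.
def process_order(payload, validate, forward):
    report = validate(payload)  # expected shape: {"valid": bool, "errors": [...]}
    if not report["valid"]:
        return {"status": "rejected", "errors": report["errors"]}
    return forward(payload)

# Usage with stub steps:
accept = lambda p: {"valid": True, "errors": []}
reject = lambda p: {"valid": False,
                    "errors": [{"field": "items", "message": "Expected array"}]}
ship = lambda p: {"status": "forwarded"}

rejected = process_order({}, reject, ship)
forwarded = process_order({}, accept, ship)
```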

Support material

Practical example and product context

Use these examples to understand the transformation and apply the same workflow in your own JSON tasks.

Example Transformation

Input
{
  "order_id": "ORD_1001",
  "customer_email": null,
  "total_amount": "42.50",
  "currency": "USD",
  "items": 2
}
Config
{
  "schema": {
    "type": "object",
    "required": [
      "order_id",
      "customer_email",
      "total_amount",
      "currency",
      "items"
    ],
    "properties": {
      "order_id": {
        "type": "string"
      },
      "customer_email": {
        "type": "string"
      },
      "total_amount": {
        "type": "number"
      },
      "currency": {
        "type": "string",
        "enum": [
          "USD",
          "EUR",
          "GBP"
        ]
      },
      "items": {
        "type": "array",
        "items": {
          "type": "object",
          "required": [
            "sku",
            "qty"
          ],
          "properties": {
            "sku": {
              "type": "string"
            },
            "qty": {
              "type": "number"
            }
          }
        }
      }
    }
  }
}
↓
Output
{
  "valid": false,
  "errors": [
    {
      "field": "customer_email",
      "message": "Expected string, received null"
    },
    {
      "field": "total_amount",
      "message": "Expected number, received string"
    },
    {
      "field": "items",
      "message": "Expected array, received number"
    }
  ]
}
Built with Validation utility
Open the sample input and validation workflow in the editor.
View Utility

Related Articles

Continue with another practical guide in the same workflow area.