JSON to Excel: Convert JSON Files Into Spreadsheets the Right Way

Convert JSON to Excel using Power Query, VBA, Python, or online tools. Step-by-step methods with screenshots, error fixes, and nested array handling.


You opened a JSON file expecting clean rows and columns. Instead you got a wall of curly braces, square brackets, and quotation marks that looks more like punctuation soup than data. If that sounds familiar, you're in the right place. Converting JSON to Excel is one of those tasks that seems trivial until you actually try it on a real file with nested arrays, mixed types, or 50,000 records that crash your browser.

The good news? Excel has shipped with Power Query built in since Excel 2016, so it can read JSON natively, and modern versions make the process surprisingly painless once you know which path to take. The bad news? Picking the wrong method can flatten your nested data into a useless mess, skip records silently, or leave you wrestling with #VALUE! errors for an hour.

This guide walks through the four reliable ways to get JSON into Excel: Power Query (the built-in option), VBA (for repeatable workflows), Python with pandas (for serious automation), and online converters (when you need it done in 30 seconds). We'll also cover the gotchas, common error messages, and how to handle the awkward structures that trip up most tutorials.

JSON to Excel by the Numbers

  • 1.04M — rows Excel can hold per sheet
  • 2016+ — Excel versions with built-in Power Query
  • 32 MB — typical JSON file ceiling for a comfortable Excel import
  • 4 — reliable conversion methods

Why JSON Is Awkward in the First Place

JSON, short for JavaScript Object Notation, stores data as nested key-value pairs. A simple JSON might look like a flat list of customer records, each with a name, email, and phone number. That converts to Excel trivially. The trouble starts when objects contain other objects, or when arrays sit inside arrays.

Take a typical API response. The top level is an array of orders. Each order has a customer object, a shipping address object, and an array of line items. Each line item has its own product details. Mapping that into a flat spreadsheet means deciding which fields become columns, which records get duplicated, and which nested arrays get exploded into their own sheets.

Excel can't read your mind. You have to tell it how to flatten the structure, and that's where Power Query earns its keep. Before you convert anything, open the JSON in a text editor and look at the shape. Is it a single object, an array of objects, or an object containing nested arrays? The answer changes which import option to pick. If you can't tell, paste the first few lines into a JSON viewer like jsonlint.com — it'll render the hierarchy as a collapsible tree.
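If you'd rather run that check programmatically, here is a minimal Python sketch; the inline sample string is hypothetical and stands in for reading your actual file:

```python
import json

# Hypothetical sample; in practice, read your file with open(path).read().
raw = '{"orders": [{"id": 1, "items": [{"sku": "A1"}]}]}'
data = json.loads(raw)

# The top-level shape determines which import path to pick.
if isinstance(data, list):
    print(f"Top level is an array of {len(data)} items")
elif isinstance(data, dict):
    print(f"Top level is an object with keys: {list(data.keys())}")
```

An array means Power Query will hand you a List of Records; an object means you'll start from a single Record and convert it to a table.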

Power Query also supports parameter-driven imports. You can replace the hard-coded file path with a cell reference, meaning end users can change the JSON file location without ever opening the Power Query Editor. This is the difference between a one-off conversion and a reusable workbook.


Small flat JSON under 5 MB? Use an online converter. Need to repeat the import monthly? Use Power Query and save the query. Doing this in production or with millions of rows? Use Python with pandas. Working in Excel without internet? VBA is your fallback. The biggest mistake is treating all JSON files the same — the right method depends on size, frequency, and how nested the structure is.

Method 1: Power Query (The Built-In Way)

If you're on Excel 2016 or newer, Power Query is already installed and it's genuinely good at this. The workflow takes about five clicks for simple files and maybe fifteen minutes for complex nested ones. Open Excel, go to the Data tab, click Get Data, choose From File, then From JSON. Browse to your file, click Import. Excel hands you off to the Power Query Editor.

What you see next depends on the JSON structure. If it's an array, you'll get a single column called List with rows that say Record. If it's an object, you'll see a Record at the top with fields you need to expand. Either way, look for the small icon at the top of each column that has arrows pointing in opposite directions — that's the Expand button. Click it, choose which fields to bring out, and Power Query flattens that level. You'll often need to expand two or three levels before the data looks like a normal table.

Nested arrays are the most common stumbling block. Say each order has a line_items field that's a list of products. When you expand that column, Power Query asks if you want to Expand to New Rows (which duplicates the parent record for each child) or keep the values as a list. For most analytical work, expand to new rows. You'll get a row per line item with the order details repeated, which is exactly what tools like pivot tables want.

One often-missed feature is the Advanced Editor, opened from the View tab in the Power Query Editor. It shows the actual M code Power Query is generating (the formula bar above the preview shows the expression for whichever step is selected in the Applied Steps panel). Reading this code teaches you what's happening behind the scenes and helps when something breaks. You can also edit the code directly for advanced cases the UI doesn't expose.

The 5 Steps in Power Query

Import

Data tab > Get Data > From File > From JSON. Pick your file and Excel opens the Power Query Editor.

Convert to Table

If Power Query shows a single Record, click To Table on the ribbon to make it iterable.

Expand Columns

Click the expand icon on each column. Pick the fields you want to surface as columns.

Change Types

Power Query guesses types. Fix any wrong ones — dates and numbers especially need attention.

Close and Load

Home tab > Close and Load. The data lands on a new sheet as a refreshable table.

Method 2: VBA Macro (No Internet, Repeatable)

VBA isn't elegant, but it works on every version of Excel from 2007 onward and runs completely offline. The catch? VBA has no native JSON parser. You'll need to add one — the standard choice is the open-source VBA-JSON library by Tim Hall, which is a single .bas file you import into the VBA Editor.

Once that's in place, a basic conversion macro reads the file with a FileSystemObject, passes the contents to JsonConverter.ParseJson, and walks the resulting Dictionary or Collection. For an array of flat objects, you can dump it onto a sheet in about 20 lines of code. Nested data needs a recursive function, which gets uglier fast.

The honest truth: if you're considering VBA for anything more than a flat array, switch to Power Query. VBA shines when you need to convert dozens of files in one go, or when the conversion is part of a larger macro workflow that already runs on a schedule. Most VBA JSON conversions follow the same shape: open the file, parse the text into a Dictionary, loop through the items, and write each property to a cell. The tricky parts are detecting whether a value is itself a Dictionary or a Collection (nested data) and deciding what to do with it.
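That walk is easier to see in Python than in VBA; here is a minimal sketch of the same recursive descent over a hypothetical order record (in VBA, the isinstance checks become TypeName tests on Dictionary and Collection objects):

```python
def flatten(obj, prefix=""):
    """Recursively flatten nested dicts/lists into one flat dict,
    mirroring the Dictionary/Collection walk a VBA macro would do."""
    flat = {}
    if isinstance(obj, dict):
        for key, value in obj.items():
            flat.update(flatten(value, f"{prefix}{key}."))
    elif isinstance(obj, list):
        for i, value in enumerate(obj):
            flat.update(flatten(value, f"{prefix}{i}."))
    else:
        flat[prefix.rstrip(".")] = obj
    return flat

# Hypothetical order with a nested object and a nested array.
order = {"id": 7, "customer": {"name": "Ada"}, "items": [{"sku": "A1"}]}
print(flatten(order))
# {'id': 7, 'customer.name': 'Ada', 'items.0.sku': 'A1'}
```

Each flattened key becomes a column header; writing one such dict per row is the whole conversion.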

Another consideration with VBA is error handling. Production-grade scripts need to catch malformed JSON, missing files, and network timeouts if you're pulling from a URL. The basic pattern is wrapping the parse call in an On Error Resume Next block, checking Err.Number, and logging anything unexpected.

Pick Your Method

Built into Excel 2016+. Best for one-off conversions and repeatable monthly imports. Handles nested data well with click-to-expand. The refresh button updates everything when the source JSON changes. No coding required and no third-party libraries to install. Saves the query as a reusable step in the workbook.


Method 3: Python With Pandas (For Heavy Lifting)

For files over 50 MB, deeply nested structures, or anything you need to run unattended, Python is the right tool. The pandas library reads JSON in one line — pd.read_json('file.json') — and the json_normalize function flattens nested objects automatically. Install pandas and openpyxl. Read the JSON into a DataFrame. Use json_normalize with a record_path argument to point at the array you want to explode. Pass meta to keep parent fields attached to each child row. Then call df.to_excel('output.xlsx', index=False) and you're done.
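A minimal sketch of that pipeline, using a hypothetical two-order payload; the final to_excel call is left commented so the snippet runs without openpyxl installed:

```python
import pandas as pd

# Hypothetical API response: orders with nested customer and line items.
orders = [
    {"order_id": 1, "customer": {"name": "Ada"},
     "line_items": [{"sku": "A1", "qty": 2}, {"sku": "B2", "qty": 1}]},
    {"order_id": 2, "customer": {"name": "Grace"},
     "line_items": [{"sku": "C3", "qty": 5}]},
]

# record_path explodes the nested array into one row per line item;
# meta keeps the parent fields attached to each child row.
df = pd.json_normalize(orders, record_path="line_items",
                       meta=["order_id", ["customer", "name"]])
print(df)
# df.to_excel("output.xlsx", index=False)  # requires openpyxl
```

The result is three rows (one per line item) with order_id and customer.name repeated alongside each, exactly the shape pivot tables want.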

What makes Python worth learning for this task is how well it handles edge cases. Missing fields become NaN automatically. Mixed types in the same column don't crash anything. Files with millions of records process in seconds rather than locking up Excel for ten minutes. And once you write the script, you can schedule it to run every night with cron or Task Scheduler.

Python also wins on reproducibility. A script in a Git repo is version-controlled and reviewable. A Power Query workflow lives inside an Excel file and is harder to inspect. For team environments or production pipelines, that reproducibility matters as much as the conversion itself.

Handling the Tricky Cases

Real JSON is messy. Here are the patterns that catch people out and how to deal with them. Sometimes an array contains different shapes — some items are strings, some are objects, some are numbers. Power Query handles this gracefully by leaving the column as type Any. In pandas, you'll need to handle each type separately, usually with an isinstance check. The pragmatic answer is to ask whoever produced the JSON to fix it. Mixed-type arrays usually indicate a bug upstream.
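A minimal sketch of the isinstance approach over a hypothetical mixed-type array, normalizing every shape to a plain string:

```python
# Hypothetical mixed-type array: strings, objects, and numbers side by side.
mixed = ["legacy-id", {"id": 42}, 7]

def to_id(item):
    """Normalize each shape to a string ID; unknown shapes become None."""
    if isinstance(item, dict):
        return str(item.get("id"))
    if isinstance(item, (str, int)):
        return str(item)
    return None

ids = [to_id(x) for x in mixed]
print(ids)  # ['legacy-id', '42', '7']
```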

JSON doesn't have a date type. Dates come through as strings, usually in ISO format like 2026-05-13T14:30:00Z. Power Query will guess that these are dates and convert them, but only if the format is recognizable. If your dates arrive as text, change the column type in Power Query to Date/Time. Avoid leaning on Excel's DATEVALUE as a fix: it handles date-only strings but chokes on the T and Z in full ISO timestamps, so the type change belongs in Power Query.
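On the pandas side, the equivalent fix is parsing explicitly with pd.to_datetime rather than trusting type inference; a small sketch with made-up timestamps:

```python
import pandas as pd

# ISO 8601 strings, the form JSON typically delivers dates in.
df = pd.DataFrame({"created": ["2026-05-13T14:30:00Z", "2026-05-14T09:00:00Z"]})

# utc=True handles the trailing Z and yields a timezone-aware column.
df["created"] = pd.to_datetime(df["created"], utc=True)
print(df["created"].dt.year.tolist())  # [2026, 2026]
```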

If a JSON field contains a leading zero, like a US zip code 02134, treating it as a number will silently drop the zero. The fix is to change the column type to Text before Power Query auto-converts. This applies to phone numbers, account IDs, and any value where the format matters more than the math.
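In pandas, the safest route is to build the DataFrame from parsed objects rather than pd.read_json, whose type inference can coerce a quoted "02134" into the number 2134; a hypothetical two-row example:

```python
import json
import pandas as pd

# Hypothetical records whose zip codes must keep their leading zeros.
raw = '[{"name": "Ann", "zip": "02134"}, {"name": "Bo", "zip": "10001"}]'

# pd.DataFrame over json.loads keeps string values as text;
# pd.read_json's dtype inference could silently convert them to numbers.
df = pd.DataFrame(json.loads(raw))
print(df["zip"].tolist())  # ['02134', '10001']
```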

Excel's general formatting rules also catch out fields with leading plus signs, parentheses around negative numbers, and percent signs. If your JSON has values formatted as text strings like (123) or +44, Power Query may try to interpret them as numeric expressions. Setting the column type to Text early in the query prevents this silent corruption.

Power Query Functions for JSON Work

Json.Document

Parses raw JSON text into a Power Query record or list. Used automatically by the From JSON connector but also callable in manual M code.

Table.ExpandRecordColumn

Expands a record column into multiple columns. This is what runs when you click the expand icon in the UI on a nested object field.

Table.ExpandListColumn

Expands a list column into multiple rows. Each item in the list becomes its own row with the parent fields duplicated alongside it.

Json.FromValue

Converts a Power Query record or table back into JSON text. Useful for Excel-to-JSON workflows that mirror the import process in reverse.

Web.Contents

Pulls JSON directly from a URL endpoint. Combined with Json.Document, this lets Power Query consume REST APIs without ever saving a local file first.

Pre-Import Checklist

  • Open the JSON in a text editor or viewer to see the structure
  • Confirm whether the top level is an array or an object
  • Identify which nested arrays need to be exploded into rows
  • Check for date fields and note their format
  • Flag any fields with leading zeros, special characters, or mixed types
  • Estimate the row count — over 1 million needs Power Pivot or Python
  • Decide whether the data is sensitive (rules out online converters)
  • Pick the method that matches your file size and frequency

Common Errors and How to Fix Them

Even with the right method, you'll hit errors. The "DataFormat.Error: We couldn't parse the input" message from Power Query usually means the JSON is malformed. Open it in jsonlint.com or jsonformatter.org and run a syntax check. Common culprits are trailing commas, single quotes instead of double quotes, or unescaped special characters in string values. Fix the source JSON, save, and re-import.

If you see "The query produced too many records" during preview, that's a preview limit only, not an actual load limit. Just click Close and Load anyway — the full data will arrive on the sheet. Long numeric strings like account IDs get converted to scientific notation by default. Change the column type to Text in Power Query before loading. Once it's in Excel as scientific notation, the original value may already be lost.
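You can see the precision loss directly in Python, where the same 64-bit doubles Excel uses are one float() call away; the account ID below is made up:

```python
# Excel stores numbers as 64-bit doubles, which hold only ~15
# significant digits. A 20-digit ID cannot survive the round trip.
account_id = "12345678901234567890"  # hypothetical account ID

survives_as_float = int(float(account_id)) == int(account_id)
print(survives_as_float)  # False: the trailing digits are corrupted

# Kept as text, the value is preserved exactly.
print(account_id)
```

This is why the Text type change has to happen before the load, not after.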

Nested arrays expanded to new rows but some parent rows have no children produce empty cells. This is correct behavior — there genuinely was no child. If you need a placeholder, use Power Query's Replace Values feature to fill blanks with something like N/A. Numbers showing as dates is another classic problem — Excel sometimes interprets things like 1-2 as January 2nd. Format the column as Text before pasting or importing to lock the values down.
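The pandas equivalent of that Replace Values step is fillna, sketched here on a hypothetical exploded orders table where order 2 had no line items:

```python
import pandas as pd

# One row per line item; order 2 produced a row with no child data.
df = pd.DataFrame({
    "order_id": [1, 1, 2],
    "sku": ["A1", "B2", None],
})

# Fill the genuine blanks with a placeholder, mirroring Replace Values.
df["sku"] = df["sku"].fillna("N/A")
print(df["sku"].tolist())  # ['A1', 'B2', 'N/A']
```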

One final pattern worth knowing: the Buffer function. When a Power Query step references the same upstream data multiple times, performance can degrade significantly because the query re-runs the upstream steps each time. Wrapping a step in Table.Buffer or List.Buffer materializes the result in memory, often cutting refresh times in half on larger files.

Power Query vs. Python: The Real Comparison

Power Query strengths
  • Requires no coding — it's point and click for non-developers
  • Built into Excel 2016 and newer — zero installation needed
  • The refresh button updates from the source automatically
  • Excellent for business users sharing files
  • Queries become reusable steps saved inside the workbook
Python drawbacks
  • Requires installing Python plus the pandas and openpyxl libraries
  • Steep learning curve if you've never coded before
  • Output is a static file with no refresh button
  • Overkill for a one-time 100-row conversion task
  • Errors are less friendly than Power Query's visual editor

Working Backwards: Excel to JSON

Sometimes you need the opposite — exporting Excel data as JSON for an API or a developer. Excel doesn't have a built-in export, but Power Query has an undocumented trick. Load your table, then in the Advanced Editor add a step using Json.FromValue. The cleaner method is a short VBA macro or, again, Python with df.to_json(). Online tools also handle Excel to JSON conversion. Paste your data or upload the workbook, pick the orientation (records vs columns), and download. Same privacy caveats apply — don't paste sensitive data into a public tool.
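The df.to_json route is one line once you've picked the orientation; orient="records" produces the array-of-objects shape most APIs expect (the sample data is made up):

```python
import pandas as pd

df = pd.DataFrame({"name": ["Ada", "Grace"], "score": [95, 88]})

# orient="records" emits one JSON object per row.
print(df.to_json(orient="records"))
# [{"name":"Ada","score":95},{"name":"Grace","score":88}]
```

The other orientations (columns, index, split) exist for consumers that want a different nesting; records is the right default for APIs.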

If you're doing this conversion daily, or if downstream consumers of the data are also tools (databases, BI dashboards, web apps), consider skipping Excel. Tools like Power BI, Tableau, and most modern databases read JSON directly. The only reason to push it through Excel is if a human is going to look at the result in spreadsheet form. For data engineers and analysts, the workflow often becomes: JSON arrives, Python script converts it and loads it to a database, BI tool reads from the database. Excel only enters the picture when someone wants to slice and dice locally.
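A minimal sketch of that JSON-to-database leg, using Python's built-in sqlite3 as a stand-in for a real database (the table and column names are hypothetical):

```python
import json
import sqlite3
import pandas as pd

# Hypothetical nightly job: JSON in, database table out, BI tool reads the DB.
raw = '[{"order_id": 1, "total": 19.99}, {"order_id": 2, "total": 5.00}]'
df = pd.DataFrame(json.loads(raw))

conn = sqlite3.connect(":memory:")  # swap for a real database file or URL
df.to_sql("orders", conn, if_exists="replace", index=False)

count = conn.execute("SELECT COUNT(*) FROM orders").fetchone()[0]
print(count)  # 2
```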


The Bottom Line

Converting JSON to Excel used to require coding, hacks, or paying for plugins. Today, Power Query handles 90% of cases natively and well. Pick it as your default. Reach for Python when files get huge or the conversion needs to run unattended. Keep VBA in your back pocket for offline batch work, and use online converters only for trivial, non-sensitive files.

The single biggest mistake people make is jumping straight into the conversion without looking at the JSON structure first. Five minutes in a JSON viewer saves an hour of wrong-column wrestling later. Know what you have, pick the right tool, and the rest is mostly clicking expand buttons until your data looks like a table. If you work with spreadsheets often, learning Power Query pays off far beyond JSON imports. It's the same tool that handles CSVs, databases, web APIs, and folder-full-of-files imports. Get comfortable with it once and a huge category of data wrangling becomes click-and-load rather than search-Stack-Overflow.

One final practical tip: keep a small library of sample JSON files representing the structures you commonly work with. When something breaks, comparing the broken file against a known-good sample exposes the difference faster than reading error messages. A folder with five or six reference files becomes a powerful debugging tool over time.

The skills transfer too. Once you can convert JSON to Excel comfortably, the same Power Query knowledge handles XML, CSV with weird delimiters, REST API responses, and folders of files. Treat your first JSON conversion as a stepping stone rather than a destination. The investment pays back across dozens of unrelated data tasks for years afterward.

About the Author

James R. Hargrove, JD, LLM

Attorney & Bar Exam Preparation Specialist

Yale Law School

James R. Hargrove is a practicing attorney and legal educator with a Juris Doctor from Yale Law School and an LLM in Constitutional Law. With over a decade of experience coaching bar exam candidates across multiple jurisdictions, he specializes in MBE strategy, state-specific essay preparation, and multistate performance test techniques.