While JSON is currently a well-accepted data serialization format, so was XML in its time. Do you really want to be using a spec that forces you to use JSON in 10 years when there is some other new hotness? If you're skeptical that anything could ever replace JSON, keep in mind that many people felt the same about XML back in the day.
Take a look at a Google search for "alternatives to JSON", and you'll find at least half a dozen implemented alternatives and proposed new specs.
MessagePack is an efficient binary serialization format. It lets you exchange data among multiple languages, like JSON, but it's faster and smaller.
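Where does that size win come from? MessagePack packs type tags and small lengths into single bytes. This is not the real msgpack library, just a toy encoder covering three cases from the published spec (small maps, short strings, small positive integers) to make the byte savings concrete:

```python
import json

def toy_msgpack(obj):
    """Encode a tiny subset of MessagePack: ints 0-127 (positive fixint),
    strings up to 31 bytes (fixstr), and dicts up to 15 entries (fixmap)."""
    if isinstance(obj, int) and 0 <= obj <= 0x7F:
        return bytes([obj])                          # positive fixint: the value itself
    if isinstance(obj, str):
        data = obj.encode("utf-8")
        if len(data) <= 31:
            return bytes([0xA0 | len(data)]) + data  # fixstr: tag byte carries the length
    if isinstance(obj, dict) and len(obj) <= 15:
        out = bytes([0x80 | len(obj)])               # fixmap: tag byte carries the entry count
        for k, v in obj.items():
            out += toy_msgpack(k) + toy_msgpack(v)
        return out
    raise TypeError("unsupported in this toy encoder")

record = {"name": "Bob", "age": 7}
packed = toy_msgpack(record)
as_json = json.dumps(record, separators=(",", ":")).encode()
print(len(packed), len(as_json))  # 15 vs 22 bytes -- no quotes, braces, or colons
```

The full format adds tags for larger sizes, floats, binary blobs, and arrays, but the principle is the same: structure lives in tag bytes instead of punctuation.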
Frustration with JSON has spurred the proposal of entirely new formats. Perhaps one of the most interesting is TOML, from Tom Preston-Werner, a cofounder of GitHub.
Cap'n Proto - Encoding/Decoding Speed
Cap'n Proto is an insanely fast data interchange format and capability-based RPC system. Think JSON, except binary.
Protocol Buffers - Schemas
There is a certain painful irony here: we carefully craft our data models inside our databases and maintain layers of code to keep those models in check, then let all of that forethought fly out the window the moment we send the data over the wire to another service.
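Protocol Buffers keep the data model attached to the wire format via a schema file that both ends compile against. A hypothetical schema might look like this (message and field names are invented for illustration):

```proto
// Both sender and receiver generate code from this file,
// so field names, types, and numbering stay in sync.
syntax = "proto3";

message User {
  int64 id = 1;
  string email = 2;
  repeated string roles = 3;
}
```

Because fields are identified by number rather than by name on the wire, schemas can evolve (add fields, deprecate old ones) without breaking existing readers.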
Cutting Down Bandwidth with JSON Alternatives
The problem with JSON is that it's too simple; it lacks features. Yesterday, while working on an API that is supposed to return PNG images, I was again reminded of the fact that JSON does not handle binary data. Let's see what else JSON does not support: Inf & NaN, differentiating normal hashes vs. objects, regexps, circular references, ... (some people might want to add comments and trailing commas to that list).
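Three of these gaps are easy to see with Python's stdlib `json` module (using `allow_nan=False` for the NaN case, since by default Python emits the non-standard literal `NaN`); the PNG bytes below are a stand-in, not a real image:

```python
import json
import base64

# 1. Binary data: the encoder refuses raw bytes outright.
try:
    json.dumps({"image": b"\x89PNG..."})
except TypeError as exc:
    print("bytes:", exc)

# The usual workaround is base64, which inflates the payload by about 33%.
payload = {"image": base64.b64encode(b"\x89PNG...").decode("ascii")}
json.dumps(payload)  # works, at a size cost

# 2. Inf & NaN: not valid JSON, so strict encoders reject them.
try:
    json.dumps(float("nan"), allow_nan=False)
except ValueError as exc:
    print("nan:", exc)

# 3. Circular references: rejected as well.
a = {}
a["self"] = a
try:
    json.dumps(a)
except ValueError as exc:
    print("circular:", exc)
```

Every one of these ends up handled by out-of-band conventions that the JSON spec itself knows nothing about.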
So go ahead and design your standards with JSON, and come back in 10 years and see how the standard feels.