This commit is contained in:
yuzu 2025-06-03 02:08:52 -05:00
commit ac375522ae
330 changed files with 2556 additions and 0 deletions

1
.gitignore vendored Normal file

@ -0,0 +1 @@
.zig-cache

1
.hgignore Normal file

@ -0,0 +1 @@
.zig-cache/

28
LICENSE Normal file

@ -0,0 +1,28 @@
BSD 3-Clause License
Copyright (c) 2025, yuzucchii.xyz
Redistribution and use in source and binary forms, with or without
modification, are permitted provided that the following conditions are met:
1. Redistributions of source code must retain the above copyright notice, this
list of conditions and the following disclaimer.
2. Redistributions in binary form must reproduce the above copyright notice,
this list of conditions and the following disclaimer in the documentation
and/or other materials provided with the distribution.
3. Neither the name of the copyright holder nor the names of its
contributors may be used to endorse or promote products derived from
this software without specific prior written permission.
THIS SOFTWARE IS PROVIDED BY THE COPYRIGHT HOLDERS AND CONTRIBUTORS "AS IS"
AND ANY EXPRESS OR IMPLIED WARRANTIES, INCLUDING, BUT NOT LIMITED TO, THE
IMPLIED WARRANTIES OF MERCHANTABILITY AND FITNESS FOR A PARTICULAR PURPOSE ARE
DISCLAIMED. IN NO EVENT SHALL THE COPYRIGHT HOLDER OR CONTRIBUTORS BE LIABLE
FOR ANY DIRECT, INDIRECT, INCIDENTAL, SPECIAL, EXEMPLARY, OR CONSEQUENTIAL
DAMAGES (INCLUDING, BUT NOT LIMITED TO, PROCUREMENT OF SUBSTITUTE GOODS OR
SERVICES; LOSS OF USE, DATA, OR PROFITS; OR BUSINESS INTERRUPTION) HOWEVER
CAUSED AND ON ANY THEORY OF LIABILITY, WHETHER IN CONTRACT, STRICT LIABILITY,
OR TORT (INCLUDING NEGLIGENCE OR OTHERWISE) ARISING IN ANY WAY OUT OF THE USE
OF THIS SOFTWARE, EVEN IF ADVISED OF THE POSSIBILITY OF SUCH DAMAGE.

36
README.md Normal file

@ -0,0 +1,36 @@
# explanation
* (a) Why make a JSON parser?
- I think that Zig's std.json parser fucking sucks
- I write a ton of web apps, therefore, I need a JSON parser
- idk
* (a) What's wrong with std.json?
- It is impossible to override the defaults; I'll proceed to tell you the defaults
- It parses `enums` into `strings` instead of `ints`
- `Packed structs` get parsed as structs, instead of `ints`
- `max_value_len`, couldn't they just learn how to resize a buffer? Is it so difficult?
- Strings are not duped; you have to MANUALLY enable this
- They don't support parsing unions, laughable
- It crashes randomly when the type doesn't strictly match the payload
- `error.MissingField`, you have to annotate every 'optional' field with `= null`, do they not know how JSON works?
- Considering what I've said so far, the type reflection is just awful, and there is no way to parse without reflection, just straight up data? Or is there?
- Huge boilerplate to define custom parsing, if every default is overridden then the only option left is to define your own `jsonParse` function, which generates [huge boilerplate](https://git.yuzucchii.xyz/yuzucchii/discord.zig/src/commit/88088e4ae862dcce31bc212eac92b74e803c63f8/src/structures/shared.zig#L104)
- Like 10 different functions for doing the exact same thing, parsing, `innerParse`, `innerParseFromValue`, `parseFromSlice`, `parseFromSliceLeaky`, `parseFromTokenSource`, `parseFromTokenSourceLeaky` (wtf?), `parseFromValue`, `parseFromValueLeaky`
- The debug options are trash, you have to quite literally pass a struct called `Diagnostics`
- [This](https://www.openmymind.net/Custom-String-Formatting-And-JSON-in-Zig/) article
* (a) Is this better than std.json?
- I don't know, I just know that it parses JSON and it works, it's simple, the code is readable and straightforward, it is 3 simple files that do simple things, nothing else
- This parser is based on a [string pool](https://en.wikipedia.org/wiki/String_interning) written by Andrew Kelley, so it must be good, [watch a video](https://www.hytradboi.com/2025/05c72e39-c07e-41bc-ac40-85e8308f2917-programming-without-pointers)
- Probably not, but less featureful (or less bad) software is usually better than an over-engineered mess
* (a) Ok but all of your reasons are dumb
- I'll answer this later
## Behaviour
- All control characters except DEL are forbidden
- NUL control characters (e.g. U+0000) are forbidden and will be ignored by the parser
- Of course, it uses null-terminated strings; clearly this is not the best approach, but it's memory efficient and fast as fuck!
- It passes most unit tests of the [JSON test suite](https://github.com/nst/JSONTestSuite), totalling 570 tests as of 2025-05-29 when I'm writing this.
- All defaults are configurable via the `Flags` bitfield
- Reflection is supported
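
For reference, here is a minimal end-to-end sketch mirroring the unit test in `language.zig` (the `Tokenizer` API is assumed from how the tests in this repo use it; `tokenizer.zig` is part of this commit but not excerpted here):

```zig
const std = @import("std");
const Tokenizer = @import("tokenizer.zig");
const Language = @import("language.zig");

test "parse and read back a document" {
    const allocator = std.testing.allocator;

    // Tokenize the raw text, then build the flat value index.
    var tokenizer: Tokenizer = try .init(allocator, "{\"cute\": true}");
    defer tokenizer.deinit(allocator);

    var lang = Language.init;
    defer lang.deinit(allocator);

    // parse always returns 0, the root slot.
    const root_idx = try lang.parse(allocator, &tokenizer);

    // Materialize the tree as a JsonInput union for inspection.
    var root = try lang.getValue(allocator, root_idx);
    defer root.deinit(allocator);

    try std.testing.expect(root == .object);
}
```

Defaults can be tweaked before parsing via the `Flags` bitfield, e.g. `lang.options.flags = .{ .allow_trailing_comma = true };`.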
## Sic respondeo:
The Zig Discord server is plagued with modern scum; of course, modern scum will dismiss all of my claims or label them as "dumb" or "you're using it wrong!". Have any of these individuals considered that Zig is over-complicated? There is a reason why Andrew Kelley himself detached from the communities [and has spoken in multiple instances](https://andrewkelley.me/post/goodbye-twitter-reddit.html) about the "shitification" of software communities; it's like turning a good community of like-minded programmers into a soydev shill-fest. One good thing that he did was shutting down the r/zig subreddit.
On a different note, I defend simplicity and minimalism, so I am not saying that *every* Zig developer who thinks differently is scum; I am just saying that if you cannot think beyond your own viewpoint you will get nowhere. Lastly, if your software is not straightforward and simple to use, then why criticise someone for not knowing how to use it? The only reasonable way to make software complicated is when user friendliness is not a tradeoff for performance or minimalism.
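
The "Reflection is supported" claim above boils down to `reflectT`; a sketch based on the test in `reflection.zig` (the `User` struct is a hypothetical example type, not part of the library):

```zig
const std = @import("std");
const Reflect = @import("reflection.zig");

// Hypothetical schema for illustration only.
const User = struct {
    age: f64,
    name: []const u8,
    admin: bool,
};

test "reflect a struct out of a payload" {
    const allocator = std.testing.allocator;

    var r = try Reflect.init(allocator,
        \\{"age": 15, "name": "Yuzu", "admin": true}
    );
    defer r.deinit(allocator);

    // Parse into the index, then map the root object onto User.
    const idx = try r.parse(allocator);
    const user = try r.reflectT(User, allocator, idx);
    try std.testing.expect(user.admin);
}
```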

8
build.zig Normal file

@ -0,0 +1,8 @@
pub fn build(b: *@import("std").Build) void {
_ = b.addStaticLibrary(.{
.name = "aether",
.root_source_file = b.path("root.zig"),
.target = b.standardTargetOptions(.{}),
.optimize = b.standardOptimizeOption(.{}),
});
}

15
build.zig.zon Normal file

@ -0,0 +1,15 @@
.{
.name = .aether,
.version = "1.0.0",
.fingerprint = 0x27a0a7c056e7482c,
.minimum_zig_version = "0.15.0-dev.552+bc2f7c754",
.dependencies = .{},
.paths = .{
"build.zig",
"build.zig.zon",
"root.zig",
"language.zig",
"tokenizer.zig",
"strings.zig",
"reflection.zig",
},
}

585
language.zig Normal file

@ -0,0 +1,585 @@
const std = @import("std");
const mem = std.mem;
const Tokenizer = @import("tokenizer.zig");
const TokenType = Tokenizer.TokenType;
const Token = Tokenizer.Token;
const StringPool = @import("strings.zig");
const StringIndex = StringPool.StringIndex;
const assert = std.debug.assert;
const Self = @This();
pub const Error = error{ Eof, TrailingComma, MissingKey, MissingValue, UnexpectedToken, InvalidSyntax };
pub const JsonType = enum { null, bool, number, string, array, object };
pub const JsonNumber = union(enum) {
int: i128,
float: f64,
pub fn cast(self: JsonNumber, comptime T: type) T {
return switch (self) {
.int => |i| switch (@typeInfo(T)) {
.float => @as(T, @floatFromInt(i)),
.int => @as(T, @intCast(i)),
else => @compileError("not a number type"),
},
.float => |f| switch (@typeInfo(T)) {
.float => @as(T, @floatCast(f)),
.int => @as(T, @intFromFloat(f)),
else => @compileError("not a number type"),
},
};
}
};
pub const JsonValue = union(JsonType) {
null: void,
bool: bool,
number: JsonNumber,
string: StringIndex,
array: ArraySlice,
object: ObjectEntry,
};
pub const JsonInput = union(JsonType) {
null: void,
bool: bool,
number: JsonNumber,
string: []const u8,
array: []JsonInput,
object: std.StringArrayHashMapUnmanaged(JsonInput),
pub fn deinit(self: JsonInput, allocator: mem.Allocator) void {
switch (self) {
.array => |array| {
for (array) |json_input|
json_input.deinit(allocator);
allocator.free(array);
},
.object => |*object| {
var it = object.iterator();
while (it.next()) |entry|
entry.value_ptr.deinit(allocator);
@constCast(object).deinit(allocator);
},
else => {},
}
}
pub fn format(
self: @This(),
comptime fmt: []const u8,
opts: std.fmt.FormatOptions,
writer: anytype,
) !void {
switch (self) {
.null => try writer.writeAll("null"),
.bool => try writer.writeAll(if (self.bool) "true" else "false"),
.number => switch (self.number) {
.int => try writer.print("{d}", .{self.number.int}),
.float => try writer.print("{d:.1}", .{self.number.float}),
},
.string => try writer.print("\"{s}\"", .{self.string}),
.array => {
try writer.writeByte('[');
for (self.array, 0..) |val, i| {
try val.format(fmt, opts, writer);
if (i < self.array.len - 1) try writer.writeByte(',');
}
try writer.writeByte(']');
},
.object => {
try writer.writeByte('{');
for (self.object.keys(), self.object.values(), 0..) |k, v, i| {
try writer.print("\"{s}\"", .{k});
try writer.writeByte(':');
try v.format(fmt, opts, writer);
if (i < self.object.entries.len - 1) try writer.writeByte(',');
}
try writer.writeByte('}');
},
}
}
};
/// Same as ObjectEntry but simpler; `.tip` is the offset
pub const ArraySlice = struct {
len: usize,
tip: usize,
};
/// just += the value indexes to get the next item
pub const ObjectEntry = struct {
len: usize,
tip: usize,
};
pub const PropertyEntry = struct {
tip: StringIndex,
};
pub const Flags = packed struct {
/// Make the tokenizer omit comments, TBD
allow_comments: bool = false,
/// Not to error on trailing comma, default is `false` for obvious reasons
allow_trailing_comma: bool = false,
};
pub const Options = struct {
comptime indent_len: usize = 4,
comptime max_depth: usize = 256,
flags: Flags = .{},
};
index: std.MultiArrayList(JsonValue) = .{},
strings: StringPool = .empty,
properties: StringPool = .empty,
property_map: std.AutoArrayHashMapUnmanaged(usize, PropertyEntry) = .empty,
options: Options = .{},
pub const init = Self{};
pub fn deinit(self: *Self, allocator: mem.Allocator) void {
self.index.deinit(allocator);
self.properties.deinit(allocator);
self.strings.deinit(allocator);
self.property_map.deinit(allocator);
}
fn addNumber(self: *Self, allocator: mem.Allocator, number: JsonNumber) !usize {
try self.index.ensureUnusedCapacity(allocator, 1);
const idx = self.index.addOneAssumeCapacity();
self.index.set(idx, .{ .number = number });
return idx;
}
fn addProperty(self: *Self, allocator: mem.Allocator, bytes: []const u8) !usize {
const stridx = try self.properties.add(allocator, bytes);
try self.property_map.ensureUnusedCapacity(allocator, 1);
return @intFromEnum(stridx);
}
fn addString(self: *Self, allocator: mem.Allocator, bytes: []const u8) !usize {
const stridx = try self.strings.add(allocator, bytes);
try self.index.ensureUnusedCapacity(allocator, 1);
const idx = self.index.addOneAssumeCapacity();
self.index.set(idx, .{ .string = stridx });
return idx;
}
fn addEmpty(self: *Self, allocator: mem.Allocator) !usize {
try self.index.ensureUnusedCapacity(allocator, 1);
const idx = self.index.addOneAssumeCapacity();
return idx;
}
fn addBool(self: *Self, allocator: mem.Allocator, value: bool) !usize {
try self.index.ensureUnusedCapacity(allocator, 1);
const idx = self.index.addOneAssumeCapacity();
self.index.set(idx, .{ .bool = value });
return idx;
}
fn addNull(self: *Self, allocator: mem.Allocator) !usize {
try self.index.ensureUnusedCapacity(allocator, 1);
const idx = self.index.addOneAssumeCapacity();
self.index.set(idx, .{ .null = {} });
return idx;
}
// Recursively compute how many index slots a node occupies (including nested)
pub fn skipSlots(self: *Self, slot: usize) usize {
switch (self.index.get(slot)) {
.object => |obj| {
var total: usize = 1;
var v = obj.tip;
for (0..obj.len) |_| {
const s = skipSlots(self, v);
total += s;
v += s;
}
return total;
},
.array => |arr| {
var total: usize = 1;
var c = arr.tip;
for (0..arr.len) |_| {
const s = skipSlots(self, c);
total += s;
c += s;
}
return total;
},
else => return 1,
}
}
pub fn getValue(
self: *Self,
allocator: mem.Allocator,
idx: usize,
) !JsonInput {
if (self.index.len == 0)
return error.InvalidSyntax;
switch (self.index.get(idx)) {
.null => return .{ .null = {} },
.bool => |b| return .{ .bool = b },
.number => |number| return .{ .number = number },
.string => |string| {
const sl = string.slice(&self.strings);
return .{ .string = sl };
},
.array => |arr| {
var out = try allocator.alloc(JsonInput, arr.len);
errdefer allocator.free(out);
var c = arr.tip;
for (0..arr.len) |i| {
const v = try self.getValue(allocator, c);
out[i] = v;
c += skipSlots(self, c);
}
return .{ .array = out[0..arr.len] };
},
.object => |obj| {
var map: std.StringArrayHashMapUnmanaged(JsonInput) = .empty;
errdefer map.deinit(allocator);
var tip = obj.tip;
for (0..obj.len) |_| if (self.property_map.get(tip)) |pen| {
try map.put(
allocator,
pen.tip.slice(&self.properties),
try self.getValue(allocator, tip),
);
tip += self.skipSlots(tip);
} else return error.MissingKey;
return .{ .object = map };
},
}
}
/// always returns 0 (root)
pub fn parse(self: *Self, allocator: mem.Allocator, tokenizer: *Tokenizer) !usize {
tokenizer.skipWhitespace();
if (tokenizer.endOfInput())
return error.Eof;
const root = try self.addEmpty(allocator);
var token = try tokenizer.nextToken(allocator);
var query: std.BoundedArray(usize, self.options.max_depth) = try .init(0);
flag: switch (token.type) {
.eof => {
if (root != 0) return error.InvalidSyntax;
if (query.slice().len != 0) return error.InvalidSyntax;
return root;
},
.property => {
const scope_idx = query.get(query.len - 1);
switch (self.index.get(scope_idx)) {
.object => |scope| {
const pidx = try self.addProperty(allocator, token.value.?.string);
const reer = self.index.len;
self.property_map.putAssumeCapacity(reer, .{ .tip = @enumFromInt(pidx) });
allocator.free(token.value.?.string);
self.index.set(scope_idx, .{ .object = ObjectEntry{
.len = scope.len + 1,
.tip = scope.tip,
} });
},
else => return error.InvalidSyntax,
}
const next = try tokenizer.nextToken(allocator);
token = next;
switch (next.type) {
.colon => {
token = try tokenizer.nextToken(allocator);
continue :flag token.type;
},
else => continue :flag next.type,
}
},
.object_begin => {
if (query.slice().len < 1) {
const ptr = try query.addOne();
ptr.* = root;
self.index.set(root, .{ .object = ObjectEntry{
.len = 0,
.tip = 1,
} });
} else {
//order
const parent_idx = query.get(query.len - 1);
const idx_ptr = try query.addOne();
idx_ptr.* = try self.addEmpty(allocator);
self.index.set(idx_ptr.*, .{
.object = ObjectEntry{
.len = 0,
.tip = self.index.len,
},
});
switch (self.index.get(parent_idx)) {
.array => |slice| {
self.index.set(parent_idx, .{ .array = ArraySlice{
.len = slice.len + 1,
.tip = if (slice.len == 0) idx_ptr.* else slice.tip,
} });
},
else => {},
}
}
const next = try tokenizer.nextToken(allocator);
token = next;
switch (next.type) {
.string => continue :flag .property,
.object_end => continue :flag .object_end,
else => return error.InvalidSyntax,
}
},
.object_end, .array_end => {
if (query.pop() == null)
return error.InvalidSyntax; // double close
if (query.slice().len == 0)
return root;
const next = try tokenizer.nextToken(allocator);
token = next;
switch (next.type) {
.comma => continue :flag .comma,
.object_end, .array_end => continue :flag next.type,
else => return error.InvalidSyntax,
}
},
.array_begin => {
defer tokenizer.skipWhitespace();
if (query.slice().len < 1) {
const ptr = try query.addOne();
ptr.* = root;
self.index.set(root, .{ .array = ArraySlice{
.len = 0,
.tip = 1,
} });
} else {
// order matters
const parent_idx = query.get(query.len - 1);
const idx_ptr = try query.addOne();
idx_ptr.* = try self.addEmpty(allocator);
self.index.set(idx_ptr.*, .{ .array = ArraySlice{
.len = 0,
.tip = idx_ptr.* + 1,
} });
switch (self.index.get(parent_idx)) {
.array => |slice| {
self.index.set(parent_idx, .{ .array = ArraySlice{
.len = slice.len + 1,
.tip = if (slice.len == 0) idx_ptr.* else slice.tip,
} });
},
else => {},
}
}
const next = try tokenizer.nextToken(allocator);
token = next;
switch (next.type) {
.property => return error.InvalidSyntax,
else => continue :flag next.type,
}
},
.true, .false => {
const idx = try self.addBool(allocator, if (token.type == .true) true else false);
if (query.len == 0) {
// root
self.index.set(root, .{ .bool = if (token.type == .true) true else false });
return root;
}
const parent_idx = query.get(query.len - 1);
switch (self.index.get(parent_idx)) {
.array => |slice| {
self.index.set(parent_idx, .{ .array = ArraySlice{
.len = slice.len + 1,
.tip = if (slice.len == 0) idx else slice.tip,
} });
},
else => {},
}
const next = try tokenizer.nextToken(allocator);
token = next;
switch (next.type) {
.comma => continue :flag .comma,
.object_end, .array_end => continue :flag next.type,
else => return error.InvalidSyntax,
}
},
.string => {
if (query.len == 0) {
// root
_ = try self.addString(allocator, token.value.?.string);
allocator.free(token.value.?.string);
// hardcoded shite
self.index.set(root, .{ .string = @enumFromInt(0) });
return root;
}
const parent_idx = query.get(query.len - 1);
const next = try tokenizer.nextToken(allocator);
switch (next.type) {
.colon => {
continue :flag .property;
},
else => |t| {
const idx = try self.addString(allocator, token.value.?.string);
allocator.free(token.value.?.string);
switch (self.index.get(parent_idx)) {
.array => |slice| {
self.index.set(parent_idx, .{ .array = ArraySlice{
.len = slice.len + 1,
.tip = if (slice.len == 0) idx else slice.tip,
} });
},
else => {},
}
token = next;
continue :flag t;
},
}
},
.int, .float => |number| {
if (query.len == 0) {
// root
_ = switch (number) {
.int => try self.addNumber(allocator, .{ .int = token.value.?.int }),
.float => try self.addNumber(allocator, .{ .float = token.value.?.float }),
else => unreachable,
};
self.index.set(root, .{ .number = switch (number) {
.int => .{ .int = token.value.?.int },
.float => .{ .float = token.value.?.float },
else => unreachable,
} });
return root;
}
const parent_idx = query.get(query.len - 1);
const idx = switch (number) {
.int => try self.addNumber(allocator, .{ .int = token.value.?.int }),
.float => try self.addNumber(allocator, .{ .float = token.value.?.float }),
else => unreachable,
};
switch (self.index.get(parent_idx)) {
.array => |slice| {
self.index.set(parent_idx, .{ .array = ArraySlice{
.len = slice.len + 1,
.tip = if (slice.len == 0) idx else slice.tip,
} });
},
else => {},
}
const next = try tokenizer.nextToken(allocator);
token = next;
switch (next.type) {
.comma => continue :flag .comma,
.object_end, .array_end => continue :flag next.type,
else => return error.InvalidSyntax,
}
},
.comma => {
if (!self.options.flags.allow_trailing_comma) {
const next = try tokenizer.nextToken(allocator);
token = next;
switch (next.type) {
.object_end, .array_end => return error.TrailingComma,
.comma => return error.InvalidSyntax,
else => continue :flag token.type,
}
}
},
.null => {
const idx = try self.addNull(allocator);
if (query.len == 0) {
// root
self.index.set(root, .{ .null = {} });
return root;
}
const parent_idx = query.get(query.len - 1);
switch (self.index.get(parent_idx)) {
.array => |slice| {
self.index.set(parent_idx, .{ .array = ArraySlice{
.len = slice.len + 1,
.tip = if (slice.len == 0) idx else slice.tip,
} });
},
else => {},
}
const next = tokenizer.nextToken(allocator) catch |err| switch (err) {
error.InvalidSyntax => return err,
else => return root,
};
token = next;
switch (next.type) {
.comma => continue :flag .comma,
.object_end, .array_end => continue :flag next.type,
else => return error.InvalidSyntax,
}
},
else => return error.InvalidSyntax,
}
return root;
}
test getValue {
const allocator = std.testing.allocator;
const text =
\\{
\\ "a":"A",
\\ "b":"B",
\\ "c": {
\\ "d": "D"
\\ },
\\ "e": "E",
\\ "f": [1]
\\}
; // 1: a, 2: b, 3: c, 4: d, 5: e, 6: f
var tokenizer: Tokenizer = try .init(allocator, text);
defer tokenizer.deinit(allocator);
var self = try allocator.create(Self);
self.* = Self.init;
defer allocator.destroy(self);
defer self.deinit(allocator);
const idx: usize = try self.parse(allocator, &tokenizer);
var root = try self.getValue(allocator, idx);
defer root.deinit(allocator);
try std.testing.expect(root == .object);
std.debug.print("{}\n", .{root});
}

238
reflection.zig Normal file

@ -0,0 +1,238 @@
const std = @import("std");
const mem = std.mem;
const Language = @import("language.zig");
const Tokenizer = @import("tokenizer.zig");
const assert = std.debug.assert;
const Self = @This();
pub const Error = error{TypeError};
language: *Language,
tokenizer: *Tokenizer,
pub fn parse(self: *Self, allocator: mem.Allocator) !usize {
return self.language.parse(allocator, self.tokenizer);
}
pub fn init(allocator: mem.Allocator, text: []const u8) !Self {
const self = Self{
.language = try allocator.create(Language),
.tokenizer = try allocator.create(Tokenizer),
};
self.language.* = .init;
self.tokenizer.* = try .init(allocator, text);
return self;
}
pub fn deinit(self: *Self, allocator: mem.Allocator) void {
self.language.deinit(allocator);
self.tokenizer.deinit(allocator);
allocator.destroy(self.language);
allocator.destroy(self.tokenizer);
}
/// needs an index but useful otherwise
/// root starting from idx
pub fn reflectT(self: *Self, comptime T: type, allocator: mem.Allocator, idx: usize) !T {
const Schema = @typeInfo(T);
if (std.meta.hasFn(T, "toJson")) {
return T.toJson(self, allocator, idx);
}
switch (self.language.index.get(idx)) {
.null => {
if (Schema == .null) return null;
return error.TypeError;
},
.bool => |b| switch (Schema) {
.bool => return b,
.@"union" => |unionInfo| inline for (unionInfo.fields) |field| {
var r: T = undefined;
r = @unionInit(T, field.name, b);
return r;
},
else => return error.TypeError,
},
.number => |number| switch (Schema) {
.int, .comptime_int => return @intCast(number.int),
.float, .comptime_float => return switch (number) {
.float => @floatCast(number.float),
.int => @floatFromInt(number.int),
},
.@"enum" => |enumInfo| {
const int: enumInfo.tag_type = @intCast(number.int);
return @enumFromInt(int);
},
.@"struct" => |structInfo| switch (structInfo.layout) {
.@"packed" => {
const int: structInfo.backing_integer.? = @intCast(number.int);
return @bitCast(int);
},
else => return error.TypeError,
},
.@"union" => |unionInfo| {
inline for (unionInfo.fields) |field| switch (@typeInfo(field.type)) {
.int, .comptime_int => return @unionInit(T, field.name, @intCast(number.int)),
.float, .comptime_float => return @unionInit(T, field.name, @floatCast(number.float)),
else => {},
};
return error.TypeError;
},
else => unreachable,
},
.string => |string| switch (Schema) {
.@"enum" => |enumInfo| {
const strslice = string.slice(&self.language.strings);
inline for (enumInfo.fields) |field| if (mem.eql(u8, field.name, strslice)) {
return std.meta.stringToEnum(T, strslice) orelse error.TypeError;
};
},
.@"union" => |unionInfo| inline for (unionInfo.fields) |field| {
if (field.type == []const u8) {
var r: T = undefined;
const strslice = string.slice(&self.language.strings);
r = @unionInit(T, field.name, strslice);
return r;
}
},
.array => |arrayInfo| {
assert(arrayInfo.child == u8);
const strslice = string.slice(&self.language.strings);
assert(arrayInfo.len == strslice.len - 1);
var r: T = undefined;
for (strslice, 0..) |char, i|
r[i] = char;
return r;
},
.pointer => |ptrInfo| switch (ptrInfo.size) {
.slice => {
assert(ptrInfo.child == u8);
const strslice = string.slice(&self.language.strings);
var arraylist: std.ArrayList(u8) = .init(allocator);
try arraylist.ensureUnusedCapacity(strslice.len);
for (strslice) |char| if (char != 0x00)
arraylist.appendAssumeCapacity(char);
if (ptrInfo.sentinel_ptr) |some| {
const sentinel = @as(*align(1) const ptrInfo.child, @ptrCast(some)).*;
return try arraylist.toOwnedSliceSentinel(sentinel);
}
if (ptrInfo.is_const) {
arraylist.deinit();
return strslice;
} else {
arraylist.deinit();
const slice = try allocator.dupe(u8, strslice);
return @as(T, slice);
}
},
else => return error.TypeError,
},
else => return error.TypeError,
},
.array => |slice| switch (Schema) {
.array => |arrayInfo| {
assert(slice.len == arrayInfo.len);
var r: T = undefined;
for (0..slice.len) |i|
r[i] = try self.reflectT(arrayInfo.child, allocator, slice.tip + i);
return r;
},
.pointer => |ptrInfo| switch (ptrInfo.size) {
.slice => {},
else => return error.TypeError,
},
else => return error.TypeError,
},
.object => |object| switch (Schema) {
.@"struct" => |structInfo| {
if (structInfo.is_tuple) return error.TypeError;
var tip = object.tip;
var map: std.StringArrayHashMapUnmanaged(usize) = .empty;
try map.ensureTotalCapacity(allocator, object.len);
defer map.deinit(allocator);
for (0..object.len) |_| if (self.language.property_map.get(tip)) |pen| {
const key = pen.tip.slice(&self.language.properties);
map.putAssumeCapacity(key, tip);
tip += self.language.skipSlots(tip);
};
var r: T = undefined;
inline for (structInfo.fields) |field| {
if (field.is_comptime)
@panic(@typeName(T) ++ "." ++ field.name ++ " may not be a comptime field");
if (map.get(field.name)) |next_i| {
@field(r, field.name) = try self.reflectT(field.type, allocator, next_i);
} else switch (@typeInfo(field.type)) {
// Missing optional fields default to null instead of erroring.
.optional => @field(r, field.name) = null,
else => @panic("Unknown property: " ++ field.name),
}
}
return r;
},
else => return error.TypeError,
},
}
unreachable;
}
test reflectT {
const allocator = std.testing.allocator;
const text =
\\{
\\ "age": 15,
\\ "name": "Yuzu",
\\ "admin": true,
\\ "flags": 0,
\\ "union": ":D",
\\ "enum": "world"
\\}
;
var self = try allocator.create(Self);
self.* = try init(allocator, text);
defer allocator.destroy(self);
defer self.deinit(allocator);
const idx: usize = try self.parse(allocator);
const UserFlags = packed struct {
is_cool: bool = false,
is_friendly: bool = false,
};
const UserSchema = struct {
age: f64,
name: []const u8,
admin: bool,
flags: UserFlags,
@"union": union { hi: bool, bye: f64, n128: []const u8 },
@"enum": enum { hello, world },
};
const root = try self.reflectT(UserSchema, allocator, idx);
std.debug.print("hello? {s}\n", .{@tagName(root.@"enum")});
std.debug.print("friend? {s}\n", .{root.@"union".n128});
}

4
root.zig Normal file

@ -0,0 +1,4 @@
pub const Language = @import("language.zig");
pub const Tokenizer = @import("tokenizer.zig");
pub const StringPool = @import("strings.zig");
pub const Reflect = @import("reflection.zig");

108
strings.zig Normal file

@ -0,0 +1,108 @@
/// credits to Andrew Kelley
/// strings.zig
const std = @import("std");
const mem = std.mem;
const assert = std.debug.assert;
const Allocator = std.mem.Allocator;
const Self = @This();
const max_load_percent = std.hash_map.default_max_load_percentage;
string_bytes: std.ArrayListUnmanaged(u8) = .empty,
string_table: StringIndex.Table = .empty,
pub const empty = Self{
.string_bytes = .empty,
.string_table = .empty,
};
pub fn deinit(self: *Self, allocator: Allocator) void {
self.string_bytes.deinit(allocator);
self.string_table.deinit(allocator);
}
pub const StringIndex = enum(u32) {
_,
pub const Table = std.HashMapUnmanaged(StringIndex, void, TableContext, max_load_percent);
pub const TableContext = struct {
bytes: []const u8,
pub fn eql(_: @This(), a: StringIndex, b: StringIndex) bool {
return a == b;
}
pub fn hash(ctx: @This(), key: StringIndex) u64 {
return std.hash_map.hashString(mem.sliceTo(ctx.bytes[@intFromEnum(key)..], 0));
}
};
pub const TableIndexAdapter = struct {
bytes: []const u8,
pub fn eql(ctx: @This(), a: []const u8, b: StringIndex) bool {
return mem.eql(u8, a, mem.sliceTo(ctx.bytes[@intFromEnum(b)..], 0));
}
pub fn hash(_: @This(), adapted_key: []const u8) u64 {
assert(mem.indexOfScalar(u8, adapted_key, 0) == null);
return std.hash_map.hashString(adapted_key);
}
};
pub fn slice(index: StringIndex, state: *const Self) [:0]const u8 {
const start_slice = state.string_bytes.items[@intFromEnum(index)..];
return start_slice[0..mem.indexOfScalar(u8, start_slice, 0).? :0];
}
pub fn iterator(start: StringIndex, bytes: []const u8) Iterator {
return .{
.bytes = bytes,
.pos = @intFromEnum(start),
};
}
pub const Iterator = struct {
bytes: []const u8,
pos: usize = 0,
pub fn next(self: *Iterator) ?[:0]const u8 {
if (self.pos >= self.bytes.len) return null;
// Find the next null terminator starting from current position
const end_pos = mem.indexOfScalarPos(u8, self.bytes, self.pos, 0) orelse {
// No null found: return remaining bytes (invalid, but handle gracefully)
const s = self.bytes[self.pos..];
self.pos = self.bytes.len;
return s;
};
const s = self.bytes[self.pos..end_pos :0];
self.pos = end_pos + 1; // Skip the null terminator
return s;
}
};
};
pub fn add(state: *Self, allocator: Allocator, bytes: []const u8) !StringIndex {
try state.string_bytes.ensureUnusedCapacity(allocator, bytes.len + 1);
const gop = try state.string_table.getOrPutContextAdapted(
allocator,
bytes,
StringIndex.TableIndexAdapter{ .bytes = state.string_bytes.items },
StringIndex.TableContext{ .bytes = state.string_bytes.items },
);
if (gop.found_existing) return gop.key_ptr.*;
const new_off: StringIndex = @enumFromInt(state.string_bytes.items.len);
state.string_bytes.appendSliceAssumeCapacity(bytes);
state.string_bytes.appendAssumeCapacity(0);
gop.key_ptr.* = new_off;
return new_off;
}

654
test.zig Normal file

@ -0,0 +1,654 @@
const std = @import("std");
const mem = std.mem;
const testing = std.testing;
const Language = @import("language.zig");
const Tokenizer = @import("tokenizer.zig");
const allocator = std.testing.allocator;
test Language {
const text =
\\ {
\\ "cute": true,
\\ "metadata": {
\\ "post": [1,2,3,{"uwu":1}],
\\ "a": 2,
\\ "c": {
\\ "d": 4,
\\ "uwua": [[[[[1], [2]]]]],
\\ "x": true
\\ }
\\ },
\\ "b": 3
\\ }
;
var tokenizer: Tokenizer = try .init(allocator, text);
defer tokenizer.deinit(allocator);
var self = try allocator.create(Language);
defer allocator.destroy(self);
self.* = Language.init;
defer self.deinit(allocator);
const idx: usize = try self.parse(allocator, &tokenizer);
var root = try self.getValue(allocator, idx);
defer root.deinit(allocator);
try std.testing.expect(root == .object);
std.debug.print("{}\n", .{root});
}
test {
_ = @import("language.zig");
_ = @import("strings.zig");
_ = @import("tokenizer.zig");
_ = @import("reflection.zig");
}
fn expectPass(comptime path: []const u8) !void {
const file = @embedFile("tests/" ++ path);
var tokenizer: Tokenizer = try .init(allocator, file);
defer tokenizer.deinit(allocator);
var self = try allocator.create(Language);
self.* = Language.init;
defer allocator.destroy(self);
defer self.deinit(allocator);
const idx: usize = try self.parse(allocator, &tokenizer);
var root = try self.getValue(allocator, idx);
defer root.deinit(allocator);
std.debug.print("{}\n", .{root});
}
fn expectFail(comptime path: []const u8) !void {
const file = @embedFile("tests/" ++ path);
var tokenizer: Tokenizer = try .init(allocator, file);
defer tokenizer.deinit(allocator);
var self = try allocator.create(Language);
self.* = Language.init;
defer allocator.destroy(self);
defer self.deinit(allocator);
const idx: usize = self.parse(allocator, &tokenizer) catch
return;
var root = self.getValue(allocator, idx) catch
return;
defer root.deinit(allocator);
}
// zig fmt: off
test { try expectFail( "n_array_1_true_without_comma.json" ); }
test { try expectFail( "n_array_a_invalid_utf8.json" ); }
test { try expectFail( "n_array_colon_instead_of_comma.json" ); }
test { try expectFail( "n_array_comma_after_close.json" ); }
test { try expectFail( "n_array_comma_and_number.json" ); }
test { try expectFail( "n_array_double_comma.json" ); }
test { try expectFail( "n_array_double_extra_comma.json" ); }
test { try expectFail( "n_array_extra_close.json" ); }
test { try expectFail( "n_array_extra_comma.json" ); }
test { try expectFail( "n_array_incomplete.json" ); }
test { try expectFail( "n_array_incomplete_invalid_value.json" ); }
test { try expectFail( "n_array_inner_array_no_comma.json" ); }
test { try expectFail( "n_array_invalid_utf8.json" ); }
test { try expectFail( "n_array_items_separated_by_semicolon.json" ); }
test { try expectFail( "n_array_just_comma.json" ); }
test { try expectFail( "n_array_just_minus.json" ); }
test { try expectFail( "n_array_missing_value.json" ); }
test { try expectFail( "n_array_newlines_unclosed.json" ); }
test { try expectFail( "n_array_number_and_comma.json" ); }
test { try expectFail( "n_array_number_and_several_commas.json" ); }
test { try expectFail( "n_array_spaces_vertical_tab_formfeed.json" ); }
test { try expectFail( "n_array_star_inside.json" ); }
test { try expectFail( "n_array_unclosed.json" ); }
test { try expectFail( "n_array_unclosed_trailing_comma.json" ); }
test { try expectFail( "n_array_unclosed_with_new_lines.json" ); }
test { try expectFail( "n_array_unclosed_with_object_inside.json" ); }
test { try expectFail( "n_incomplete_false.json" ); }
test { try expectFail( "n_incomplete_null.json" ); }
test { try expectFail( "n_incomplete_true.json" ); }
test { try expectFail( "n_multidigit_number_then_00.json" ); }
test { try expectFail( "n_number_++.json" ); }
test { try expectFail( "n_number_+1.json" ); }
test { try expectFail( "n_number_+Inf.json" ); }
test { try expectFail( "n_number_-01.json" ); }
test { try expectFail( "n_number_-1.0..json" ); }
test { try expectFail( "n_number_-2..json" ); }
test { try expectFail( "n_number_-NaN.json" ); }
test { try expectFail( "n_number_.-1.json" ); }
test { try expectFail( "n_number_.2e-3.json" ); }
test { try expectFail( "n_number_0.1.2.json" ); }
test { try expectFail( "n_number_0.3e+.json" ); }
test { try expectFail( "n_number_0.3e.json" ); }
test { try expectFail( "n_number_0.e1.json" ); }
test { try expectFail( "n_number_0_capital_E+.json" ); }
test { try expectFail( "n_number_0_capital_E.json" ); }
test { try expectFail( "n_number_0e+.json" ); }
test { try expectFail( "n_number_0e.json" ); }
test { try expectFail( "n_number_1.0e+.json" ); }
test { try expectFail( "n_number_1.0e-.json" ); }
test { try expectFail( "n_number_1.0e.json" ); }
test { try expectFail( "n_number_1_000.json" ); }
test { try expectFail( "n_number_1eE2.json" ); }
test { try expectFail( "n_number_2.e+3.json" ); }
test { try expectFail( "n_number_2.e-3.json" ); }
test { try expectFail( "n_number_2.e3.json" ); }
test { try expectFail( "n_number_9.e+.json" ); }
test { try expectFail( "n_number_Inf.json" ); }
test { try expectFail( "n_number_NaN.json" ); }
test { try expectFail( "n_number_U+FF11_fullwidth_digit_one.json" ); }
test { try expectFail( "n_number_expression.json" ); }
test { try expectFail( "n_number_hex_1_digit.json" ); }
test { try expectFail( "n_number_hex_2_digits.json" ); }
test { try expectFail( "n_number_infinity.json" ); }
test { try expectFail( "n_number_invalid+-.json" ); }
test { try expectFail( "n_number_invalid-negative-real.json" ); }
test { try expectFail( "n_number_invalid-utf-8-in-bigger-int.json" ); }
test { try expectFail( "n_number_invalid-utf-8-in-exponent.json" ); }
test { try expectFail( "n_number_invalid-utf-8-in-int.json" ); }
test { try expectFail( "n_number_minus_infinity.json" ); }
test { try expectFail( "n_number_minus_sign_with_trailing_garbage.json" ); }
test { try expectFail( "n_number_minus_space_1.json" ); }
test { try expectFail( "n_number_neg_int_starting_with_zero.json" ); }
test { try expectFail( "n_number_neg_real_without_int_part.json" ); }
test { try expectFail( "n_number_neg_with_garbage_at_end.json" ); }
test { try expectFail( "n_number_real_garbage_after_e.json" ); }
test { try expectFail( "n_number_real_with_invalid_utf8_after_e.json" ); }
test { try expectFail( "n_number_real_without_fractional_part.json" ); }
test { try expectFail( "n_number_starting_with_dot.json" ); }
test { try expectFail( "n_number_with_alpha.json" ); }
test { try expectFail( "n_number_with_alpha_char.json" ); }
test { try expectFail( "n_number_with_leading_zero.json" ); }
test { try expectFail( "n_object_bad_value.json" ); }
test { try expectFail( "n_object_bracket_key.json" ); }
test { try expectFail( "n_object_comma_instead_of_colon.json" ); }
test { try expectFail( "n_object_double_colon.json" ); }
test { try expectFail( "n_object_emoji.json" ); }
test { try expectFail( "n_object_garbage_at_end.json" ); }
test { try expectFail( "n_object_key_with_single_quotes.json" ); }
test { try expectFail( "n_object_lone_continuation_byte_in_key_and_trailing_comma.json" ); }
test { try expectFail( "n_object_missing_colon.json" ); }
test { try expectFail( "n_object_missing_key.json" ); }
test { try expectFail( "n_object_missing_semicolon.json" ); }
test { try expectFail( "n_object_missing_value.json" ); }
test { try expectFail( "n_object_no-colon.json" ); }
test { try expectFail( "n_object_non_string_key.json" ); }
test { try expectFail( "n_object_non_string_key_but_huge_number_instead.json" ); }
test { try expectFail( "n_object_repeated_null_null.json" ); }
test { try expectFail( "n_object_several_trailing_commas.json" ); }
test { try expectFail( "n_object_single_quote.json" ); }
test { try expectFail( "n_object_trailing_comma.json" ); }
test { try expectFail( "n_object_trailing_comment.json" ); }
test { try expectFail( "n_object_trailing_comment_open.json" ); }
test { try expectFail( "n_object_trailing_comment_slash_open.json" ); }
test { try expectFail( "n_object_trailing_comment_slash_open_incomplete.json" ); }
test { try expectFail( "n_object_two_commas_in_a_row.json" ); }
test { try expectFail( "n_object_unquoted_key.json" ); }
test { try expectFail( "n_object_unterminated-value.json" ); }
test { try expectFail( "n_object_with_single_string.json" ); }
test { try expectFail( "n_object_with_trailing_garbage.json" ); }
test { try expectFail( "n_single_space.json" ); }
test { try expectFail( "n_string_1_surrogate_then_escape.json" ); }
test { try expectFail( "n_string_1_surrogate_then_escape_u.json" ); }
test { try expectFail( "n_string_1_surrogate_then_escape_u1.json" ); }
test { try expectFail( "n_string_1_surrogate_then_escape_u1x.json" ); }
test { try expectFail( "n_string_accentuated_char_no_quotes.json" ); }
test { try expectFail( "n_string_backslash_00.json" ); }
test { try expectFail( "n_string_escape_x.json" ); }
test { try expectFail( "n_string_escaped_backslash_bad.json" ); }
test { try expectFail( "n_string_escaped_ctrl_char_tab.json" ); }
test { try expectFail( "n_string_escaped_emoji.json" ); }
test { try expectFail( "n_string_incomplete_escape.json" ); }
test { try expectFail( "n_string_incomplete_escaped_character.json" ); }
test { try expectFail( "n_string_incomplete_surrogate.json" ); }
test { try expectFail( "n_string_incomplete_surrogate_escape_invalid.json" ); }
test { try expectFail( "n_string_invalid-utf-8-in-escape.json" ); }
test { try expectFail( "n_string_invalid_backslash_esc.json" ); }
test { try expectFail( "n_string_invalid_unicode_escape.json" ); }
test { try expectFail( "n_string_invalid_utf8_after_escape.json" ); }
test { try expectFail( "n_string_leading_uescaped_thinspace.json" ); }
test { try expectFail( "n_string_no_quotes_with_bad_escape.json" ); }
test { try expectFail( "n_string_single_doublequote.json" ); }
test { try expectFail( "n_string_single_quote.json" ); }
test { try expectFail( "n_string_single_string_no_double_quotes.json" ); }
test { try expectFail( "n_string_start_escape_unclosed.json" ); }
test { try expectFail( "n_string_unescaped_ctrl_char.json" ); }
test { try expectFail( "n_string_unescaped_newline.json" ); }
test { try expectFail( "n_string_unescaped_tab.json" ); }
test { try expectFail( "n_string_unicode_CapitalU.json" ); }
test { try expectFail( "n_string_with_trailing_garbage.json" ); }
test { try expectFail( "n_structure_100000_opening_arrays.json" ); }
test { try expectFail( "n_structure_U+2060_word_joined.json" ); }
test { try expectFail( "n_structure_UTF8_BOM_no_data.json" ); }
test { try expectFail( "n_structure_angle_bracket_..json" ); }
test { try expectFail( "n_structure_angle_bracket_null.json" ); }
test { try expectFail( "n_structure_array_trailing_garbage.json" ); }
test { try expectFail( "n_structure_array_with_extra_array_close.json" ); }
test { try expectFail( "n_structure_array_with_unclosed_string.json" ); }
test { try expectFail( "n_structure_ascii-unicode-identifier.json" ); }
test { try expectFail( "n_structure_capitalized_True.json" ); }
test { try expectFail( "n_structure_close_unopened_array.json" ); }
test { try expectFail( "n_structure_comma_instead_of_closing_brace.json" ); }
test { try expectFail( "n_structure_double_array.json" ); }
test { try expectFail( "n_structure_end_array.json" ); }
test { try expectFail( "n_structure_incomplete_UTF8_BOM.json" ); }
test { try expectFail( "n_structure_lone-invalid-utf-8.json" ); }
test { try expectFail( "n_structure_lone-open-bracket.json" ); }
test { try expectFail( "n_structure_no_data.json" ); }
test { try expectFail( "n_structure_null-byte-outside-string.json" ); }
test { try expectFail( "n_structure_number_with_trailing_garbage.json" ); }
test { try expectFail( "n_structure_object_followed_by_closing_object.json" ); }
test { try expectFail( "n_structure_object_unclosed_no_value.json" ); }
test { try expectFail( "n_structure_object_with_comment.json" ); }
test { try expectFail( "n_structure_object_with_trailing_garbage.json" ); }
test { try expectFail( "n_structure_open_array_apostrophe.json" ); }
test { try expectFail( "n_structure_open_array_comma.json" ); }
test { try expectFail( "n_structure_open_array_object.json" ); }
test { try expectFail( "n_structure_open_array_open_object.json" ); }
test { try expectFail( "n_structure_open_array_open_string.json" ); }
test { try expectFail( "n_structure_open_array_string.json" ); }
test { try expectFail( "n_structure_open_object.json" ); }
test { try expectFail( "n_structure_open_object_close_array.json" ); }
test { try expectFail( "n_structure_open_object_comma.json" ); }
test { try expectFail( "n_structure_open_object_open_array.json" ); }
test { try expectFail( "n_structure_open_object_open_string.json" ); }
test { try expectFail( "n_structure_open_object_string_with_apostrophes.json" ); }
test { try expectFail( "n_structure_open_open.json" ); }
test { try expectFail( "n_structure_single_eacute.json" ); }
test { try expectFail( "n_structure_single_star.json" ); }
test { try expectFail( "n_structure_trailing_#.json" ); }
test { try expectFail( "n_structure_uescaped_LF_before_string.json" ); }
test { try expectFail( "n_structure_unclosed_array.json" ); }
test { try expectFail( "n_structure_unclosed_array_partial_null.json" ); }
test { try expectFail( "n_structure_unclosed_array_unfinished_false.json" ); }
test { try expectFail( "n_structure_unclosed_array_unfinished_true.json" ); }
test { try expectFail( "n_structure_unclosed_object.json" ); }
test { try expectFail( "n_structure_unicode-identifier.json" ); }
test { try expectFail( "n_structure_whitespace_U+2060_word_joiner.json" ); }
test { try expectFail( "n_structure_whitespace_formfeed.json" ); }
test { try expectPass( "y_array_arraysWithSpaces.json" ); }
test { try expectPass( "y_array_empty-string.json" ); }
test { try expectPass( "y_array_empty.json" ); }
test { try expectPass( "y_array_ending_with_newline.json" ); }
test { try expectPass( "y_array_false.json" ); }
test { try expectPass( "y_array_heterogeneous.json" ); }
test { try expectPass( "y_array_null.json" ); }
test { try expectPass( "y_array_with_1_and_newline.json" ); }
test { try expectPass( "y_array_with_leading_space.json" ); }
test { try expectPass( "y_array_with_several_null.json" ); }
test { try expectPass( "y_array_with_trailing_space.json" ); }
test { try expectPass( "y_number.json" ); }
test { try expectPass( "y_number_0e+1.json" ); }
test { try expectPass( "y_number_0e1.json" ); }
test { try expectPass( "y_number_after_space.json" ); }
test { try expectPass( "y_number_double_close_to_zero.json" ); }
test { try expectPass( "y_number_int_with_exp.json" ); }
test { try expectPass( "y_number_minus_zero.json" ); }
test { try expectPass( "y_number_negative_int.json" ); }
test { try expectPass( "y_number_negative_one.json" ); }
test { try expectPass( "y_number_negative_zero.json" ); }
test { try expectPass( "y_number_real_capital_e.json" ); }
test { try expectPass( "y_number_real_capital_e_neg_exp.json" ); }
test { try expectPass( "y_number_real_capital_e_pos_exp.json" ); }
test { try expectPass( "y_number_real_exponent.json" ); }
test { try expectPass( "y_number_real_fraction_exponent.json" ); }
test { try expectPass( "y_number_real_neg_exp.json" ); }
test { try expectPass( "y_number_real_pos_exponent.json" ); }
test { try expectPass( "y_number_simple_int.json" ); }
test { try expectPass( "y_number_simple_real.json" ); }
test { try expectPass( "y_object.json" ); }
test { try expectPass( "y_object_basic.json" ); }
test { try expectPass( "y_object_duplicated_key.json" ); }
test { try expectPass( "y_object_duplicated_key_and_value.json" ); }
test { try expectPass( "y_object_empty.json" ); }
test { try expectPass( "y_object_empty_key.json" ); }
test { try expectPass( "y_object_escaped_null_in_key.json" ); }
test { try expectPass( "y_object_extreme_numbers.json" ); }
test { try expectPass( "y_object_long_strings.json" ); }
test { try expectPass( "y_object_simple.json" ); }
test { try expectPass( "y_object_string_unicode.json" ); }
test { try expectPass( "y_object_with_newlines.json" ); }
test { try expectPass( "y_string_1_2_3_bytes_UTF-8_sequences.json" ); }
test { try expectPass( "y_string_accepted_surrogate_pair.json" ); }
test { try expectPass( "y_string_accepted_surrogate_pairs.json" ); }
test { try expectPass( "y_string_allowed_escapes.json" ); }
test { try expectPass( "y_string_backslash_and_u_escaped_zero.json" ); }
test { try expectPass( "y_string_backslash_doublequotes.json" ); }
test { try expectPass( "y_string_comments.json" ); }
test { try expectPass( "y_string_double_escape_a.json" ); }
test { try expectPass( "y_string_double_escape_n.json" ); }
test { try expectPass( "y_string_escaped_control_character.json" ); }
test { try expectPass( "y_string_escaped_noncharacter.json" ); }
test { try expectPass( "y_string_in_array.json" ); }
test { try expectPass( "y_string_in_array_with_leading_space.json" ); }
test { try expectPass( "y_string_last_surrogates_1_and_2.json" ); }
test { try expectPass( "y_string_nbsp_uescaped.json" ); }
test { try expectPass( "y_string_nonCharacterInUTF-8_U+10FFFF.json" ); }
test { try expectPass( "y_string_nonCharacterInUTF-8_U+FFFF.json" ); }
test { try expectPass( "y_string_null_escape.json" ); }
test { try expectPass( "y_string_one-byte-utf-8.json" ); }
test { try expectPass( "y_string_pi.json" ); }
test { try expectPass( "y_string_reservedCharacterInUTF-8_U+1BFFF.json" ); }
test { try expectPass( "y_string_simple_ascii.json" ); }
test { try expectPass( "y_string_space.json" ); }
test { try expectPass( "y_string_surrogates_U+1D11E_MUSICAL_SYMBOL_G_CLEF.json" ); }
test { try expectPass( "y_string_three-byte-utf-8.json" ); }
test { try expectPass( "y_string_two-byte-utf-8.json" ); }
test { try expectPass( "y_string_u+2028_line_sep.json" ); }
test { try expectPass( "y_string_u+2029_par_sep.json" ); }
test { try expectPass( "y_string_uEscape.json" ); }
test { try expectPass( "y_string_uescaped_newline.json" ); }
test { try expectPass( "y_string_unescaped_char_delete.json" ); }
test { try expectPass( "y_string_unicode.json" ); }
test { try expectPass( "y_string_unicodeEscapedBackslash.json" ); }
test { try expectPass( "y_string_unicode_2.json" ); }
test { try expectPass( "y_string_unicode_U+10FFFE_nonchar.json" ); }
test { try expectPass( "y_string_unicode_U+1FFFE_nonchar.json" ); }
test { try expectPass( "y_string_unicode_U+200B_ZERO_WIDTH_SPACE.json" ); }
test { try expectPass( "y_string_unicode_U+2064_invisible_plus.json" ); }
test { try expectPass( "y_string_unicode_U+FDD0_nonchar.json" ); }
test { try expectPass( "y_string_unicode_U+FFFE_nonchar.json" ); }
test { try expectPass( "y_string_unicode_escaped_double_quote.json" ); }
test { try expectPass( "y_string_utf8.json" ); }
test { try expectPass( "y_string_with_del_character.json" ); }
test { try expectPass( "y_structure_lonely_false.json" ); }
test { try expectPass( "y_structure_lonely_int.json" ); }
test { try expectPass( "y_structure_lonely_negative_real.json" ); }
test { try expectPass( "y_structure_lonely_null.json" ); }
test { try expectPass( "y_structure_lonely_string.json" ); }
test { try expectPass( "y_structure_lonely_true.json" ); }
test { try expectPass( "y_structure_string_empty.json" ); }
test { try expectPass( "y_structure_trailing_newline.json" ); }
test { try expectPass( "y_structure_true_in_array.json" ); }
test { try expectPass( "y_structure_whitespace_array.json" ); }
test { try expectPass( "y_structure_true_in_array.json" ); }
test { try expectPass( "y_structure_whitespace_array.json" ); }
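Every `y_*` case above funnels through a single `expectPass` helper whose definition is not shown in this hunk. A minimal sketch of the assumed shape (the `parse` entry point and the `tests/` fixture directory are assumptions, not confirmed by this diff):

```zig
const std = @import("std");

// Sketch only: the real helper lives elsewhere in this file.
// `parse` and the "tests/" fixture path are assumed names.
fn expectPass(comptime name: []const u8) !void {
    // Fixtures are embedded at compile time, so `name` must be comptime-known.
    const data = @embedFile("tests/" ++ name);
    var arena = std.heap.ArenaAllocator.init(std.testing.allocator);
    defer arena.deinit();
    // A y_* fixture from JSONTestSuite must parse without error;
    // the parsed value itself is discarded.
    _ = try parse(arena.allocator(), data);
}
```

A matching `expectFail` for the `n_*` fixtures below would presumably invert this, treating a successful parse as a test failure.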

@ -0,0 +1 @@
[123.456e-789]

@ -0,0 +1 @@
[0.4e00669999999999999999999999999999999999999999999999999999999999999999999999999999999999999999999999999999999999999999999969999999006]

@ -0,0 +1 @@
[-1e+9999]

@ -0,0 +1 @@
[1.5e+9999]

@ -0,0 +1 @@
[-123123e100000]

@ -0,0 +1 @@
[123123e100000]

@ -0,0 +1 @@
[123e-10000000]

@ -0,0 +1 @@
[-123123123123123123123123123123]

@ -0,0 +1 @@
[100000000000000000000]

@ -0,0 +1 @@
[-237462374673276894279832749832423479823246327846]

@ -0,0 +1 @@
{"\uDFAA":0}

@ -0,0 +1 @@
["\uDADA"]

@ -0,0 +1 @@
["\uD888\u1234"]

Binary file not shown.

@ -0,0 +1 @@
["譌・ム淫"]

@ -0,0 +1 @@
["<22><><EFBFBD>"]

@ -0,0 +1 @@
["\uD800\n"]

@ -0,0 +1 @@
["\uDd1ea"]

@ -0,0 +1 @@
["\uD800\uD800\n"]

@ -0,0 +1 @@
["\ud800"]

@ -0,0 +1 @@
["\ud800abc"]

@ -0,0 +1 @@
["<22>"]

@ -0,0 +1 @@
["\uDd1e\uD834"]

@ -0,0 +1 @@
["И"]

@ -0,0 +1 @@
["\uDFAA"]

@ -0,0 +1 @@
["<22>"]

@ -0,0 +1 @@
["<22><><EFBFBD><EFBFBD>"]

@ -0,0 +1 @@
["<22><>"]

@ -0,0 +1 @@
["<22>ソソソソ"]

@ -0,0 +1 @@
["<22>€€€€"]

@ -0,0 +1 @@
["<22><>"]

Binary file not shown.

Binary file not shown.

@ -0,0 +1 @@
[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]

@ -0,0 +1 @@
{}

@ -0,0 +1 @@
[1 true]

@ -0,0 +1 @@
[a

@ -0,0 +1 @@
["": 1]

@ -0,0 +1 @@
[""],

@ -0,0 +1 @@
[,1]

@ -0,0 +1 @@
[1,,2]

@ -0,0 +1 @@
["x",,]

@ -0,0 +1 @@
["x"]]

@ -0,0 +1 @@
["",]

@ -0,0 +1 @@
["x"

@ -0,0 +1 @@
[x

@ -0,0 +1 @@
[3[4]]

@ -0,0 +1 @@
[<EFBFBD>]

@ -0,0 +1 @@
[1:2]

tests/n_array_just_comma.json Executable file

@ -0,0 +1 @@
[,]

tests/n_array_just_minus.json Executable file

@ -0,0 +1 @@
[-]

@ -0,0 +1 @@
[ , ""]

@ -0,0 +1,3 @@
["a",
4
,1,

@ -0,0 +1 @@
[1,]

@ -0,0 +1 @@
[1,,]

@ -0,0 +1 @@
[" a"\f]

tests/n_array_star_inside.json Executable file

@ -0,0 +1 @@
[*]

@ -0,0 +1 @@
[""

@ -0,0 +1 @@
[1,

@ -0,0 +1,3 @@
[1,
1
,1

@ -0,0 +1 @@
[{}

@ -0,0 +1 @@
[fals]

@ -0,0 +1 @@
[nul]

@ -0,0 +1 @@
[tru]

Binary file not shown.

tests/n_number_++.json Normal file

@ -0,0 +1 @@
[++1234]

tests/n_number_+1.json Executable file

@ -0,0 +1 @@
[+1]

tests/n_number_+Inf.json Executable file

@ -0,0 +1 @@
[+Inf]

tests/n_number_-01.json Executable file

@ -0,0 +1 @@
[-01]

tests/n_number_-1.0..json Executable file

@ -0,0 +1 @@
[-1.0.]

tests/n_number_-2..json Executable file

@ -0,0 +1 @@
[-2.]

tests/n_number_-NaN.json Executable file

@ -0,0 +1 @@
[-NaN]

tests/n_number_.-1.json Normal file

@ -0,0 +1 @@
[.-1]

tests/n_number_.2e-3.json Executable file

@ -0,0 +1 @@
[.2e-3]

tests/n_number_0.1.2.json Executable file

@ -0,0 +1 @@
[0.1.2]

@ -0,0 +1 @@
[0.3e+]

tests/n_number_0.3e.json Normal file

@ -0,0 +1 @@
[0.3e]

tests/n_number_0.e1.json Normal file

@ -0,0 +1 @@
[0.e1]

@ -0,0 +1 @@
[0E+]

@ -0,0 +1 @@
[0E]

tests/n_number_0e+.json Normal file

@ -0,0 +1 @@
[0e+]

tests/n_number_0e.json Normal file

@ -0,0 +1 @@
[0e]

tests/n_number_1.0e+.json Executable file

@ -0,0 +1 @@
[1.0e+]

tests/n_number_1.0e-.json Executable file

@ -0,0 +1 @@
[1.0e-]

tests/n_number_1.0e.json Executable file

@ -0,0 +1 @@
[1.0e]

tests/n_number_1_000.json Executable file

@ -0,0 +1 @@
[1 000.0]

tests/n_number_1eE2.json Executable file

@ -0,0 +1 @@
[1eE2]

tests/n_number_2.e+3.json Executable file

@ -0,0 +1 @@
[2.e+3]

tests/n_number_2.e-3.json Executable file

@ -0,0 +1 @@
[2.e-3]

Some files were not shown because too many files have changed in this diff.