8 comments

  • BugsJustFindMe 35 days ago
    Do you handle JSON numbers safely by default, or do you require people to write their own deserializers for numbers that would lose precision when coerced into Python's float type? The most common mistake I see JSON libraries make is using fixed-precision floating point somewhere in the pipeline, even though JSON's number type specifies no such limitation. That causes silent precision loss unless people catch the problem and do their own pre-serialization handling.
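    For illustration, the stdlib json module has this exact trap, and also the usual escape hatch (this is about the stdlib, not Jsonic specifically):

        import json
        from decimal import Decimal

        raw = '{"amount": 0.1234567890123456789}'

        # Default behaviour: JSON numbers become Python floats, and digits
        # beyond double precision are silently rounded away.
        print(json.loads(raw)["amount"])

        # Opting into Decimal keeps the full textual precision intact.
        print(json.loads(raw, parse_float=Decimal)["amount"])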
  • orrbenyamini 42 days ago
    Hi HN - I'm the author of Jsonic.
    I built it after repeatedly running into friction with Python's built-in json module when working with classes, dataclasses, nested objects, and type hints.
    Jsonic focuses on:
    - Zero-boilerplate serialization and deserialization
    - Strict type validation with clear errors
    - Natural support for dataclasses, enums, tuples, sets, nested objects, etc.
    - Optional field exclusion (e.g. hiding sensitive data)
    - Extra features like transient field definitions, support for __slots__ classes, etc.
    - Clean interop with Pydantic models
    The goal is to make JSON round-tripping feel Pythonic and predictable without writing to_dict() / from_dict() everywhere.
    I'd really appreciate feedback on the API design and tradeoffs.
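    To make the "friction" concrete, here is the kind of boilerplate the stdlib pushes you into with nested dataclasses (plain json, nothing Jsonic-specific):

        import json
        from dataclasses import dataclass, asdict

        @dataclass
        class Address:
            street: str
            city: str

        @dataclass
        class User:
            name: str
            address: Address

        user = User("Ada", Address("Main St", "London"))

        # json.dumps(user) raises TypeError ("Object of type User is not JSON
        # serializable"), so every round trip needs manual conversion code.
        payload = json.dumps(asdict(user))
        data = json.loads(payload)
        user_again = User(name=data["name"], address=Address(**data["address"]))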
    • memoriuaysj 35 days ago
      All the quoted Python code on the Medium post has broken formatting.
      Your comment above has the same broken formatting.
      Does not inspire confidence if you can't spot such obvious breakage.
      • orrbenyamini 32 days ago
        Appreciate the feedback - the formatting completely broke when pasting the code snippets into Medium.
        I've fixed the article formatting and addressed some of the feedback I got on it.
        Thanks for investing the time to read it!
    • zahlman 35 days ago
      > after repeatedly running into friction
      Could you be more specific?
    • DonHopkins 35 days ago
      [dead]
  • woodruffw 35 days ago
    The degree of LLM writing here makes it hard to determine which parts of this are novel and which parts are derivations of existing popular libraries like Pydantic and msgspec.
    I also don't think either Pydantic or msgspec struggles with any of the "gotcha" cases in the post. Both can understand enums, type tagging, literals, etc.
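    For example, with Pydantic v2 an enum field round-trips with no extra code (quick sketch from memory, so double-check the details):

        from enum import Enum
        from pydantic import BaseModel

        class Color(Enum):
            RED = "red"
            BLUE = "blue"

        class Widget(BaseModel):
            name: str
            color: Color

        w = Widget(name="knob", color=Color.RED)
        print(w.model_dump_json())  # {"name":"knob","color":"red"}
        print(Widget.model_validate_json(w.model_dump_json()).color)  # Color.RED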
  • xml 35 days ago
    Were there any particular challenges when implementing your library? I have implemented my own serialization library [1] (with a focus on not allowing arbitrary code execution), but had skipped dataclasses for now, since they seemed difficult to get right. What was your experience?
    [1] https://github.com/99991/safeserialize
    Side note: I think that a warning in the README about arbitrary code execution for deserialization of untrusted inputs would be nice.
    • orrbenyamini 32 days ago
      Good question! Dataclasses were actually pretty easy - Python's introspection tools made them straightforward.
      The tricky parts were:
      - Type hints - mapping __init__ params to attributes, especially with complex types
      - Preserving types - keeping tuples as tuples and sets as sets (not just lists)
      - Error messages - tracking paths like obj.address.street through the whole pipeline
      I checked out safeserialize, by the way; the focus on preventing arbitrary code execution is a really smart niche.
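      For the curious, this is roughly the kind of introspection that makes dataclasses easy (a simplified sketch, not the actual Jsonic internals):

          import dataclasses
          import typing

          @dataclasses.dataclass
          class Address:
              street: str
              tags: tuple[str, ...]

          # fields() plus get_type_hints() give a name -> declared-type mapping,
          # which is what drives both validation and rebuilding the right
          # container types (e.g. a JSON list back into a tuple).
          hints = typing.get_type_hints(Address)
          for f in dataclasses.fields(Address):
              print(f.name, hints[f.name])
          # street <class 'str'>
          # tags tuple[str, ...]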
  • dcreater 35 days ago
    The article would benefit from a very clear and explicit section on Pydantic's model_dump_json() vs your tool, as that's the primary thing your tool is likely competing against.
  • mukundesh 35 days ago
    Thanks for sharing. Could you please comment on the performance aspect vis-a-vis the JSON reader/writer provided by Pydantic?
  • fucalost 41 days ago
    Sorry to be a hater, but wouldn’t using Pydantic be better in almost every circumstance here?
    • orrbenyamini 41 days ago
      Pydantic is a great lib and has many advantages over Jsonic.
      I think the main use cases for Jsonic over Pydantic are:
      - You already have plain Python classes or dataclasses and don't want to convert them to BaseModel
      - You prefer minimal intrusion - no inheritance, no decorators, no schema definitions
      - You need to serialize and deserialize Pydantic models alongside non-Pydantic classes
      Having said that, Pydantic is the better choice in most cases.
      This is also why Jsonic integrates natively with Pydantic, so you can serialize Pydantic models using Jsonic out of the box.
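      As a rough illustration of the "minimal intrusion" point (the function names below are placeholders, not necessarily the real Jsonic API - see the README for the actual entry points):

          from dataclasses import dataclass

          # A plain dataclass: no base class, no decorators, no schema definition.
          @dataclass
          class User:
              name: str
              age: int

          # Hypothetical round trip; serialize/deserialize stand in for whatever
          # the library actually exposes.
          # payload = serialize(User("Ada", 36))
          # restored = deserialize(payload)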
      • japborst 41 days ago
        I can see that. Pydantic is great but relatively slow (which matters on edge devices) and can be bloated.<p>The fact that all your projects use Pydantic makes it an easy starting point and created standardisation - of course.<p>Nevertheless, I can definitely see some use-cases for lightweight JSON-serialisation without bringing in Pydantic. Dataclasses are great, but lack proper json handling.
  • leobg 41 days ago
    Looks useful. Will try it out. Thanks for making it.