MakeWebGames

MySQL JSON field type


Alan


Anyone have any experience with this? I have been experimenting with it and, while it looks promising, I wonder if there are downsides I've not encountered yet. For example, the query below regenerates player attributes (energy, bravery, health, will, etc.) to microsecond accuracy; the attributes are stored in key-value form, with a JSON definition field for each attribute.

UPDATE `player_attribute` pa
LEFT JOIN `attribute` a ON pa.`attribute_id` = a.`id`
LEFT JOIN `player` p ON pa.`player_id` = p.`id`
SET pa.`fval` = GREATEST(
  a.`definition`->"$.minimum",
  LEAST(
    pa.`fval` + (a.`definition`->"$.rate_per_hour" * (UNIX_TIMESTAMP(NOW(6)) - UNIX_TIMESTAMP(p.`regenerated`))) / 3600.0,
    a.`definition`->"$.maximum"
  )
),
p.`regenerated` = NOW(6)
WHERE pa.`player_id` = :player_id
AND a.`definition`->"$.type" = "fval"
AND a.`definition`->"$.rate_per_hour" IS NOT NULL;
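For anyone following the arithmetic, the per-attribute logic the query applies — accrue `rate_per_hour` over the elapsed time, then clamp between the JSON `minimum` and `maximum` — can be sketched outside the database. This is a rough Python illustration of the same GREATEST/LEAST expression, not the query itself; the JSON field names mirror the definition column above:

```python
import json

def regenerate(fval, definition_json, elapsed_seconds):
    """Accrue rate_per_hour over elapsed time, clamped to [minimum, maximum].

    Mirrors GREATEST(minimum, LEAST(accrued, maximum)) in the SQL above.
    """
    d = json.loads(definition_json)
    accrued = fval + d["rate_per_hour"] * elapsed_seconds / 3600.0
    return max(d["minimum"], min(accrued, d["maximum"]))

# Example: energy regenerating at 10/hour, capped at 100
definition = '{"type": "fval", "minimum": 0, "maximum": 100, "rate_per_hour": 10}'
print(regenerate(95.0, definition, 1800))  # 30 minutes accrues 5, capped -> 100.0
print(regenerate(50.0, definition, 1800))  # -> 55.0
```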

 


I recently (last week) had to come up with a query working with JSON data, and it seemed to work quite well. The one real hassle was getting it to work with LIKE and array matches; other than that, I found it quite useful.
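For reference, the array-matching hassle usually comes down to `JSON_CONTAINS` / `JSON_SEARCH` semantics in SQL (candidate values must themselves be quoted as JSON). Once the document is decoded in application code the same check is trivial — a hypothetical Python sketch, not Script47's actual query:

```python
import json

# A row's JSON column as it might come back from MySQL (hypothetical data)
row = '{"tags": ["weapon", "rare"], "name": "longsword"}'

doc = json.loads(row)
# In SQL this needs JSON_CONTAINS(`data`, '"rare"', '$.tags') -- note the
# inner quotes; in application code it is a plain membership test.
print("rare" in doc.get("tags", []))  # True
```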

Edited by Script47

I once considered using this type, but realised it isn't available in any version below MySQL 5.7, meaning we couldn't use it as part of the framework while we support older versions of MySQL.

I feel like most situations requiring this field type should really have their data architected differently, but I'll be happily proven wrong.


22 minutes ago, Dave said:

I once considered using this type, but realised it isn't available in any version below MySQL 5.7, meaning we couldn't use it as part of the framework while we support older versions of MySQL.

I feel like most situations requiring this field type should really have their data architected differently, but I'll be happily proven wrong.

@Dave

In our case, we were saving the response from an API. Although the API was brilliant at what it did, it was extremely haphazard in what it returned, so we'd store the whole response as JSON and simply loop through the keys and values when showing the output. Sure, there were a couple of common fields, but other than that it would differ.
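That pattern — render the couple of known fields explicitly, then fall back to a generic key/value loop for whatever else the API returned — might look something like this (field names are hypothetical, purely for illustration):

```python
import json

KNOWN_FIELDS = {"id", "status"}  # the couple of common fields (hypothetical names)

def render(response_json):
    """Show known fields first, then whatever other keys this response has."""
    doc = json.loads(response_json)
    lines = [f"{k}: {doc[k]}" for k in sorted(KNOWN_FIELDS & doc.keys())]
    # Everything else differs per response, so just walk whatever keys exist.
    lines += [f"{k}: {v}" for k, v in sorted(doc.items()) if k not in KNOWN_FIELDS]
    return "\n".join(lines)

print(render('{"id": 7, "status": "ok", "vendor_ref": "X1"}'))
```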

Edited by Script47

12 minutes ago, Script47 said:

@Dave

In our case, we were saving the response from an API. Although the API was brilliant at what it did, it was extremely haphazard in what it returned, so we'd store the whole response as JSON and simply loop through the keys and values when showing the output. Sure, there were a couple of common fields, but other than that it would differ.

Ah okay! That makes sense. I’d personally want a stricter data architecture as it’ll be less prone to missing data. But in reality it depends on the business case!

Thanks for sharing! 

 


19 minutes ago, Dave said:

Ah okay! That makes sense. I’d personally want a stricter data architecture as it’ll be less prone to missing data. But in reality it depends on the business case!

Thanks for sharing! 

 

Ideally, yes. However, we have two issues:

1. The majority of our clients are part of the government in one way or another, so they don't like change, and their previous software used this API, so we have to make use of it.

2. The API we are using is run by a behemoth of a company and, from what I know, they are the "best", apparently.


11 minutes ago, Alan said:

Oddly enough, I never considered storing API response data in a JSON field ... makes a lot of sense considering the work we do. Might have to investigate that.

The only reason I would do this is for debugging, unless you have some strange third-party API; but even then, it sounds like @Script47 is doing the same stuff, just on the database side.


30 minutes ago, Script47 said:

We have a lot of auditing requirements, so that's another reason. Everything has to be recorded, and records have to be kept.

Doesn't it make more sense to store those sorts of logs on disk? Then you can encrypt them and throw them onto another server.


11 minutes ago, Dave said:

Doesn't it make more sense to store those sorts of logs on disk? Then you can encrypt them and throw them onto another server.

We also need to display the audit data easily, so the database is the best place for it.

Edited by Script47

We have ~20,000 records of API data per day that need to be stored, so a JSON field might be handy. We don't specifically need to query data within it, as I pre-process the data and simply cache it in long text fields, but it ~may~ prove useful for future-proofing. That said, the data could be compressed; even a simple LZ compression would yield good results, so a BLOB field, possibly with a dynamically created JSON field, may suffice.
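To put rough numbers on the compression idea: zlib (DEFLATE, an LZ77 variant) is in the Python standard library and typically shrinks repetitive JSON considerably before it goes into a BLOB. This is illustrative only — the actual ratio depends entirely on the data:

```python
import json
import zlib

# Simulate a repetitive API payload: many records sharing the same keys
records = [{"id": i, "status": "ok", "source": "api"} for i in range(500)]
raw = json.dumps(records).encode("utf-8")
packed = zlib.compress(raw, level=6)

print(len(raw), len(packed))  # compressed size is far smaller for data like this
assert json.loads(zlib.decompress(packed)) == records  # round-trips losslessly
```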


3 hours ago, Alan said:

Oddly enough, I never considered storing API response data in a JSON field ... makes a lot of sense considering the work we do. Might have to investigate that.

We used to do it for inbound API requests and some outbound API requests in my previous workplace but we used MongoDB to store it since JSON stores very nicely there (also very easy to search compared to MySQL Json fields)


We were actually scoping out using this for a large-scale web application platform not too long ago. Data-architecture-wise, the JSON field type is limited in size to the MySQL max_allowed_packet setting according to the documentation, which may end up making it smaller than MEDIUMTEXT depending on server config.

Other database management systems (DBMS), like the one @sniko mentioned, no doubt do it better.

3 hours ago, Alan said:

have ~ 20,000 records per day of API data that needs stored

We were looking to save 50k products, so we might have up to 50k × a few bytes. On max_allowed_packet: I had a look at the current MySQL server, and the variable is set to 1048576 (1 MB).

Interestingly enough, MEDIUMTEXT is actually bigger (16 MB, AFAIK).
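A quick sanity check along those lines, assuming the 1048576-byte value quoted above: measure the encoded payload before deciding between a JSON column and MEDIUMTEXT/BLOB (hypothetical sketch, limits taken from this thread):

```python
import json

MAX_ALLOWED_PACKET = 1_048_576   # 1 MB, the server value quoted above
MEDIUMTEXT_LIMIT = 16_777_215    # ~16 MB

def fits_json_column(obj) -> bool:
    """True if the serialised document stays within max_allowed_packet."""
    return len(json.dumps(obj).encode("utf-8")) <= MAX_ALLOWED_PACKET

small = {"sku": "A-1", "price": 9.99}
print(fits_json_column(small))                      # True
print(fits_json_column({"blob": "x" * 2_000_000}))  # False: too big for 1 MB
```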

