javascript, jquery, json, single-page-application, geojson

Where to place function that loads json data?


I'm working on a project that uses .json files as its main source of data. The application is built with Bootstrap, JavaScript/jQuery, and HTML5. The code was written a few months ago, and I'm trying to improve its efficiency and bring it up to date.

The first thing I noticed while reviewing the code was the way the data is loaded. There are a few .json files used for different screens, and they are requested from all over the place in different parts of the code.

Also, every onclick handler, for example, reloads the .json file. There is no reason to do that, since the data is updated only once a month. Why not load it only once, the first time the application loads, and keep the data in a JS object?

Is that good practice, or is there something better? Here is an example of how I'm thinking of updating the code:

var jsonData = {};
$(document).ready(function() {
  $.getJSON('data.json', function(data){ 
    // Load JSON data in JS object.
    jsonData = data;
  });
});

Should the code above be placed in the HTML head or in the body? I know that nowadays .js files are included at the bottom of the body tag and all .css goes in the head. Is there any difference when it comes to loading JSON files? If anyone has suggestions, please let me know. The JSON files have around 600+ records with multiple fields (over 30), and that might change in the future. So if these files get bigger, I need to make sure that won't affect the overall efficiency of the application.


Solution

  • In my view you are correct: the files shouldn't be loaded from an onclick event. I agree with you that you should load them beforehand.
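
    For example, instead of re-fetching the file in every handler (the selector and renderScreen() below are just placeholders for whatever your handlers do with the data):

        $('#refresh').on('click', function() {
            $.getJSON('data.json', function(data) {
                renderScreen(data);   // hits the network on every click
            });
        });

    the handler can simply reuse the object that was filled once at startup:

        $('#refresh').on('click', function() {
            renderScreen(jsonData);   // the copy loaded in $(document).ready()
        });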

    The correct place to load them is before any JS code that uses them. Scripts are placed at the bottom of the page because the DOM has to be loaded already for the JS code to work on it, so it's natural to describe the page first and then load the code that runs on it.
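
    As a rough sketch of that layout (the file names are placeholders):

        <!DOCTYPE html>
        <html>
        <head>
            <link rel="stylesheet" href="styles.css">   <!-- CSS stays in the head -->
        </head>
        <body>
            <!-- page markup here -->

            <!-- scripts at the end of the body, after the DOM they work on -->
            <script src="jquery.min.js"></script>
            <script src="app.js"></script>   <!-- the $.getJSON() snippet from the question -->
        </body>
        </html>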

    Also, 600+ records, even with 30 fields, is a minimal amount of data that fits comfortably in memory. I would load all the JSON files beforehand and use them directly from a variable in memory. If you expect this to grow a lot (by a lot I mean 100,000+ records), then I would use localStorage for that.
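
    If it ever does grow that much, a simple caching scheme could look like this sketch (the storage key and callback are placeholders):

        var jsonData = null;

        function loadData(done) {
            // Reuse the copy cached in localStorage, if there is one.
            var cached = localStorage.getItem('appData');
            if (cached) {
                jsonData = JSON.parse(cached);
                done(jsonData);
                return;
            }
            // Otherwise fetch the file once and cache it for later visits.
            $.getJSON('data.json', function(data) {
                jsonData = data;
                localStorage.setItem('appData', JSON.stringify(data));
                done(jsonData);
            });
        }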

    I'll give you another option, though: in one of my systems I load approximately 25,000 records into a full-blown in-memory database, this takes well under 1 s, and a select against this database is immediate. You have full SQL available, so this could be a good approach for you. I'm talking about SQLite compiled to JavaScript: https://github.com/kripken/sql.js/

    I tested a few in-memory databases and I strongly recommend this one.

    Edit

    Answering @expresso_coffee:

    I use the following code to import the JSON into SQLite (I use RequireJS):

    define(['jquery', 'sqlite', 'json!data/data.json'],
    function($, sqlite, jsonData) {
    
        var self = {};
    
        var db;
    
        function createDb() {
            return new Promise((res)=>{
                db = new sqlite.Database();
                db.run("CREATE VIRTUAL TABLE usuarios USING fts4(field1 int, field2 text, field3 text, field4 text, field5 text, field6 text, field7 text);");
                res(1);
            })
        }
    
        function populateDB( jsonData ) {
            return new Promise((res)=>{
                var stmt = db.prepare("INSERT INTO usuarios VALUES (?,?,?,?,?,?,?)");   // table name matches the CREATE above
                db.run("BEGIN TRANSACTION");
                jsonData.list.forEach((rec)=>{
                    stmt.run([rec.field1, rec.field2, rec.field3, rec.field4, rec.field5, rec.field6, rec.field7]);
                })
                stmt.free();   // release the prepared statement
                db.run("END");
                updateDOM();
                res(1);
            });
        }
    
        (...)
    

    This is the code that loads the 25000 records in a split second.
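
    Once the table is populated, reading from it is just a prepared SELECT. A minimal sketch using the sql.js statement API and the usuarios table created above (findByField2 is only an example name):

        function findByField2(value) {
            var stmt = db.prepare("SELECT * FROM usuarios WHERE field2 MATCH ?");
            stmt.bind([value]);
            var rows = [];
            while (stmt.step()) {        // iterate over the result rows
                rows.push(stmt.getAsObject());
            }
            stmt.free();                 // release the prepared statement
            return rows;
        }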