Goal: create a React web app that works offline via Service Workers and can query a SQLite file without any backend. I'm running Linux.
I'm struggling with SQLite in a React web application. I'm using create-react-app and trying to query a SQLite database.
I can't even get sqlite3 installed as an npm package, because it requires a dependency called aws-sdk, and after installing that package the build outputs the following:
/node_modules/sqlite3/node_modules/node-pre-gyp/lib/util/compile.js
Module not found: Can't resolve 'npm' in '/home/user/Documents/snv3/node_modules/sqlite3/node_modules/node-pre-gyp/lib/util'
./node_modules/sqlite3/node_modules/node-pre-gyp/lib/util/nw-pre-gyp/index.html 1:0
Module parse failed: Unexpected token (1:0)
You may need an appropriate loader to handle this file type.
> <!doctype html>
| <html>
| <head>
Then... I moved on and searched for another SQLite-supporting library: better-sqlite3.
When better-sqlite3 is imported, this is the output on my browser screen:
TypeError: Class is undefined
exports.wrap
node_modules/babel-loader/lib/index.js??ref--6-oneOf-2!/home/user/Documents/projectFolder/node_modules/better-sqlite3/lib/util.js:14
11 | };
12 |
13 | exports.wrap = function (Class, methodName, wrapper) {
> 14 | var originalMethod = Class.prototype[methodName];
15 |
16 | if (typeof originalMethod !== 'function') {
17 | throw new TypeError("Missing method ".concat(methodName));
I have no clue what that is supposed to mean. There is nothing about it in the library's troubleshooting section, so it's impossible to debug.
Let's try something lightweight and written in pure JavaScript, then.
Oh no, my memory heap just exploded trying to compile it:
Starting the development server...
<--- Last few GCs --->
…tart of marking 2696 ms) (average mu = 0.196, current mu = 0.078) alloca…
[19981:0x317f660] 30702 ms: Mark-sweep 1387.3 (1425.7) -> 1382.8 (1424.7) MB, 2779.6 / 0.0 ms (+ 0.1 ms in 91 steps since start of marking, biggest step 0.0 ms, walltime since start of marking 2806 ms) (average mu = 0.104, current mu = 0.010) allocati…
[19981:0x317f660] 33503 ms: Mark-sweep 1386.7 (1424.7) -> 1385.2 (1425.7) MB, 2780.3 / 0.0 ms (average mu = 0.056, current mu = 0.007) allocation failure scavenge might not succeed
<--- JS stacktrace --->
==== JS stack trace =========================================
0: ExitFrame [pc: 0x30bcb58041bd]
Security context: 0x346742d1e789 <JSObject>
1: get [0x146f266b8549] [/home/user/Documents/projectFolder/node_modules/@babel/traverse/lib/path/index.js:~99] [pc=0x30bcb6347fd5](this=0x2c6f95aa7759 <JSFunction NodePath (sfi =0x3a532511d061)>,/* anonymous */=0xb1cce8a8af9 <Object map = 0x3450d4054639>)
2: /* anonymous */(aka /* anonymous */) [0xb1cce8a8899] [/home/user/Documents/projectFolder/node_modul...
FATAL ERROR: Ineffective mark-compacts near heap limit Allocation failed - JavaScript heap out of memory
1: 0x8b8210 node::Abort() [/usr/local/bin/node]
2: 0x8b825c [/usr/local/bin/node]
3: 0xac1d1e v8::Utils::ReportOOMFailure(v8::internal::Isolate*, char const*, bool) [/usr/local/bin/node]
4: 0xac1f38 v8::internal::V8::FatalProcessOutOfMemory(v8::internal::Isolate*, char const*, bool) [/usr/local/bin/node]
5: 0xeb11f2 [/usr/local/bin/node]
6: 0xebd14a v8::internal::Heap::PerformGarbageCollection(v8::internal::GarbageCollector, v8::GCCallbackFlags) [/usr/local/bin/node]
7: 0xebdab4 v8::internal::Heap::CollectGarbage(v8::internal::AllocationSpace, v8::internal::GarbageCollectionReason, v8::GCCallbackFlags) [/usr/local/bin/node]
8: 0xec03e5 v8::internal::Heap::AllocateRawWithRetry(int, v8::internal::AllocationSpace, v8::internal::AllocationAlignment) [/usr/local/bin/node]
9: 0xe888c4 v8::internal::Factory::NewFillerObject(int, bool, v8::internal::AllocationSpace) [/usr/local/bin/node]
10: 0x112a2ae v8::internal::Runtime_AllocateInNewSpace(int, v8::internal::Object**, v8::internal::Isolate*) [/usr/local/bin/node]
11: 0x30bcb58041bd
Aborted (core dumped)
Okay, forget SQLite libraries, let's try PouchDB!
So, after all this mess, I gave up on SQLite libraries and tried something offline-first friendly: PouchDB. I thought it would be possible to load a SQLite file into Pouch.
So I asked this question: What is the correct way to load prebuilt SQLite databases, with or without PouchDB, in a React app?
Nope: it's a NoSQL database, so that is not a possibility...
Then I had to convert my whole SQLite database to .json files, and I did it all manually...
Next, let's create the database and import the data from the .json files.
Edit 1:
My SQLite file has close to 5k registries (rows) in each table, so every .json file I converted from a SQLite table holds about 5k objects, or docs, as you prefer. There are 19 of those .json files, each one representing a table.
AAAAND... I need to switch between the .json files (tables) to run the queries, but after the conversion the docs have no _rev attribute, so every time I load the registries using bulkDocs, conflicts happen between the tables, because the docs lack the _rev attribute.
This is the code of my db service:
import PouchDB from 'pouchdb';
import PDBFind from 'pouchdb-find';
import PDBSilverLining from 'pouchdb-silverlining';
import TableEnumerator from '../utils/tableEnumerator';
import {
  Example_Table_Name
} from '../../db';

PouchDB.plugin(PDBFind);
PouchDB.plugin(PDBSilverLining);

class DbService {
  constructor() {
    this._db = new PouchDB('database');
  }

  static async dbLoad(table) {
    const dbInstance = new DbService()._db;
    try {
      // TableEnumerator maps a table name to the docs from its .json file
      await dbInstance.bulkDocs(TableEnumerator(table));
    } catch (e) {
      console.error(e);
    }
  }

  static query(query) {
    const dbInstance = new DbService()._db;
    // .sql() is provided by the pouchdb-silverlining plugin
    return dbInstance.sql(query);
  }
}

export default DbService;
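For reference, this is roughly how the service gets called (a minimal usage sketch only; the table name and the query are made up, and I haven't checked exactly which SQL subset pouchdb-silverlining supports):

// Hypothetical usage, inside an async function or effect;
// 'table1' and the query are illustrative only.
await DbService.dbLoad('table1'); // bulk-insert table1's docs
const result = await DbService.query("SELECT * FROM table1 WHERE field5 = '2B'");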
Example docs:
Table 1:
{
  "_id": "table1_010101012",
  "Procedure": "XXXXX",
  "field4": "",
  "field5": "2B",
  *insert here some more irrelevant data*
}
Table 2:
{
  "_id": "table2_0555444777",
  "Procedure": "YYYYYYYYYYY",
  "field4": "",
  "field5": "2B",
  *insert here some more irrelevant data*
}
On the first call (dbLoad(Table1)), all docs are loaded and the _rev attributes are created for the first table.
When I make another call (dbLoad(Table2)), using Table2.json, the docs conflict: the new file lacks the _rev attribute, and when Pouch creates it, the revisions turn out to be the same!
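For what it's worth, one possible workaround (a sketch only, untested; allDocs with the keys option is standard PouchDB API) would be to look up the current _rev for each incoming _id before calling bulkDocs, so existing docs get updated instead of conflicting:

static async dbLoad(table) {
  const db = new DbService()._db;
  const docs = TableEnumerator(table);
  // Fetch the stored revision (if any) for every incoming _id.
  const existing = await db.allDocs({ keys: docs.map(d => d._id) });
  existing.rows.forEach((row, i) => {
    if (!row.error) {
      docs[i]._rev = row.value.rev; // reuse the revision so bulkDocs updates
    }
  });
  try {
    await db.bulkDocs(docs);
  } catch (e) {
    console.error(e);
  }
}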
Edit 2:
I've tried to modify my code to this:
import PDBLoad from 'pouchdb-load';

PouchDB.plugin(PDBLoad);

static async dbLoad(table) {
  const db = new PouchDB(table);
  try {
    await db.get('_local/preloaded')
      .catch(function (err) {
        if (err.name !== 'not_found') {
          throw err;
        }
        // We got a 404, so the local document doesn't exist: let's preload!
        return db.load('table_name.json').then(function () {
          // Create the local document to note that we've preloaded.
          return db.put({ _id: '_local/preloaded' });
        });
      })
      .then(function () {
        return db.allDocs({ include_docs: true });
      });
  } catch (e) {
    console.error(e);
  }
}
The .json file was in the same directory as the load function, but it doesn't load the data. (Judging by the pouchdb-load docs, db.load() fetches the file as a URL relative to the served page and expects a dump file produced with pouchdb-dump-cli, not a plain JSON array, which may be why this fails.)
So I'm sticking with the bulkDocs version.
I'm sorry for the very long post; after all this contextualization, here come the questions.
I hope everything is well explained, and I'll be happy to clarify anything you might want to ask.
Thank you for your time!
Edit 3:
Okay, so I was able to solve my problem with code like this:
import PouchDB from 'pouchdb';
import PDBFind from 'pouchdb-find';
import PDBSilverLining from 'pouchdb-silverlining';
import TableEnumerator from '../utils/tableEnumerator';

PouchDB.plugin(PDBFind);
PouchDB.plugin(PDBSilverLining);

class DbService {
  static async dbLoad(table) {
    // One PouchDB database per table, named after the table
    const db = new PouchDB(table);
    try {
      await db.bulkDocs(TableEnumerator(table));
    } catch (e) {
      console.error(e);
    }
    return db;
  }

  static async query(query, dbReference) {
    // .sql() is provided by the pouchdb-silverlining plugin
    return dbReference.sql(query);
  }
}

export default DbService;
TableEnumerator is a switch statement that takes the table name and returns that table's docs for insertion into IndexedDB (a sketch follows). For every .json file I'm creating a new database, and to run the queries, since I'm using React, I'm saving the reference to the DB in the component's state.
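As a sketch, TableEnumerator is essentially this (simplified; the file paths are illustrative, and the real util has one case per .json file):

import Table1 from '../../db/table1.json';
import Table2 from '../../db/table2.json';

// Maps a table name to the array of docs from that table's .json file.
export default function TableEnumerator(table) {
  switch (table) {
    case 'table1':
      return Table1;
    case 'table2':
      return Table2;
    // ...one case for each of the 19 .json files
    default:
      return [];
  }
}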
I think there is a lack of optimization here, as every time I switch between tables I'm calling the function that uses bulkDocs. Maybe I should check whether the tables are already inserted into IndexedDB, as in the sketch below.
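That check could look something like this (untested sketch; db.info() and its doc_count field are standard PouchDB API):

static async dbLoad(table) {
  const db = new PouchDB(table);
  const info = await db.info();
  // Only bulk-insert when this table's database is still empty.
  if (info.doc_count === 0) {
    try {
      await db.bulkDocs(TableEnumerator(table));
    } catch (e) {
      console.error(e);
    }
  }
  return db;
}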
For future readers, this is how you can solve the problem.
Start from the answers to the question linked above and run the sqlite3 commands accordingly, outputting the .json files.
When issuing the SELECT, if any column name contains special characters, like blank spaces or dots, wrap the column name in the square brackets operator [ ]:
sqlite3 example ".output rslt.json" "SELECT '{ \"id\": \"someTable' || SUBSTR(\"000000000\" || id, LENGTH(\"000000000\" || id) - 8, 9) || '\", \"anattr\": \"' || anattr || '\", \"anothe.rattr\": \"' || [anoth.erattr] || '\" },' FROM some_table;";
A problem that happened to me: some table rows contained double quotes, like this: "Attr": "Lorem ipsum "dolor" ". This causes a parsing error. To fix it, put the escape character \ before the quotes: "Lorem ipsum \"dolor\" ". That takes care of the double quotes.
You can easily use a regex to match these occurrences: searching for "dolor" will find the double quotes with your desired word within. For a quick fix, replace all occurrences with \"dolor\"; just check which files you are modifying before doing it!
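Also note that the rslt.json produced by the SELECT above is a sequence of objects each ending in a comma, not a valid JSON array yet. I fixed mine by hand, but a small Node script can do it (a sketch; the file names are the ones from the command above):

const fs = require('fs');

const raw = fs.readFileSync('rslt.json', 'utf8').trim();
// Drop the comma after the last row and wrap everything in [ ].
const fixed = '[' + raw.replace(/,\s*$/, '') + ']';
JSON.parse(fixed); // throws here if any escaping problems remain
fs.writeFileSync('some_table.json', fixed);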
As no one suggested anything else, I'm answering my own question with my last update (the DbService code in Edit 3 above).
This creates a DATABASE for each .json file! Remember, this is not optimal! I will be working on an optimized approach for this.
Thanks everyone, and thanks Martin for your help!