Is there a built-in solution to prevent duplicate Blob objects within different records in IndexedDB?
Say I have this schema for a music store: id, title, album, artwork, and I want to add two songs from the same album to this store (so they most likely share the same artwork asset). Is there a built-in way to automatically store the artwork only once?
What I've tried:
I tried to set a unique flag on the artwork index, but after checking the size of the database before and after inserting the second song (using chrome://settings/cookies), the artwork is stored twice.
I then tried to store the artworks in a separate store (with only id and artwork as the schema) with the same flag, but that didn't work either.
var database = new Dexie('TheDatabase');
database.version(1).stores({artworks: '++id, &artwork'});
var dbArt = database.artworks;
var artworkBlob = getBlob(image);
dbArt.put(artworkBlob);
//later:
dbArt.put(artworkBlob);
Am I misusing the unique flag in some way? Is it not supported for Blob objects?
Even though IndexedDB supports storing Blobs, it does not support indexing them. Indexable properties can only be of type string, number, Date, or Array&lt;string | number | Date&gt;. If a property has any other type, IndexedDB will just silently skip indexing that particular object.
Also, in your sample code, you are putting the blob by itself instead of putting a document containing the blob as a property.
So what you could do instead is compute a hash/digest of the blob content and store it as a string that you index with a unique index.
var dbArt = new Dexie('TheDatabase');
dbArt.version(1).stores({
artworks: `
++id,
title,
album,
&artworkDigest` // & = unique index of the digest
});
var artworkBlob = getBlob(image); // get it somehow...
// Now, compute hash before trying to put blob into DB
computeHash(artworkBlob).then(artworkDigest => {
// Put the blob along with its uniquely indexed digest
return dbArt.artworks.put({
title: theTitle,
album: theAlbum,
artwork: artworkBlob,
artworkDigest: artworkDigest
});
}).then(()=>{
console.log("Successfully stored the blob");
}).catch(error => {
// Second time you try to store the same blob, you'll
// end up here with a 'ConstraintError' since the digest
// will be same and conflict the uniqueness constraint.
console.error(`Failed to store the blob: ${error}`);
});
function computeHash(blob) {
  return new Promise((resolve, reject) => {
    // Read the blob into an ArrayBuffer
    var fileReader = new FileReader();
    fileReader.onload = () => resolve(fileReader.result);
    fileReader.onerror = () => reject(fileReader.error);
    fileReader.readAsArrayBuffer(blob);
  }).then(arrayBuffer => {
    // Compute the SHA-256 digest
    return crypto.subtle.digest("SHA-256", arrayBuffer);
  }).then(digest => {
    // Convert the digest's ArrayBuffer to a string (to make it indexable)
    return String.fromCharCode.apply(null, new Uint8Array(digest));
  });
}
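If you would rather reuse the already-stored artwork than treat the ConstraintError as a failure, you can look the digest up before inserting. This is a hypothetical sketch, not Dexie's documented pattern: `putArtworkOnce` and its parameters are made-up names, and `table` is assumed to behave like a Dexie table with a unique `artworkDigest` index.

```javascript
// Hypothetical helper (names are assumptions): store the blob only if no
// row with the same digest exists, otherwise return the existing row's id.
// `table` is assumed to support Dexie-style where().equals().first() and add().
async function putArtworkOnce(table, artworkBlob, artworkDigest) {
  const existing = await table.where('artworkDigest')
                              .equals(artworkDigest)
                              .first();
  if (existing !== undefined) {
    return existing.id; // duplicate content: reuse the stored artwork
  }
  return table.add({ artwork: artworkBlob, artworkDigest: artworkDigest });
}
```

Song records can then store the returned id instead of the blob itself, which is essentially what your separate-store attempt was aiming at.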
Side note: In IndexedDB 2.0, ArrayBuffers are indexable (but Blobs still are not). However, I would never recommend indexing such a large value in any database; it is still better to index the digest.