Tags: node.js, 3d, geometry

I want to take a GLB file on a Node.js server and compress it using the draco3d library: https://www.npmjs.com/package/draco3d

When I log the input with console.log("inputGeometry =", inputGeometry); I get approximately the following data:

inputGeometry = Uint8Array(23606456) [
  103, 108,  84,  70,   2,   0,   0,   0, 184,  52, 104,   1,
   60,  39,   1,   0,  74,  83,  79,  78, 123,  34,  97,  99,
   99, 101, 115, 115, 111, 114, 115,  34,  58,  91, 123,  34,
   98, 117, 102, 102, 101, 114,  86, 105, 101, 119,  34,  58,
   50,  44,  34,  99, 111, 109, 112, 111, 110, 101, 110, 116,
   84, 121, 112, 101,  34,  58,  53,  49,  50,  54,  44,  34,
   99, 111, 117, 110, 116,  34,  58,  52,  51,  53,  54,  44,
   34, 109,  97, 120,  34,  58,  91,  50,  54,  48,  46,  48,
   44,  49,  56,  50,
  ... 23606356 more items
]

Then console.log("encodedData = ", encodedData); prints: encodedData = 0

And in the end I get the error:

    node:buffer:328
        throw new ERR_INVALID_ARG_TYPE(
        ^

    TypeError [ERR_INVALID_ARG_TYPE]: The first argument must be of type string or an instance of Buffer, ArrayBuffer, or Array or an Array-like Object. Received type number (0)
        at new NodeError (node:internal/errors:393:5)
        at Function.from (node:buffer:328:9)
        at /Users/mac/Documents/_Models3d/server3d__2/server3d__2/controllers/rackTypeController.js:627:58
        at FSReqCallback.readFileAfterClose [as oncomplete] (node:internal/fs/read_file_context:68:3) {
      code: 'ERR_INVALID_ARG_TYPE'
    }

How can I get the Draco model in encodedData? What data needs to be passed to EncodeMeshToDracoBuffer?

Thank you very much in advance!

Here is my code:

          user = req.user.id;

          const filePath = "user" + user + "/model" + id;
          const filePathStatic = "static/" + filePath;

          if (req.files) {
            const { imgs } = req.files;
            const { glb } = req.files;

            if (glb) {
              const fileName = `model.glb`;
              const fileNameDraco = `model.drc`;

              try {
                const dirpath = path.resolve(__dirname, "..", filePathStatic);

                const pathGLB = path.resolve(dirpath, fileName);
                const pathDRC = path.resolve(dirpath, fileNameDraco);

                fs.mkdirSync(dirpath, { recursive: true });
                glb.mv(pathGLB);

                // model3d = filePath + "/" + fileName;
                model3d = filePath + "/" + fileNameDraco;

                /*********** DRACO ************** */
                const dracoEncoderModule = draco3d.createEncoderModule({});

                dracoEncoderModule
                  .then((module) => {
                    const encoder = new module.Encoder();

                    fs.readFile(pathGLB, (err, data) => {
                      if (err) {
                        console.log("ERROR ", err);
                      } else {
                        const inputBuffer = data;

                        const inputGeometry = new Uint8Array(inputBuffer);

                        console.log("inputGeometry =", inputGeometry);

                        const encodedData =
                          encoder.EncodeMeshToDracoBuffer(inputGeometry);

                        console.log("encodedData = ", encodedData);

                        fs.writeFileSync(pathDRC, Buffer.from(encodedData));
                      }
                    });
                  })
                  .catch((error) => {
                    console.error("Error dracoEncoder:", error);
                  });

                /************************* */
              } catch (e) {
                console.error("upload glb ERROR", e);
              }
            }
          }

Ultimately, I want to implement this scenario: a user uploads a model to the server, the server compresses it, and other users are served the already compressed model, which loads much faster.


Solution

  • The draco3d package understands low-level vertex data, but not complex file formats like glTF. In the code snippet above, it looks like you're assuming the contents of the entire GLB file are equivalent to a single mesh, when it may be a complete scene with many meshes and materials.

    One way to process that entire scene (or rather, each mesh within it) would be to use draco3d in combination with glTF Transform:

    import { NodeIO } from '@gltf-transform/core';
    import { KHRONOS_EXTENSIONS } from '@gltf-transform/extensions';
    import { draco } from '@gltf-transform/functions';
    import draco3d from 'draco3d';
    
    // Configure I/O.
    const io = new NodeIO()
      .registerExtensions(KHRONOS_EXTENSIONS)
      .registerDependencies({
        'draco3d.decoder': await draco3d.createDecoderModule(), // Optional.
        'draco3d.encoder': await draco3d.createEncoderModule(), // Optional.
      });
    
    // Read original file.
    const document = await io.read('uncompressed.glb');
    
    // Configure compression.
    await document.transform(draco({method: 'edgebreaker'}));
    
    // Write compressed file.
    await io.write('compressed.glb', document);
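
  • If you do want to call draco3d directly, note that EncodeMeshToDracoBuffer does not accept a byte array of a GLB file. It takes a module.Mesh built from raw vertex data, plus a module.DracoInt8Array to receive the output, and returns the encoded byte length (0 on failure, which is exactly the symptom in the question). A minimal sketch along the lines of the package's own Node example; the single-triangle geometry here is made up purely for illustration:

    ```javascript
    const draco3d = require('draco3d');

    async function encodeTriangle() {
      const encoderModule = await draco3d.createEncoderModule({});

      const encoder = new encoderModule.Encoder();
      const meshBuilder = new encoderModule.MeshBuilder();
      const mesh = new encoderModule.Mesh();

      // Raw geometry: three vertices (x, y, z each) forming one face.
      const positions = new Float32Array([0, 0, 0, 1, 0, 0, 0, 1, 0]);
      const indices = new Uint32Array([0, 1, 2]);

      meshBuilder.AddFacesToMesh(mesh, 1, indices);
      meshBuilder.AddFloatAttributeToMesh(
        mesh, encoderModule.POSITION, /* numVertices */ 3, /* components */ 3, positions);

      // EncodeMeshToDracoBuffer fills a DracoInt8Array and returns the byte
      // length, or 0 if encoding failed.
      const encodedData = new encoderModule.DracoInt8Array();
      const encodedLen = encoder.EncodeMeshToDracoBuffer(mesh, encodedData);

      // Copy into a Node Buffer before writing to disk.
      const buffer = Buffer.alloc(encodedLen);
      for (let i = 0; i < encodedLen; i++) buffer[i] = encodedData.GetValue(i);

      // Free the Emscripten-side objects.
      encoderModule.destroy(encodedData);
      encoderModule.destroy(mesh);
      encoderModule.destroy(meshBuilder);
      encoderModule.destroy(encoder);

      return buffer;
    }

    encodeTriangle().then((buf) => console.log('encoded bytes:', buf.length));
    ```

    So the encodedData = 0 in the question is the return value of a failed encode: Draco could not interpret the GLB bytes as a mesh. For GLB input, extracting each primitive's positions and indices yourself (or letting glTF Transform do it, as above) is the part that cannot be skipped.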