I want to render dynamic pages using getServerSideProps. In the future, the data will come from a database, but for development purposes I saved all the data locally in JSON files.
Note: I have about 13 GB of data spread across roughly 16,000 JSON files, with each file corresponding to one dynamic page (the largest file is about 7 MB).
Why does Next.js use so much memory (>4 GB) when fetching a single file? What is the recommended approach to generating dynamic pages from large amounts of JSON data?
Thank you for your help.
My project structure is as follows:
Data saved in /data/stocks/
Accessed from /pages/stocks/[ticker].js
The data is fetched with the following code:
export const getServerSideProps = async (context) => {
  const { ticker } = context.params;

  // read tickers from json
  const tickers = require('../../data/tickers.json');
  const id = tickers[ticker];

  const zeroPad = (num, places) => String(num).padStart(places, '0');
  const cik = 'CIK' + zeroPad(id, 10);

  // get data from json
  const data = require('../../data/' + cik + '.json');

  return {
    props: {
      cik: cik,
    },
  };
};
Output:
<--- Last few GCs --->
[27496:0000014E055DB500] 3518357 ms: Mark-sweep (reduce) 2038.9 (2075.1) -> 2038.7 (2073.3) MB, 288.3 / 0.0 ms (+ 49.7 ms in 13 steps since start of marking, biggest step 7.0 ms, walltime since start of marking 353 ms) (average mu = 0.434, current mu =
<--- JS stacktrace --->
FATAL ERROR: Reached heap limit Allocation failed - JavaScript heap out of memory
1: 00007FF64BA07A1F v8::internal::CodeObjectRegistry::~CodeObjectRegistry+114207
2: 00007FF64B996096 DSA_meth_get_flags+65542
3: 00007FF64B996F4D node::OnFatalError+301
4: 00007FF64C2CB2CE v8::Isolate::ReportExternalAllocationLimitReached+94
5: 00007FF64C2B58AD v8::SharedArrayBuffer::Externalize+781
6: 00007FF64C158C7C v8::internal::Heap::EphemeronKeyWriteBarrierFromCode+1468
7: 00007FF64C155D94 v8::internal::Heap::CollectGarbage+4244
8: 00007FF64C153710 v8::internal::Heap::AllocateExternalBackingStore+2000
9: 00007FF64C171420 v8::internal::FreeListManyCached::Reset+1408
10: 00007FF64C171AD5 v8::internal::Factory::AllocateRaw+37
11: 00007FF64C1871AB v8::internal::FactoryBase<v8::internal::Factory>::NewRawOneByteString+75
12: 00007FF64C17FABC v8::internal::Factory::NewStringFromUtf8+124
13: 00007FF64C2C7C0A v8::String::NewFromUtf8+202
14: 00007FF64B8AA9E1 v8::internal::OSROptimizedCodeCache::OSROptimizedCodeCache+33841
15: 00007FF64B9B16E7 v8::internal::Malloced::operator delete+3447
16: 00007FF64C285CA6 v8::internal::Builtins::code_handle+172790
17: 00007FF64C285899 v8::internal::Builtins::code_handle+171753
18: 00007FF64C285B5C v8::internal::Builtins::code_handle+172460
19: 00007FF64C2859C0 v8::internal::Builtins::code_handle+172048
20: 00007FF64C3590C1 v8::internal::SetupIsolateDelegate::SetupHeap+494673
21: 0000014E0735B74B
I used a simple workaround: serve the files from a small Flask backend on a different port and fetch the data just like any other JSON data.
Flask Backend:
from flask import Flask

app = Flask(__name__)

@app.route("/")
def index():
    return "<p>Hello World!</p>"

@app.route("/stocks/<cik>")
def data(cik):
    with open(f"python/stocks/{cik}.json") as f:
        data = f.read()
    return data

app.run(debug=True)
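Flask's development server listens on port 5000 by default, so each file is then available at http://localhost:5000/stocks/<CIK>.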
getServerSideProps:
export const getServerSideProps = async (context) => {
  const { ticker } = context.params;

  // read tickers from json
  const tickers = require('../../data/tickers.json');
  const id = tickers[ticker];

  const zeroPad = (num, places) => String(num).padStart(places, '0');
  const cik = 'CIK' + zeroPad(id, 10);

  // get data from localhost
  const res = await fetch(`http://localhost:5000/stocks/${cik}`);
  const data = await res.json();

  return {
    props: {
      cik: cik,
      data: data,
    },
  };
};
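A Node-only variant of this workaround is also possible (a minimal sketch, assuming the per-ticker files live under data/stocks/ relative to the project root): read the file with fs.promises.readFile inside getServerSideProps instead of require(), so the parsed JSON is not kept in the require() module cache between requests.

import path from 'path';
import { promises as fs } from 'fs';

export const getServerSideProps = async (context) => {
  const { ticker } = context.params;

  // read tickers from json (path is an assumption; adjust to the real location)
  const tickersPath = path.join(process.cwd(), 'data', 'tickers.json');
  const tickers = JSON.parse(await fs.readFile(tickersPath, 'utf8'));
  const id = tickers[ticker];

  const zeroPad = (num, places) => String(num).padStart(places, '0');
  const cik = 'CIK' + zeroPad(id, 10);

  // read the per-ticker file on each request instead of require()-ing it,
  // so it is not cached for the lifetime of the Node process
  const dataPath = path.join(process.cwd(), 'data', 'stocks', cik + '.json');
  const data = JSON.parse(await fs.readFile(dataPath, 'utf8'));

  return {
    props: {
      cik: cik,
      data: data,
    },
  };
};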