In an effort to remove jQuery from my code, I tried to replace $.Deferred() with new Promise().
I noticed the usage is slightly different, and I'm still learning how it works.
Here is a simplified extract from my code:
function do_match(resolve, reject) {
    fetch( /* ... */ ).then(async function (response) {
        /* do some stuff */
        document.getElementById("match").insertAdjacentHTML('beforeend', '<div class="player"></div>');
        resolve("done");
    });
}
function do_myMarket() {
    var elements = document.querySelectorAll('.player');
    // here elements is sometimes empty...
}
const p1 = new Promise(do_match);
p1.then(do_myMarket, null);
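For context, since fetch already returns a promise, the new Promise wrapper isn't strictly needed; the same shape can be written by returning the fetch chain directly. A hedged sketch of that idea, with fetchLike as a stand-in for the real fetch call and the DOM insertion elided:

```javascript
// fetchLike is a stand-in for fetch (an assumption, not the real network call):
// it resolves to an object exposing text(), like a Response.
function fetchLike() {
  return Promise.resolve({
    text: () => Promise.resolve('<div class="player"></div>')
  });
}

// Return the chain itself instead of wrapping it in new Promise().
function do_match() {
  return fetchLike()
    .then(response => response.text())
    .then(html => {
      // insert html into the page here (DOM work omitted in this sketch)
      return 'done';
    });
}

do_match().then(result => console.log(result)); // logs "done"
```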
While I would have expected do_myMarket
to be called only after the promise is resolved, if the fetch is not fast enough, do_myMarket
can run before the elements are available in the page.
Setting breakpoints where elements
is empty and on resolve()
confirmed this behavior.
Am I missing something? Why does this happen?
After some reading suggested by @VLAZ and more testing, I found out it's because of the async
keyword on the anonymous function.
The promise p1
was resolved with the return value of the fetch
handler, which does not wait for completion because of the async
keyword, thus making resolve("done");
useless.
I tried it: the behavior is the same with or without the call to resolve.
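The behavior can be seen in isolation: calling an async function returns a pending promise as soon as the first await is reached, and the caller's code keeps running before the function body finishes. A minimal sketch (the names below are mine, not from the original code):

```javascript
const order = [];

async function slowWork() {
  order.push('start');                         // runs synchronously
  await new Promise(r => setTimeout(r, 10));   // first await: control returns to the caller
  order.push('done');                          // runs later, after the timeout
}

const p = slowWork();
order.push('after call');                      // reached before 'done'

p.then(() => {
  console.log(order.join(','));                // start,after call,done
});
```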
This comes from what I now think is a wacky example from MDN:
// Function to do an Ajax call
const doAjax = async () => {
    const response = await fetch('Ajax.php'); // Generate the Response object
    if (response.ok) {
        const jVal = await response.json(); // Get JSON value from the response body
        return Promise.resolve(jVal);
    } else {
        return Promise.reject('*** PHP file not found');
    }
};
// Call the function and output value or error message to console
doAjax().then(console.log).catch(console.log);
If I understood correctly, the above is an antipattern.
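One reason it reads as an antipattern: inside an async function, wrapping the return value in Promise.resolve() is redundant, because an async function already wraps whatever it returns in a promise. A small sketch:

```javascript
async function wrapped() {
  return Promise.resolve(42); // redundant wrapping
}

async function plain() {
  return 42;                  // equivalent: async wraps the value itself
}

Promise.all([wrapped(), plain()]).then(([a, b]) => {
  console.log(a === b);       // true — both resolve to the plain value 42
});
```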
The correct way is shown on the page dedicated to the .json() method:
function doAjax() {
    fetch( /* ... */ )
        .then(response => response.json())
        .then(data => {
            // ...
        })
        .catch(console.error);
}
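One detail worth noting: as written, doAjax doesn't return the chain, so callers can't sequence work after it the way the original code tried to with p1.then(...). A hedged sketch that returns the promise, with fetchJsonLike as a stand-in for the real fetch call:

```javascript
// fetchJsonLike is a stand-in for fetch (an assumption): it resolves to an
// object with ok and json(), like a Response.
function fetchJsonLike() {
  return Promise.resolve({
    ok: true,
    json: () => Promise.resolve({ value: 42 })
  });
}

function doAjax() {
  // Returning the chain lets callers .then()/await the result.
  return fetchJsonLike()
    .then(response => {
      if (!response.ok) throw new Error('request failed');
      return response.json();
    });
}

doAjax().then(data => console.log(data.value)); // logs 42
```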