I'm trying to write code where a query is executed three times and the average execution time is displayed.
The error is:
Variable `times` not defined (line 14, column 6 (offset: 257))
"WITH times, i, end_time - start_time AS execution_time"
Can you help me please :)
Here's the code:
WITH []
AS times
UNWIND range(1,3) AS i
WITH times, i
WITH apoc.date.currentTimestamp() AS start_time
MATCH (a:Article)
WHERE a.appearsInJournal = "Journal of Science"
RETURN a.title, a.publicationDate
WITH apoc.date.currentTimestamp() AS end_time
WITH times, i, end_time - start_time AS execution_time
SET times = times + execution_time
WITH times, avg(times) AS avg_time
RETURN toFloat(avg_time)
[UPDATED]
A WITH clause will drop all existing variables, except for the ones that are explicitly carried forward in that WITH clause. Not all of your WITH clauses carry forward times. That explains your error message.
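For example, here is a minimal sketch of that rule (the variable names x and y are purely for illustration):

WITH 1 AS x
WITH x, 2 AS y // x survives because it is listed in this WITH
WITH y         // x is not listed, so it is dropped here
RETURN x       // fails with: Variable `x` not defined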
But your main question is about how to use Cypher to determine the average execution time for a Cypher query.
Here is one approach. It executes your query 5 times and returns the average execution time in milliseconds, ignoring the timing of the first execution (which is generally more expensive due to the extra overhead of preprocessing a Cypher query that has not been seen recently):
// The query to be timed is stored as a string
WITH '
  MATCH (a:Article)
  WHERE a.appearsInJournal = "Journal of Science"
  RETURN a.title, a.publicationDate
' AS query
// Execute it 5 times
UNWIND RANGE(1, 5) AS i
WITH i, query, datetime.realtime().epochMillis AS start
CALL apoc.cypher.doIt(query, NULL) YIELD value
// Aggregate per run: MAX collapses all result rows of one run into a single timing
WITH i, MAX(datetime.realtime().epochMillis - start) AS execTime
// Ignore the first (cold) run
SKIP 1
RETURN AVG(execTime)
This approach also includes the time needed to execute the apoc.cypher.doIt procedure itself, but that overhead seems to be under a millisecond. Also, this approach only seems to work if the query returns non-NULL results.
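As a side note, if the inner query needed parameters, apoc.cypher.doIt accepts a parameter map as its second argument. A sketch of that variant (the $journal parameter name is purely for illustration, and this assumes your APOC version supports map parameters there):

WITH '
  MATCH (a:Article)
  WHERE a.appearsInJournal = $journal
  RETURN a.title, a.publicationDate
' AS query
UNWIND RANGE(1, 5) AS i
WITH i, query, datetime.realtime().epochMillis AS start
CALL apoc.cypher.doIt(query, {journal: "Journal of Science"}) YIELD value
WITH i, MAX(datetime.realtime().epochMillis - start) AS execTime
SKIP 1
RETURN AVG(execTime)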