
I was trying to figure out how much time the curl command given below takes.

curl -XGET --header 'Content-Type: application/json' http://localhost:9200/elastiknn100knew11/_search -d '{"query": {
   "elastiknn_nearest_neighbors": {  
            "field": "my-vec", 
            "vec": {   
              "values":[Array of Numbers]
            },
            "model": "lsh",                       
            "similarity": "l2",
            "candidates": 50
        }
  },
"fields": ["imageName"],
"_source": false
}'

So, I came up with a bash script.

#!/bin/bash
start=$(( $(gdate +%N) ))
echo $start

curl -XGET --header 'Content-Type: application/json' http://localhost:9200/elastiknn100knew11/_search -d '{"query": {
   "elastiknn_nearest_neighbors": {  
            "field": "my-vec", 
            "vec": {   
              "values":[Array of Numbers]
            },
            "model": "lsh",                       
            "similarity": "l2",
            "candidates": 50
        }
  },
"fields": ["imageName"],
"_source": false
}'
dur=$(( $(gdate +%N) -$start))
echo $dur

Although this gives me the difference in nanoseconds, there is a catch: %N only reports the nanosecond fraction of the current second, so if I run the command at 12:59:59.9DigitNanoSeconds and the query finishes at 1:00:00.9DigitNanoSeconds, the output comes out negative.
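
To make the failure mode concrete, here is a minimal sketch (the timestamps and the sleep are made-up illustration values) of why the subtraction goes negative whenever a second boundary is crossed:

#!/bin/bash
# gdate +%N is only the nanosecond part of the *current* second,
# so it resets to zero at every second boundary.
start=$(gdate +%N)        # e.g. taken at 12:59:59.950000000 -> 950000000
sleep 0.2                 # crosses into the next second
end=$(gdate +%N)          # e.g. taken at 01:00:00.150000000 -> 150000000
echo $(( end - start ))   # prints -800000000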

I tried this situation out and did indeed get a negative time. So, is there any alternative, or something that could be done to handle this scenario?

Abby
  • Are there any reasons you don't use the [`time`](https://www.gnu.org/software/time/) utility? – Cyber Tailor Jul 28 '22 at 19:48
  • I have to write multiple curl commands in one single bash file and then get the total time for each curl command. If I use the time utility, it gives me the total time to execute all the curl commands, which isn't what is required in this particular scenario. – Abby Jul 28 '22 at 19:54
  • What's wrong with `time curl`? You can time each individual command instead of the whole script if you want to. – tjm3772 Jul 28 '22 at 20:20
  • Yes, but in my scenario it is mandatory to use a script instead of individual commands, as all the queries will be inside a loop; that is why I can't execute the individual commands. – Abby Jul 28 '22 at 20:28
  • What granularity do you need? Try `echo $SECONDS; sleep 3; echo $SECONDS` – Mark Setchell Jul 28 '22 at 20:47
  • @Abby there is no problem with timing a loop: `time for((i=0;i<10;i++));do sleep .5;done` – Léa Gris Jul 28 '22 at 21:49
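
For reference, a minimal sketch of what the commenters suggest: `time` applied to each curl call individually inside a loop, rather than to the whole script (the loop and the query files below are placeholders, not part of the original setup):

    #!/bin/bash
    # Placeholder loop: each iteration sends one query and is timed on its own.
    for query_file in queries/*.json; do
        # bash's `time` keyword reports real/user/sys time for just this command
        time curl -s -XGET --header 'Content-Type: application/json' \
            http://localhost:9200/elastiknn100knew11/_search \
            -d "@$query_file" > /dev/null
    done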

1 Answer


@Author,

I tried testing the following script:
    #!/bin/bash
    # 10# forces base 10, so a nanosecond value with a leading zero (e.g. 084...)
    # is not rejected as an invalid octal number
    start=$(( 10#$(date +%N) ))
    curl -XGET --header 'Content-Type: application/json' http://localhost:9200/elastiknn100knew11/_search -d '{"query": {
     "elastiknn_nearest_neighbors": {
     "field": "my-vec",
     "vec": {
     "values":[Array of Numbers]
     },
     "model": "lsh",
     "similarity": "l2",
     "candidates": 50
     }
     },
    "fields": ["imageName"],
    "_source": false
    }'
    end=$(( 10#$(date +%N) ))
    # %N is only the nanosecond part of the current second, so it wraps to zero
    # every second; remember whether that happened before adjusting $end
    if [ "$end" -lt "$start" ]
    then
            rolled=1
            end=$(( end + 1000000000 ))
    else
            rolled=0
    fi
    dur=$(( end - start ))
    echo start $start
    if [ "$rolled" -eq 1 ]
    then
            echo duration $dur nanoseconds "(the second rolled over during the request)"
    else
            echo duration 0 second $dur nanoseconds
    fi

If you need to know the reason for the `10#` in `10#$(date +%N)`, see:
    Value too great for base (error token is "08")

Sample output:

    $ ./73158339.sh
    curl: (7) Failed to connect to localhost port 9200 after 2252 ms: Connection refused
    start 931500
    duration 0 second 413980700 nanoseconds
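
As a side note: if your `date`/`gdate` supports GNU's `%N` (which the script above already relies on), you can sidestep the roll-over handling entirely by reading a full epoch timestamp in nanoseconds with `%s%N`. A minimal sketch, reusing the same placeholder request body as above:

    #!/bin/bash
    # %s%N prints seconds since the epoch immediately followed by the 9-digit
    # nanosecond part, so the value keeps increasing and never wraps back to zero.
    start=$(date +%s%N)
    curl -XGET --header 'Content-Type: application/json' \
        http://localhost:9200/elastiknn100knew11/_search \
        -d '{"query": { ... same elastiknn_nearest_neighbors body as above ... }}'
    end=$(date +%s%N)
    echo "duration $(( end - start )) nanoseconds"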