Max steps per query limit reached

Hi,

This is regarding the new Intuit Ocular software purchase.
When we try to find a flow from source to sink, we hit the error "Max steps per query limit reached".
Is there a way we can specify the maximum query limit in Ocular?

Thanks,
Tejas.

Hi,

In Ocular, the maximum number of steps performed per reachableBy query can be configured with config.maxStepsPerQuery = <number>.
After the configured number of steps, the algorithm stops and returns the results calculated so far.
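For example, in the Ocular shell (the value here is purely illustrative, not a recommendation):

config.maxStepsPerQuery = 100000

Any reachableBy query issued afterwards will stop after 100000 steps and return the partial results computed up to that point.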

Thanks Markus, that helped. However, now when I try to use cpg2sp, I still get the same error.

I am running the following command:
./cpg2sp.sh --cpg ./cpg.bin.zip --max-steps-per-query 10000000000 -o report.sp

It still spits out "Max steps per query reached limit of 10000. Stopped query."

Any suggestions?

Thanks,
Tejas.

I am glad that I could help you with the maxStepsPerQuery config value.

The error message is a little vague there. The limit it refers to can actually be configured with the cpg2sp command line parameter --max-call-steps-per-query.
It configures the step limit for individual call site resolution tasks. Call site resolution determines which actual methods could be invoked at a dynamically dispatched call site.
I advise you not to increase this value too much, because doing so will heavily increase the run time.
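Applied to your earlier command, that would look like this (the limit value is only an illustration; see the runtime warning above):

./cpg2sp.sh --cpg ./cpg.bin.zip --max-call-steps-per-query 50000 -o report.sp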

Hey Markus,

We were able to increase the maximum steps per query used by the automation with the --max-call-steps-per-query flag.
Now we are running into a different issue:

Exception in thread "main" java.lang.OutOfMemoryError: GC overhead limit exceeded

We tried the following config for a CPG of size 19 MB:

  1. export _JAVA_OPTS="-Xmx4G"
  2. --max-call-steps-per-query 100000000

For us, it is currently important to get the automation of security profiles up and running. Can you help us with the right config for a CPG of this size?

Thanks,
Tejas.

Hi Tejas,

If you run into the GC overhead limit, you need to give the process more RAM or analyse less code at once.
So far there are no command line options to reduce RAM consumption. In general, 4 GB of RAM is on the low end, and I suggest using 8 to 16 GB of RAM for an analysis run.
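For example (a sketch that assumes your launcher picks up _JAVA_OPTS, as in your post; the heap size is illustrative):

export _JAVA_OPTS="-Xmx16G"
./cpg2sp.sh --cpg ./cpg.bin.zip -o report.sp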

Regarding your second parameter, --max-call-steps-per-query 100000000:
This will almost certainly result in extremely long analysis times. I suggest you stick with the default at first and then increase it gradually to see the impact on runtime, as sketched below.
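For example, you could time a few runs with gradually increasing limits to see where the runtime becomes unacceptable (the values are illustrative only):

for limit in 10000 50000 250000; do
  time ./cpg2sp.sh --cpg ./cpg.bin.zip --max-call-steps-per-query "$limit" -o "report-$limit.sp"
done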
