I recently came across an automatically generated connection string specifying "Pooling=False" and wondered why it was set up that way. As I understand it, pooling is almost always advantageous unless it is badly misconfigured.
Are there any reasons for disabling pooling? Does it depend on the OS, the physical connection, or the DBMS in use?
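For reference, the exact string isn't shown here, but in an ADO.NET-style connection string (the format where a `Pooling` keyword appears) the setting typically looks like this; server and database names are placeholders:

```
Server=myserver;Database=mydb;Integrated Security=true;Pooling=False
```

With `Pooling=False`, every `Open()` establishes a fresh physical connection instead of borrowing one from the process-wide pool.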
If it's a single-threaded app, pooling may be unnecessary. Was the app running on a resource-constrained device? Is startup time important to the application? These are some factors that might lead to the decision to turn off pooling.
In general, I think you're right that pooling is beneficial. If this is a typical web app, I would ask why it was disabled.
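The trade-off can be sketched with a minimal stdlib pool. This is a hypothetical illustration, not any driver's real implementation: `ConnectionPool` and `fake_connect` are made-up names, and the "connection" is a stand-in object rather than a real database handle.

```python
import queue

class ConnectionPool:
    """Toy pool: hands out pre-created 'connections' instead of making new ones."""

    def __init__(self, factory, size):
        self._q = queue.Queue(maxsize=size)
        for _ in range(size):
            # The creation cost is paid once, up front -- which is exactly
            # why pooling hurts startup time and holds resources when idle.
            self._q.put(factory())

    def acquire(self):
        return self._q.get()   # blocks if all connections are checked out

    def release(self, conn):
        self._q.put(conn)

# Stand-in for an expensive connection constructor (hypothetical).
created = 0
def fake_connect():
    global created
    created += 1
    return object()

pool = ConnectionPool(fake_connect, size=2)
for _ in range(100):           # 100 "requests" share the same 2 connections
    c = pool.acquire()
    pool.release(c)

print(created)  # → 2; without a pool, 100 requests would mean 100 connects
```

The sketch also shows the answer's two caveats: the pool pays its full connection cost at startup, and it keeps `size` connections alive even while idle, which is what you may want to avoid on a resource-constrained device.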