One should differentiate between the two basic usage scenarios:
Online. That is, the image is dynamically generated when a user views a particular web page. The time it takes to generate the image therefore adds to the delay the user experiences when loading the page. (The library supports a caching mechanism to reduce the number of times an image has to be generated; see Efficient graph generation using the built-in cache subsystem for a thorough discussion.) In this scenario the images should be kept as basic as possible in order to keep latency low.
In practice this means that the number of data points to visualize should be kept on the order of hundreds rather than thousands. Later sections discuss in detail what can be done to improve the performance of the library.
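The caching mechanism mentioned above can be enabled directly through the Graph constructor, which accepts a cache file name and a timeout in minutes. A minimal sketch is shown below; the file name `mygraph.png`, the 10-minute timeout, and the sample data are illustrative, and caching itself must be enabled in the library's configuration (the `USE_CACHE` setting) for this to take effect.

```php
<?php
require_once 'jpgraph/jpgraph.php';
require_once 'jpgraph/jpgraph_line.php';

// If 'mygraph.png' exists in the cache directory and is younger than
// 10 minutes, JpGraph streams the cached image immediately and the
// script stops here, skipping all generation work below.
$graph = new Graph(400, 300, 'mygraph.png', 10);

// Otherwise the graph is generated as usual, stored in the cache,
// and sent to the browser.
$graph->SetScale('intlin');
$data = array(12, 8, 19, 3, 10);      // illustrative sample data
$lineplot = new LinePlot($data);
$graph->Add($lineplot);
$graph->Stroke();
```

With this setup the expensive generation code only runs when the cached copy is missing or stale, which directly reduces the per-request latency discussed above.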
Offline. That is, the images are generated by some batch process (possibly command-line based). In this scenario delay is not an issue, so one can create much more complicated images and process many more data points. Even though the library itself imposes no restriction on the number of data points, the memory and execution-time limits set for PHP will.
In practice, if you need to produce images larger than 2000x2000 pixels from around 500,000 data points, it is probably better to use a tool other than a PHP script (unless you are prepared to grant PHP a few hundred MB of memory).
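For such batch runs the PHP limits can be raised per invocation with the CLI's `-d` flag, without touching the global `php.ini`. The script name and the `256M` value below are illustrative; tune them to your data set. (Note that the CLI already defaults `max_execution_time` to 0, i.e. unlimited, so only the memory limit usually needs attention.)

```shell
# Raise PHP's memory limit for a one-off batch graph-generation run.
# 'make_graphs.php' is a placeholder for your own batch script.
php -d memory_limit=256M make_graphs.php
```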