Tags: php, curl, include, fopen, file-get-contents

PHP: GET # of characters from a URL and then stop/exit?


When parsing large files on the internet, or simply fetching a website's Open Graph tags, is there a way to GET a webpage's first 1000 characters and then stop downloading anything else from the page?

When a file is several megabytes, it can take the server a while to parse, especially when working with many such files. Even more troublesome than bandwidth are CPU and RAM: files that are too large are difficult to work with in PHP because the server can run out of memory.

Here are some PHP functions that can open a webpage:

  1. fopen

  2. file_get_contents

  3. include

  4. fread

  5. url_get_contents

  6. curl_init

  7. curl_setopt

  8. parse_url

Can any of these be set to download a specific number of characters and then exit?
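
For context, the naive way to do this with the functions above is to download the entire document and only truncate it afterwards; a minimal sketch (the URL is just a placeholder):

    <?php
    // Downloads the whole page into memory before any of it can be inspected.
    $html = file_get_contents("http://www.example.com/");
    if ($html !== false) {
        echo substr($html, 0, 1000); // truncation happens only after the full download
    }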


Solution

  • Something like this?

    <?php
    // Open the URL as a stream (requires allow_url_fopen to be enabled).
    if ($handle = fopen("http://www.example.com/", "rb")) {
        // Read at most the first 8192 bytes, then stop and close the stream.
        echo fread($handle, 8192);
        fclose($handle);
    }
    

    Taken from the examples in the official php.net documentation.
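
  • Another option, if the cURL extension is available, is curl_setopt() with a CURLOPT_WRITEFUNCTION callback that aborts the transfer once enough data has arrived. The sketch below assumes a 1000-byte cutoff and a placeholder URL; the abort mechanism itself is standard cURL behaviour (returning a byte count that differs from the chunk size stops the download).

    <?php
    $url  = "http://www.example.com/";
    $body = "";

    $ch = curl_init($url);
    curl_setopt($ch, CURLOPT_WRITEFUNCTION, function ($ch, $chunk) use (&$body) {
        $body .= $chunk;
        if (strlen($body) >= 1000) {   // counts bytes, not multibyte characters
            return 0;                  // returning fewer bytes than received aborts the download
        }
        return strlen($chunk);         // otherwise keep going
    });
    curl_exec($ch);                    // returns false when the callback aborts; expected here
    curl_close($ch);

    echo substr($body, 0, 1000);

    A similar effect is possible with file_get_contents(), which accepts a $maxlen argument (file_get_contents($url, false, null, 0, 1000)), although the stream wrapper may read somewhat more than the limit before it stops.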