(PHP 4 >= 4.3.0, PHP 5)
file_get_contents — Reads an entire file into a string
This function is similar to file(), except that file_get_contents() returns the file as a string, starting at the specified offset and reading up to maxlen bytes. On failure, file_get_contents() returns FALSE.
file_get_contents() is the preferred way to read the contents of a file into a string. It will use memory-mapping techniques, if supported by the operating system, to enhance performance.
Note:
If you're opening a URI with special characters, such as spaces, you need to encode the URI with urlencode().
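For instance, a minimal sketch of what this note means (the file name here is hypothetical; urlencode() targets query strings, while rawurlencode() is the variant for path segments):

```php
<?php
// A hypothetical remote file name containing a space
$name = 'my report.txt';

// urlencode() encodes the space as '+', suitable for query strings;
// rawurlencode() encodes it as '%20', suitable for path segments
echo urlencode($name), "\n";    // my+report.txt
echo rawurlencode($name), "\n"; // my%20report.txt

// Encode only the segment with special characters, not the whole URI
$uri = 'http://www.example.com/docs/' . rawurlencode($name);
?>
```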
Name of the file to read.
Note:
As of PHP 5, FILE_USE_INCLUDE_PATH can be used to trigger a search in the include path.
A valid context resource created with stream_context_create(). If you don't need a custom context, you can skip this parameter by passing NULL.
The offset where the reading starts on the original stream.
Maximum length of data to read. The default is to read until the end of the file is reached. Note that this parameter is applied to the stream processed by the filters.
The function returns the read data, or FALSE on failure.
Example #1 Get and output the source of the homepage of a website
<?php
$homepage = file_get_contents('http://www.example.com/');
echo $homepage;
?>
Example #2 Searching within the include_path
<?php
// <= PHP 5
$file = file_get_contents('./leute.txt', true);
// > PHP 5
$file = file_get_contents('./leute.txt', FILE_USE_INCLUDE_PATH);
?>
Example #3 Reading a section of a file
<?php
// Read 14 characters starting from the 21st character
$section = file_get_contents('./leute.txt', NULL, NULL, 20, 14);
var_dump($section);
?>
The above example will output something similar to:
string(14) "lle Bjori Ro"
Example #4 Using stream contexts
<?php
// Create a stream context
$opts = array(
    'http' => array(
        'method' => "GET",
        'header' => "Accept-language: en\r\n" .
                    "Cookie: foo=bar\r\n"
    )
);
$context = stream_context_create($opts);
// Open the file using the HTTP headers set above
$file = file_get_contents('http://www.example.com/', false, $context);
?>
Version | Description |
---|---|
5.1.0 | Added the offset and maxlen parameters. |
5.0.0 | Added context support. |
Note: This function is binary-safe.
A URL can be used as a filename with this function if the fopen wrappers have been enabled. See fopen() for more details on how to specify the filename. See Supported Protocols and Wrappers for a list of supported URL protocols, the capabilities of the various wrappers, notes on their usage, and information on any predefined variables they may provide.
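As a sanity check before relying on URL reads, you can verify at runtime that the wrappers you need are available. A sketch (the messages are arbitrary):

```php
<?php
// allow_url_fopen is a php.ini setting; it cannot be enabled at runtime
if (!ini_get('allow_url_fopen')) {
    die("allow_url_fopen is disabled; file_get_contents() cannot open URLs\n");
}

// stream_get_wrappers() lists the protocols this PHP build understands
if (!in_array('https', stream_get_wrappers(), true)) {
    // The https:// wrapper requires the openssl extension
    echo "https:// wrapper not available\n";
}
?>
```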
When using SSL together with Microsoft IIS, the web server violates the protocol by closing the connection without sending a close_notify indicator. PHP reports this as "SSL: Fatal Protocol Error" when you reach the end of the data. To work around this, lower your error_reporting level to exclude warnings. PHP 4.3.7 and later can detect the buggy IIS server software when you open the stream using the https:// wrapper and will suppress the warning for you. When using fsockopen() to open an ssl:// socket, you are responsible for detecting and suppressing this warning yourself.
My answer to the previous question:
file_get_contents() can only do GET requests, as you can see.
If you want to POST, the easiest solution would be the curl library:
http://php.net/manual/en/book.curl.php
Hi! How can I use file_get_contents() to send a $_FILES variable to a site through the POST method?
If that's not possible, which function could do that?
$header = file_get_contents('http://www.example.com/faq.jsp');
echo $header;
Fails with 500 Internal Server Error.
$opts = array('http' => array('header' => "User-Agent: MyAgent/1.0\r\n"));
$context = stream_context_create($opts);
$header = file_get_contents('http://www.example.com/faq.jsp', false, $context);
echo $header;
Works!
For those having this problem when trying file_get_contents(url):
Warning: file_get_contents(url): failed to open stream: HTTP request failed! in xx on line yy
If you are behind a SonicWall firewall, read this:
https://bugs.php.net/bug.php?id=40197
(this little line: uncheck a box in the internal settings of the firewall labeled "Enforce Host Tag Search with for CFS")
Apparently by default SonicWall blocks any HTTP request without a "Host:" header, which was the case in the PHP file_get_contents(url) implementation.
This is why, if you try to get the same URL from the same machine with cURL or wget, it works.
I hope this will be useful to someone, it took me hours to find out :)
I experienced a problem using hostnames instead of a straight IP with some server destinations.
If I use file_get_contents("www.jbossServer.example/app1",...)
I will get an 'Invalid hostname' error from the server I'm calling.
This is probably because file_get_contents rewrites your request after resolving the IP, producing the same thing as:
file_get_contents("xxx.yyy.www.zzz/app1",...)
And you know that many servers will deny you access if you address them by IP in the request.
With cURL this problem doesn't exist. It resolves the hostname but leaves the request as you set it, so the server is not rude in response.
If you want to insert tracking scripts into your shopping system, and some of those scripts don't support intelligent detection of HTTPS, here is a script I put on the server that rewrites 'http' to 'https' in the fetched content, assuming everything is UTF-8 encoded (as a fallback it issues a redirect).
It is important that the HTTPS source DOES exist!
<?php
function file_get_contents_utf8($fn) {
    $opts = array(
        'http' => array(
            'method' => "GET",
            'header' => "Content-Type: text/html; charset=utf-8"
        )
    );
    $context = stream_context_create($opts);
    $result = @file_get_contents($fn, false, $context);
    return $result;
}

header("Content-Type: text/html; charset=utf-8");
$tPath = "URL YOU WANT TO MODIFY";
$result = file_get_contents_utf8("http://" . $tPath);
if ($result === false) {
    header("Location: https://" . $tPath); // fallback
    exit();
} else {
    // Replace the scheme only, so existing "https" occurrences are not mangled
    echo mb_ereg_replace("http://", "https://", $result);
}
?>
If you are getting a "failed to open stream" message on your Windows machine, check your hosts file. The line
127.0.0.1 localhost
must be in it, and the IPv6 line must be commented out:
# ::1 localhost
At least as of PHP 5.3, file_get_contents no longer uses memory mapping.
See comments on this bug report:
http://bugs.php.net/bug.php?id=52802
Here is another (maybe the easiest) way of doing POST HTTP requests from PHP using its built-in capabilities. Feel free to add the headers you need (notably the Host: header) to further customize the request.
Note: this method does not allow file uploads. If you want to upload a file with your request, you will need to modify the context parameters to provide multipart/form-data encoding (check out http://www.php.net/manual/en/context.http.php ) and build the $data_url following the guidelines on http://www.w3.org/TR/html401/interact/forms.html#h-17.13.4.2
<?php
/**
 * Make an HTTP POST request and return the response content and headers.
 *
 * @param string $url  URL of the requested script
 * @param array  $data hash array of request variables
 * @return array hash array with response content and headers in the following form:
 *               array('content' => '<html></html>',
 *                     'headers' => array('HTTP/1.1 200 OK', 'Connection: close', ...))
 */
function http_post($url, $data)
{
    $data_url = http_build_query($data);
    $data_len = strlen($data_url);
    $context = stream_context_create(array('http' => array(
        'method'  => 'POST',
        // Content-Type is needed so the server parses the body as form data
        'header'  => "Connection: close\r\n"
                   . "Content-Type: application/x-www-form-urlencoded\r\n"
                   . "Content-Length: $data_len\r\n",
        'content' => $data_url,
    )));
    $content = file_get_contents($url, false, $context);
    // $http_response_header is populated in this scope after the request
    return array('content' => $content, 'headers' => $http_response_header);
}
?>
Read text line by line and convert it to an array.
For example, the input file is input.txt, containing the text below:
one
two
three
four
five
++++++++++++++++++++++++++
Read one value per line:
<?php
$data = file_get_contents("input.txt"); // read the file
$convert = explode("\n", $data); // create an array, separated by newline
for ($i = 0; $i < count($convert); $i++) {
    echo $convert[$i] . ', '; // write value by index
}
?>
++++++++++++++++++++++++++
Output :
one, two, three, four, five,
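Note that file() already does the line splitting for you; a sketch of the same output (the flags assume PHP 5):

```php
<?php
// file() returns the lines of the file as an array;
// FILE_IGNORE_NEW_LINES strips the trailing newline from each element
// and FILE_SKIP_EMPTY_LINES drops blank lines
$lines = file("input.txt", FILE_IGNORE_NEW_LINES | FILE_SKIP_EMPTY_LINES);
echo implode(', ', $lines);
?>
```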
For those who need to know the default of $maxlen: it's defined in the PHP source code as PHP_STREAM_COPY_ALL, and because that is not available to us mere mortal users, that constant is defined as ((size_t)-1), i.e. -1.
When using a URI with a login / password (HTTP or FTP, for an example), you may need to urlencode the password if it contains special characters.
Do not urlencode the whole URI, just the password.
Don't do :
urlencode('ftp://login:mdp%?special@host/dir/file')
Do :
'ftp://login:' . urlencode('mdp%?special') . '@host/dir/file';
Might seem obvious, but is worth noting.
If the file you are working with is bigger than 64 KB and you are getting a deadlock, your pipe buffer has overflowed. Here are two ways to avoid that.
1) Use temporary files for the descriptors:
<?php
$descriptorspec = array(
    0 => array("file", "/tmp/ens/a.ens", "r"),           // stdin is a file the child will read from
    1 => array("file", "/tmp/ens/a.html", "w"),          // stdout is a file the child will write to
    2 => array("file", "/tmp/ens/error-output.txt", "a") // stderr is a file to append to
);
?>
2) Inline reads using stream_set_blocking(). PHP doesn't properly handle the last part of the file otherwise.
<?php
$READ_LEN = 64 * 1024;
$MAX_BUF_LEN = 2 * $READ_LEN;
$url = "http://some.domain.com:5984/" . $db . "/" . $member . "/contents";
$src = fopen($url, "r");
$cwd = '/tmp';
$cmd['enscript'] = "/usr/bin/enscript";
$cmd['enscript-options'] = " -q --language=html --color -Ejcl -o -";
$descriptorspec = array(
    0 => array("pipe", "r"), // stdin is a pipe that the child will read from
    1 => array("pipe", "w")  // stdout is a pipe that the child will write to
);
$ph = proc_open($cmd['enscript'] . " " . $cmd['enscript-options'], $descriptorspec, $pipes, $cwd);
stream_set_blocking($src, 0);
stream_set_blocking($pipes[0], 0);
stream_set_blocking($pipes[1], 0);
$CMD_OUT_OPEN = TRUE;
$input = ''; // buffer must be initialized before the first concatenation
$k = 0;
while (!feof($pipes[1]) || !feof($src) || $k > 0) {
    if (!feof($src) && $k + $READ_LEN <= $MAX_BUF_LEN) {
        $input .= fread($src, $READ_LEN);
        $k = strlen($input);
    }
    if ($k > 0) {
        $l = fwrite($pipes[0], $input);
        $k -= $l;
        $input = substr($input, $l);
    }
    if ($CMD_OUT_OPEN && $k == 0 && feof($src)) {
        fclose($pipes[0]);
        $CMD_OUT_OPEN = FALSE;
    }
    $output = fread($pipes[1], $READ_LEN);
    $outputn = str_replace("<H1>(stdin)</H1>", "", $output);
    echo $outputn;
}
fclose($pipes[1]);
$return_value = proc_close($ph);
?>
The offset is 0-based. Setting it to 1 will skip the first character of the stream.
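A small sketch of that, using a throwaway temporary file:

```php
<?php
// Write a short sample file to a temporary path
$path = tempnam(sys_get_temp_dir(), 'fgc');
file_put_contents($path, 'abcdef');

// offset 0 reads from the very first byte; offset 1 skips one character
var_dump(file_get_contents($path, false, null, 0)); // string(6) "abcdef"
var_dump(file_get_contents($path, false, null, 1)); // string(5) "bcdef"

unlink($path);
?>
```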
Recently I experienced unexpected behaviour (for me) of file_get_contents('https://...').
The script fetches images, and the file_get_contents('https://...') call is inside a loop.
After several iterations, Apache with FastCGI timed out with an Internal Server Error after ~30 seconds, even though max_execution_time was set to 0. With an fopen()/fread() pair the result was the same.
In a different environment (Apache with mod_php) the same script worked without problems.
I solved this by changing the https requests to http.
If your file_get_contents freezes during several seconds, here is maybe your answer:
Beware that the default keepalive timeout of Apache 2.0 httpd is 15 seconds. This is true for HTTP/1.1 connections, which is not the default behavior of file_get_contents but you can force it, especially if you are trying to act as a web browser. I don't know if this is also the case for HTTP/1.0 connections.
Forcing the server to close the connection would make you gain those 15 seconds in your script:
<?php
$context = stream_context_create(array('http' => array('header' => 'Connection: close')));
// The context must actually be passed to file_get_contents()
$content = file_get_contents("http://www.example.com/test.html", false, $context);
?>
Another way of resolving slowness issues is to use cURL or fsockopen. Bear in mind that contrary to the behavior of web browsers, file_get_contents doesn't return the result when the web page is fully downloaded (i.e. HTTP payload length = value of the response HTTP "Content-Length" header) but when the TCP connection is closed.
I hope this behavior will change in future releases of PHP.
This has been experienced with PHP 5.3.3.
If you want to check whether the function returned an error, for example on an HTTP request, it's not sufficient to test the result with a loose comparison: the HTTP response body may be empty, which also evaluates to false. Use a strict comparison against FALSE instead:
<?php
$result=file_get_contents("http://www.example.com");
if ($result === false)
{
// treat error
} else {
// handle good case
}
?>
[EDIT BY thiago: Has enhancements from an anonymous user]
Sometimes you might get an error opening an HTTP URL even though you have set "allow_url_fopen = On" in php.ini.
For me the solution was to also set "user_agent" to something.
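Both ways of setting a user agent are sketched below (the agent string and URL are placeholders):

```php
<?php
// Option 1: the setting the http:// wrapper reads by default
ini_set('user_agent', 'MyScript/1.0');

// Option 2: per request, via a stream context header
$context = stream_context_create(array(
    'http' => array('header' => "User-Agent: MyScript/1.0\r\n"),
));
$page = file_get_contents('http://www.example.com/', false, $context);
?>
```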
On CentOS 5, and maybe other Red Hat based systems, any attempt to use file_get_contents to access a URL on an HTTP port other than 80 (e.g. "http://www.example.com:8040/page") may fail with a permissions violation (error 13) unless the box you are running PHP on has SELinux set to 'permissive', not 'enforcing'. Otherwise the request doesn't even get out of the box, i.e. the permissions violation is generated locally by SELinux.
In my dev environment with a relatively low-speed drive (standard SATA 7200 RPM), reading a 25 MB zip file 10 times...
<?php
$data = `cat /tmp/test.zip`;
// 1.05 seconds
$fh = fopen('/tmp/test.zip', 'r');
$data = fread($fh, filesize('/tmp/test.zip'));
fclose($fh);
// 1.31 seconds
$data = file_get_contents('/tmp/test.zip');
// 1.33 seconds
?>
However, on a 21k text file running 100 iterations...
<?php
$data = `cat /tmp/test.txt`;
// 1.98 seconds
$fh = fopen('/tmp/test.txt', 'r');
$data = fread($fh, filesize('/tmp/test.txt'));
fclose($fh);
// 0.00082 seconds
$data = file_get_contents('/tmp/test.txt');
// 0.0069 seconds
?>
Despite the comment above about file_get_contents being faster due to memory mapping, file_get_contents is slowest in both of the above examples. If you need the best performance out of your production box, you might want to throw together a script to check which method is fastest for which file sizes on that particular machine, then optimize your code to check the file size and use the appropriate function for it.
A UTF-8 issue I've encountered is that of reading a URL with a non-UTF-8 encoding that is later displayed improperly since file_get_contents() related to it as UTF-8. This small function should show you how to address this issue:
<?php
function file_get_contents_utf8($fn) {
$content = file_get_contents($fn);
return mb_convert_encoding($content, 'UTF-8',
mb_detect_encoding($content, 'UTF-8, ISO-8859-1', true));
}
?>
If $filename has a relative path, file_get_contents returns the uninterpreted source code of the PHP file, with all comments etc.
I don't know whether this is a bug, intended, or caused by server configuration.
I think this behaviour should be included in the description of the function.
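One way to sidestep relative-path surprises like this is to anchor the path on the script's own directory (a sketch; config.txt is a made-up file name):

```php
<?php
// dirname(__FILE__) is the directory of the current script (__DIR__ in PHP 5.3+),
// so the read no longer depends on the current working directory
$data = file_get_contents(dirname(__FILE__) . '/config.txt');
?>
```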
I recently upgraded my server to Slackware 12.0.
After this, a program of mine stopped working: the call to file_get_contents (to a URL served by a custom HTTP server) was returning false without generating any error!
After some investigation I saw this: my custom HTTP server closes the connection at the end of the content. This (without the header "Connection: close") seems to cause the problem I described.
To solve the problem I simply added that header to the response of my custom HTTP server.
Setting the timeout properly without messing with ini values:
<?php
$ctx = stream_context_create(array(
    'http' => array(
        'timeout' => 1 // seconds
    )
));
file_get_contents("http://example.com/", false, $ctx);
?>
This is a nice and simple substitute for file_get_contents() using cURL; it returns FALSE if $contents is empty.
<?php
function curl_get_file_contents($URL)
{
$c = curl_init();
curl_setopt($c, CURLOPT_RETURNTRANSFER, 1);
curl_setopt($c, CURLOPT_URL, $URL);
$contents = curl_exec($c);
curl_close($c);
if ($contents) return $contents;
else return FALSE;
}
?>
Hope this help, if there is something wrong or something you don't understand let me know :)
I decided to make a function similar to this one, called file_post_contents; it uses POST instead of GET, kinda handy...
<?php
function file_post_contents($url, $headers = false) {
    $url = parse_url($url);
    if (!isset($url['port'])) {
        if ($url['scheme'] == 'http') { $url['port'] = 80; }
        elseif ($url['scheme'] == 'https') { $url['port'] = 443; }
    }
    $url['query'] = isset($url['query']) ? $url['query'] : '';
    $url['protocol'] = $url['scheme'] . '://';
    $eol = "\r\n";
    // Build the raw request in its own variable, so the $headers flag
    // parameter is not clobbered and still works further down
    $request = "POST " . $url['protocol'] . $url['host'] . $url['path'] . " HTTP/1.0" . $eol .
        "Host: " . $url['host'] . $eol .
        "Referer: " . $url['protocol'] . $url['host'] . $url['path'] . $eol .
        "Content-Type: application/x-www-form-urlencoded" . $eol .
        "Content-Length: " . strlen($url['query']) . $eol .
        $eol . $url['query'];
    $fp = fsockopen($url['host'], $url['port'], $errno, $errstr, 30);
    if ($fp) {
        fputs($fp, $request);
        $result = '';
        while (!feof($fp)) { $result .= fgets($fp, 128); }
        fclose($fp);
        if (!$headers) {
            // strip the response headers (non-greedy, up to the first blank line)
            $result = preg_replace("/^.*?\r\n\r\n/s", '', $result);
        }
        return $result;
    }
}
?>
It seems file_get_contents looks for the file inside the current working (executing) directory before looking in the include path, even with the FILE_USE_INCLUDE_PATH flag specified.
Same behavior as include, actually.
By the way, I feel the doc is not entirely clear on the exact order of inclusion (see include). It seems to say the include_path is the first location to be searched, but I have come across at least one case where the directory containing the including file was actually the first to be searched.
Drat.
If you're having problems with binary and hex data:
I had a problem when trying to read information from a ttf, which is primarily hex data. A binary-safe file read automatically replaces byte values with their corresponding ASCII characters, so I thought that I could use the binary string when I needed readable ASCII strings, and bin2hex() when I needed hex strings.
However, this became a problem when I tried to pass those ASCII strings into other functions (namely gd functions). var_dump showed that a 5-character string contained 10 characters, but they weren't visible. A binary-to-"normal" string conversion function didn't seem to exist and I didn't want to have to convert every single character in hex using chr().
I used unpack with "c*" as the format flag to see what was going on, and found that every other character was null data (ordinal 0). To solve it, I just did
str_replace(chr(0), "", $string);
which did the trick.
This took forever to figure out so I hope this helps people reading from hex data!
you'll find the http response headers in: $http_response_header
;o)
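A sketch of how that variable can be used after a request (the URL is a placeholder, and the exact headers depend on the server):

```php
<?php
$body = file_get_contents('http://www.example.com/');
// $http_response_header is created in the calling scope after the request
foreach ($http_response_header as $line) {
    echo $line, "\n"; // e.g. "HTTP/1.1 200 OK", "Content-Type: text/html", ...
}
?>
```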
[Editors note: As of PHP 5.2.1 you can specify `timeout` context option and pass the context to file_get_contents()]
The only way I could get file_get_contents() to wait for a very slow HTTP request was to set the socket timeout as follows:
ini_set('default_socket_timeout', 120);
$a = file_get_contents("http://abcxyz.com");
Other timeouts, like the execution time and input time settings, had no effect.
Use the previous example if you want to ask the server for a specific part of the content, if and only if the server accepts the method.
If you want a simple example that asks the server for all the content but only saves a portion of it, do it this way:
<?php
$content = file_get_contents("http://www.google.com", FALSE, NULL, 0, 20);
echo $content;
?>
This will echo the first 20 bytes of the google.com source code.
Bug #36857 has been fixed:
http://bugs.php.net/36857
Now you may use code like this to fetch partial content:
<?php
$opts = array('http' => array('header' => 'Range: bytes=1024-'));
$context = stream_context_create($opts);
$str = file_get_contents("http://www.fcicq.net/wp/", FALSE, $context);
?>
That's all.
If, like me, you are on a Microsoft network with ISA server and require NTLM authentication, certain applications will not get out of the network. SETI@Home Classic and PHP are just 2 of them.
The workaround is fairly simple.
First you need to use an NTLM Authentication Proxy Server. There is one written in Python and is available from http://apserver.sourceforge.net/. You will need Python from http://www.python.org/.
Both sites include excellent documentation.
Python works a bit like PHP. Human readable code is handled without having to produce a compiled version. You DO have the opportunity of compiling the code (from a .py file to a .pyc file).
Once compiled, I installed this as a service (instsrv and srvany - parts of the Windows Resource Kit), so when the server is turned on (not logged in), the Python based NTLM Authentication Proxy Server is running.
Then, and here is the bit I'm really interested in, you need to tell PHP you intend to route http/ftp requests through the NTLM APS.
To do this, you use contexts.
Here is an example.
<?php
// Define a context for HTTP.
$aContext = array(
'http' => array(
'proxy' => 'tcp://127.0.0.1:8080', // This needs to be the server and the port of the NTLM Authentication Proxy Server.
'request_fulluri' => True,
),
);
$cxContext = stream_context_create($aContext);
// Now all file stream functions can use this context.
$sFile = file_get_contents("http://www.php.net", False, $cxContext);
echo $sFile;
?>
Hopefully this helps SOMEONE!!!
This functionality is now implemented in the PEAR package PHP_Compat.
More information about using this function without upgrading your version of PHP can be found at the link below:
http://pear.php.net/package/PHP_Compat