checking for T1lib support... no
checking whether to enable truetype string function in GD... no
checking whether to enable JIS-mapped Japanese font support in GD... no
checking for fabsf... yes
checking for floorf... yes
If configure fails try --with-jpeg-dir=<DIR>
checking for png_write_image in -lpng... yes
If configure fails try --with-xpm-dir=<DIR>
If configure fails try --with-freetype-dir=<DIR>
configure: error: GD build test failed. Please check the config.log for
details.
I think I'm not the only one experiencing this problem; take a look here:
[url]
Why doesn't PHP.net give much importance to this bug?
I'm trying to compile PHP with curl enabled, but I cannot get it to work.
Here is the relevant part of config.log:
....
configure:32740: checking whether to enable calendar conversion support
configure:32778: result: no
configure:33078: checking whether to enable ctype functions
configure:33116: result: yes
configure:33416: checking for cURL support
configure:33455: result: yes
configure:33464: checking if we should use cURL for url streams
configure:33482: result: no
configure:33492: checking for cURL in default path
configure:33497: result: found in /usr
configure:33515: checking for cURL 7.10.5 or greater
configure:33529: result: libcurl 7.15.5
configure:33538: checking for SSL support in libcurl
configure:33542: result: yes
configure:33560: checking how to run the C preprocessor
configure:33678: result: gcc -E
configure:33702: gcc -E conftest.c
configure:33708: $? = 0
configure:33740: gcc -E conftest.c
conftest.c:201:28: error: ac_nonexistent.h: No such file or directory
configure:33746: $? = 1
configure: failed program was:
| /* confdefs.h. */
|
| #define PACKAGE_NAME ""
| #define PACKAGE_TARNAME ""
| #define PACKAGE_VERSION ""
| #define PACKAGE_STRING ""
...
As you can see, it shows "ac_nonexistent.h: No such file or directory". I installed both the gcc and gcc-c++ packages, but it doesn't work.
I need to move one of my own websites, as the current shared host doesn't provide curl or full PDO support.
I'm tempted by the basic Fasthosts Linux home package, as I've used them before for a client, but I'm not sure if their shared hosting includes curl/PDO support (I know their dedicated servers do, as I've just checked).
I've just sent an email to Fasthosts asking, but are there any other hosts I should be looking at?
The site's nothing critical, with pretty low bandwidth needs, but the current host's uptime isn't anything to shout about.
I have a Debian Linux server with Apache2, PHP, etc. set up for a basic website I'm running on it. However, I need cURL enabled, and I can't seem to find any option in the php.ini file to enable it.
I know that you normally just uncomment a line to enable an extension, but I don't see it at all, not even disabled. I opened php.ini in Notepad and used the search feature, and it couldn't find it either.
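On Debian, the cURL extension is usually packaged separately from PHP itself, which would explain why there is no line in php.ini to uncomment. A sketch, assuming a PHP 5 era Debian with PHP installed from apt (package name may differ on other releases):

```shell
# Assumption: Debian with PHP 5 installed from apt.
# The curl extension ships as its own package; installing it drops a
# curl.ini (containing "extension=curl.so") into /etc/php5/conf.d/,
# so nothing needs to be uncommented in php.ini itself.
apt-get install php5-curl

# Restart Apache and confirm the module is loaded:
/etc/init.d/apache2 restart
php -m | grep -i curl
```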
I want to install curl on my server; it's CentOS 4.5. Also, SL told me I have around 30K mails pending on the server. Can anyone tell me how to remove them from the queue?
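For the pending mail, a common recipe, assuming the MTA is Exim (typical on cPanel/CentOS boxes; if you run Sendmail or Postfix the commands differ):

```shell
# Assumption: the MTA is Exim.
exim -bpc          # count of messages currently in the queue
exim -bp | head    # peek at the first few queue entries

# Remove every message from the queue (destructive: there is no undo).
# exim -bp prints one line per message; field 3 is the message ID.
exim -bp | awk '/^ *[0-9]+[mhd]/ {print $3}' | xargs -r exim -Mrm
```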
This is what I have done so far. Where do I go from here? "Then use --with-curl in PHP's configure" was the last step in the instructions, but I don't know what it means.
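That step just means passing the --with-curl flag when rebuilding PHP from source. A sketch, with hypothetical paths (adjust the source directory and the curl install prefix to your setup):

```shell
# Hypothetical paths -- adjust to wherever the PHP source was unpacked
# and wherever curl was installed.
cd /usr/local/src/php-5.2.x

# Point configure at the prefix containing curl's headers and libraries,
# along with whatever other options you already use:
./configure --with-curl=/usr/local
make
make install
```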
I have been attempting to get curl installed on my server, but seem to have run into a snag. When I type:
curl-config --version
I get a response that I have: libcurl 7.16.0
However, when I try to install WHMCS, it still reports that I do not have cURL with SSL support. I have used /scripts/easyapache, but there was never a cURL option under 6 or 7. I also tried cPanel WHM's Apache Update to 4.4.4 and checked both cURL options and OpenSSL, but still no luck.
I have also restarted services and rebooted the server.
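Note that curl-config reports on the system libcurl, while WHMCS checks what PHP's curl extension reports, and the two can disagree. A sketch to compare them (assumes the php CLI binary is on your PATH):

```shell
# What the system libcurl supports:
curl-config --version
curl-config --feature | grep -i ssl

# What PHP's curl extension actually reports (this is what WHMCS sees):
php -r 'var_dump(function_exists("curl_init"));'
php -r '$v = curl_version(); echo $v["version"], " SSL: ", isset($v["ssl_version"]) ? $v["ssl_version"] : "none", "\n";'
```

If the first pair shows SSL but the second does not, PHP was built against a different (or SSL-less) libcurl and needs rebuilding.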
After following the "perfect server setup - CentOS 5.2" guide, I have set up a home server on my DSL connection and installed Openfire with relative ease. I have a paid hosting server which runs my website, but I want it to access the userservice plugin of Openfire (installed on my new home webserver) to add/remove users.
After trying fopen and cURL to send GET data to my home server without any luck, I did some reading and came across the Snoopy PHP class. Snoopy now lets me fetch the default Apache test page on my home server, but when I point it at my Openfire admin console on port 9090 it throws a timeout error (though I think this may be a bug in the Snoopy class?).
cURL and fopen let me get data from Google and some other sites, but not from my home server.
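One way to rule out the Snoopy class is a bare cURL request against port 9090 with explicit timeouts. A sketch: the hostname is a placeholder, and the userservice URL is only illustrative of the plugin's query-string style.

```php
<?php
// Placeholder URL -- substitute your home server's hostname and the
// actual userservice parameters (secret, username, etc.).
$ch = curl_init('http://home.example.com:9090/plugins/userService/userservice?type=add');
curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
curl_setopt($ch, CURLOPT_CONNECTTIMEOUT, 10);  // fail fast if the port is unreachable
curl_setopt($ch, CURLOPT_TIMEOUT, 30);
$body = curl_exec($ch);
if ($body === false) {
    // A connect timeout here points at the router/firewall (port 9090 not
    // forwarded, or the ISP blocking it), not at the client library.
    echo 'cURL error: ', curl_error($ch), "\n";
}
curl_close($ch);
```

If this times out too, the problem is on the network path to port 9090 rather than in Snoopy.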
I need curl installed with SSL support; one of the programs I am trying to use on my servers requires it. How do I find and install these? Running CentOS 5.
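On CentOS 5 the stock packages already provide curl linked against OpenSSL, so this can usually be done from the base repos:

```shell
# CentOS 5: install curl and OpenSSL, plus their headers (the -devel
# packages), which you'll need if you later compile PHP against them.
yum install curl curl-devel openssl openssl-devel

# Confirm the installed curl was built with SSL:
curl --version          # the first line should mention OpenSSL
curl-config --feature   # should list SSL
```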
I have one server with multiple websites hosted on it. It's powered with Cpanel / WHM.
When I run the command 'wget mydomain.com' from this server, I download a Cpanel / WHM landing page instead of the actual homepage from mydomain.com.
I can successfully load mydomain.com in a browser. I can also run the 'wget' command from my local computer and download the correct homepage.
When I run this command from the server mydomain.com is hosted on, Apache returns the home page for the default virtual host on my IP address (i.e. the cPanel landing page). I encounter this same problem when using cURL or PHP sockets.
I am behind a firewall. Could this be causing the issue? Does anyone have any ideas how I could fix this?
This is a big problem as I have websites that need to use a web service from a domain hosted on the same machine.
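A common cause of this symptom is the server resolving its own domain to an address where Apache serves the default vhost (for example when the firewall doesn't loop external NAT back to the right place). One hedged workaround is to pin the domain in /etc/hosts so local requests hit the correct vhost directly; the IP below is a placeholder:

```shell
# 203.0.113.10 is a placeholder -- use the IP on which Apache actually
# serves the mydomain.com virtual host.
echo '203.0.113.10 mydomain.com www.mydomain.com' >> /etc/hosts

# Verify the server now fetches its own site instead of the default vhost:
wget -qO- http://mydomain.com | head
```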
A friend of mine asked me what "CURL named lookup issue resolved" means. He's got a CentOS 4 machine with the latest cPanel/WHM builds. I tried recompiling Apache and even cPanel at some point, but no good.
Is there a way to configure open_basedir to allow cURL to use CURLOPT_FOLLOWLOCATION? I'd like to keep open_basedir enabled for security, but at the same time I need my site to function properly, and that isn't possible with CURLOPT_FOLLOWLOCATION not working.
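PHP deliberately refuses CURLOPT_FOLLOWLOCATION whenever open_basedir (or safe_mode) is in effect, so there is no ini setting that re-enables it. The usual workaround is to follow redirects by hand. A minimal sketch (note it does not resolve relative Location values):

```php
<?php
// Follow HTTP redirects manually, since PHP disables CURLOPT_FOLLOWLOCATION
// when open_basedir or safe_mode is set.
function curl_fetch_following($url, $maxRedirects = 5)
{
    for ($i = 0; $i <= $maxRedirects; $i++) {
        $ch = curl_init($url);
        curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
        curl_setopt($ch, CURLOPT_HEADER, true);   // we need headers to read Location:
        $response = curl_exec($ch);
        $code = curl_getinfo($ch, CURLINFO_HTTP_CODE);
        $headerSize = curl_getinfo($ch, CURLINFO_HEADER_SIZE);
        curl_close($ch);

        if ($code >= 300 && $code < 400 &&
            preg_match('/^Location:\s*(\S+)/mi', substr($response, 0, $headerSize), $m)) {
            $url = $m[1];   // caveat: relative Location values would need resolving
            continue;
        }
        return substr($response, $headerSize);    // final body
    }
    return false;   // too many redirects
}
```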
Starting proftpd: - mod_tls/2.2.1: compiled using OpenSSL version 'OpenSSL 0.9.8i 15 Sep 2008' headers, but linked to OpenSSL version 'OpenSSL 0.9.8g 19 Oct 2007' library
proftpd has its own set of issues: it was obviously built with the 0.9.8i headers and linked to the 0.9.8g library. Any ideas what I did? I recompiled and restarted everything, and I removed the g and i libraries completely. OpenSSH seems happy and nothing is actually "wrong"; the server is working fine. But I'm really anal retentive this way... it's kind of how I feel "safer" at the OS level.
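The warning means the compiler saw one OpenSSL's headers while the runtime loader resolved libssl to a different copy. These commands show which copy wins at each stage (assuming the headers live in the standard /usr/include location):

```shell
# Which OpenSSL shared library the proftpd binary actually loads at runtime:
ldd $(which proftpd) | grep -i ssl

# Which OpenSSL version the headers on disk declare
# (this is what proftpd was compiled against):
grep OPENSSL_VERSION_TEXT /usr/include/openssl/opensslv.h
```

If the two disagree, either update the headers to match the installed library or rebuild proftpd after removing the stale copy, then run ldd again to confirm.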
I'm trying to write a script interfacing with WHM again via a remote access key. What I want to achieve is to remove an IP from iptables using a PHP (cURL) script. Any thoughts on how I can do that? I know how to do it with an SSH command, but I don't know if it will work via a PHP (cURL) script connecting to WHM with a remote access key.
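WHM authenticates a remote access key via a "WHM" Authorization header, but its stock API has no iptables function, so the endpoint below is a stand-in for a custom CGI script you would install on the WHM server to run the iptables command itself. A sketch with placeholder host, hash path, and endpoint:

```php
<?php
// Placeholders: the hostname, accesshash path, /cgi/unblock_ip.cgi endpoint,
// and IP are all assumptions. WHM's stock XML-API cannot touch iptables;
// unblock_ip.cgi stands in for a custom script you host on the WHM server.
$whmHost    = 'server.example.com';
$accessHash = str_replace(array("\r", "\n"), '', file_get_contents('/path/to/accesshash'));

$ch = curl_init("https://{$whmHost}:2087/cgi/unblock_ip.cgi?ip=" . urlencode('203.0.113.50'));
curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
curl_setopt($ch, CURLOPT_SSL_VERIFYPEER, false);  // self-signed cert is typical on :2087
curl_setopt($ch, CURLOPT_HTTPHEADER, array("Authorization: WHM root:{$accessHash}"));
$result = curl_exec($ch);
curl_close($ch);
```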
I'm testing CSF with cPanel, and all is good at first, but I noticed that outgoing curl connections are blocked, and I can't just add a port to iptables because curl uses a different source port each time.
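CSF filters outgoing connections by destination port, not by the random ephemeral source port curl picks, so the fix is usually adding the destination ports to TCP_OUT. A sketch of the relevant csf.conf setting, assuming the curl requests go to HTTP/HTTPS:

```shell
# /etc/csf/csf.conf -- csf matches outgoing traffic on the *destination*
# port, so curl's changing source port doesn't matter. Make sure the ports
# your curl calls target (here 80 and 443) appear in TCP_OUT, e.g.:
#
#   TCP_OUT = "20,21,25,53,80,110,113,443"

# After editing, reload the ruleset:
csf -r
```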
I have a script that makes a PHP cURL call to download a large file (~20 MB) to the local server.
While it's doing the cURL call, I open a new window and keep working with the site (e.g. opening a simple image), and the browser just says "Waiting for reply". As soon as the cURL call stops, the server works again.
What could cause this?
The server is dedicated and I'm the only one using it: Intel Core2Quad Q6600, 2GB DDR RAM, Linux CentOS 5 32-bit.
While it's doing the curl call, I ran uptime and the load is 0.00 0.00 0.00.
From my PHPINFO:
Code:
Apache Version: Apache/1.3.39 (Unix) mod_ssl/2.8.30 OpenSSL/0.9.8b
Apache Release: 10339100
Apache API Version: 19990320
Hostname:Port: localhost:80
User/Group: apache(100)/500
Max Requests: Per Child: 1000 - Keep Alive: off - Max Per Connection: 100
Timeouts: Connection: 120 - Keep-Alive: 1
Server Root: /etc/httpd
Loaded Modules: mod_php5, mod_ssl, mod_setenvif, mod_so, mod_headers, mod_expires, mod_auth_anon, mod_auth, mod_access, mod_rewrite, mod_alias, mod_userdir, mod_actions, mod_imap, mod_asis, mod_cgi, mod_dir, mod_autoindex, mod_include, mod_info, mod_status, mod_negotiation, mod_mime, mod_log_referer, mod_log_agent, mod_log_config, mod_env, mod_vhost_alias, http_core
HTTP_CONNECTION: Keep-Alive

php.ini
Code:
max_execution_time = 1000     ; Maximum execution time of each script, in seconds
max_input_time = 1000         ; Maximum amount of time each script may spend parsing request data
;max_input_nesting_level = 64 ; Maximum input variable nesting level
memory_limit = 128M           ; Maximum amount of memory a script may consume (128MB)

[MySQL]
; Allow or prevent persistent links.
mysql.allow_persistent = Off
; Maximum number of persistent links. -1 means no limit.
mysql.max_persistent = -1
; Maximum number of links (persistent + non-persistent). -1 means no limit.
mysql.max_links = -1
httpd.conf
Code:
MinSpareServers 5
MaxSpareServers 10
StartServers 5
MaxClients 150
MaxRequestsPerChild 1000

httpd-mpm.conf - changing this made no difference
I've tried Lighttpd and Apache. Server is QuadCore 1 cpu.
With either one, when I have one long curl download running in Firefox and open another window to connect to the site, it waits until the curl process completes.
In Lighttpd I have php 5.2.5 fastcgi and xcache.
There are no errors. It simply waits until the process is completed, and one second later the second request (second browser) starts.
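This is the classic symptom of PHP's file-based session locking: the long-running script holds the session file locked, so every other request from the same browser session blocks until it finishes (which is why server load stays at zero). The usual fix is releasing the lock before the long transfer. A sketch with placeholder URL and destination path:

```php
<?php
session_start();
// ... read whatever session data this script needs ...

// Release the session file lock *before* the long download; otherwise every
// other request from the same browser session blocks until this script ends.
// (After this call, writes to $_SESSION are no longer saved.)
session_write_close();

$fp = fopen('/tmp/bigfile.dat', 'w');               // local destination (placeholder)
$ch = curl_init('http://example.com/bigfile.dat');  // placeholder URL
curl_setopt($ch, CURLOPT_FILE, $fp);                // stream straight to disk
curl_setopt($ch, CURLOPT_TIMEOUT, 0);               // no time limit for the ~20 MB transfer
curl_exec($ch);
curl_close($ch);
fclose($fp);
```

This explains why it happens under both Apache and Lighttpd: the lock lives in PHP's session handler, not in the web server.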