For some reason I cannot get cURL working on my web server; can someone help? I have uncommented "extension=php_curl.dll" in php.ini and restarted Apache, and it still isn't working. What else do I need to do?
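A few things are commonly missed on Windows besides uncommenting the line. The extension_dir path below is only an example, and the file to edit must be the php.ini that phpinfo() reports as the "Loaded Configuration File":

```ini
; In the php.ini that Apache actually loads:
extension_dir = "C:\php\ext"   ; example path -- point at your install's ext folder
extension = php_curl.dll
```

On older PHP builds, php_curl.dll also needs libeay32.dll and ssleay32.dll reachable via the PATH (e.g. in the PHP directory), and Apache must be fully stopped and started, not just reloaded.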
I have WampServer with PHP 5.3.0 and Apache 2.2.11, and I want to download a cURL library version suitable for it. From what I have read, the installation will not be easy and may fail if the version doesn't match. I found instructions here: [URL]
but I'm not sure I will get the right result, since my versions are different. I also have another worry: if I can't install it cleanly, could my WAMP server and databases break?
Cannot install PHP cURL on IIS. I think I have tried everything, but my phpinfo() output does not list cURL as available. I have C:\PHP in my PATH and working correctly. I have edited the registry to read from C:\PHP\php.ini, and phpinfo() shows the correct .ini path loaded. I have uncommented 'extension=php_curl.dll', and ssleay32.dll and libeay32.dll are available in C:\PHP.
I have been coding PHP for years and have been a Sys Admin even longer. Unfortunately, I have always used the installer version for Windows since our servers are running IIS.
I currently have PHP 4.4.4 on a particular development machine and now need cURL installed for a website we are developing. Is there any way to add cURL to an IIS machine without having to uninstall PHP, then manually reinstall PHP with cURL support?
I am trying to install the PHP cURL extension in my Windows XP environment, but it fails. Following the manual, I copied libeay32.dll and ssleay32.dll into the %systemroot%\system32 directory, and the ';' in front of 'extension=php_curl.dll' has been removed, but phpinfo() still shows no trace of cURL. Other extensions such as php_gd2.dll and php_pdo.dll can be enabled and disabled without any problems.
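One way to separate "extension not loaded" from "wrong php.ini being read" (a frequent cause on Windows, where the CLI and the Apache module can read different files) is a two-line check; both functions are standard PHP:

```php
<?php
var_dump(extension_loaded('curl'));   // false means the DLL never loaded
echo php_ini_loaded_file(), PHP_EOL;  // the ini file this PHP actually reads
```

If the printed path is not the php.ini you edited, the uncommented line is never seen.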
Curl - Use Libcurl To Pull Files From Server A To Server B?
cURL and libcurl let you do the things you normally do with a browser, right? As if a genie sat inside a server, fired up a browser, and did the clicking. OK then: I need a script that sits on server B, follows download links on server A, and downloads the files (from server A to server B).
I am new to cURL, and not sure whether all I need is a simple GET or something more. I know the Content-Disposition header forces a browser to save the document. Does it have the same effect on libcurl? If it does, I'll make my serving script on server A send that header and then serve the file.
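For the server-B side, a simple GET is usually enough; note that libcurl does not act on Content-Disposition by itself, so the script chooses the local filename. A minimal PHP sketch, with a hypothetical URL and destination path:

```php
<?php
// Sketch: pull one file from server A and save it on server B.
// The URL and local path below are made-up examples.
$url  = 'http://server-a.example.com/files/report.pdf';
$dest = '/var/www/downloads/report.pdf';

$fp = fopen($dest, 'wb');
$ch = curl_init($url);
curl_setopt($ch, CURLOPT_FILE, $fp);            // write the body straight to the file
curl_setopt($ch, CURLOPT_FOLLOWLOCATION, true); // follow redirects from the download link
curl_setopt($ch, CURLOPT_FAILONERROR, true);    // treat HTTP >= 400 as an error

if (!curl_exec($ch)) {
    echo 'Download failed: ' . curl_error($ch) . PHP_EOL;
}
curl_close($ch);
fclose($fp);
```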
Web Server Type Set Up With Curl Capabilities On The Very Same Computer/server?
Regarding CSRF: if I had a web-server setup with cURL capabilities on the very same computer/server that I'm accessing the site from, I could use cURL to fetch the page and scrape the nonce before it expires.

This is all theory, but I have tried it against a script with different levels of (premade) CSRF protection, and I was still able to create a content-generation bot that would log in to the site with CSRF set to maximum protection (no multiple tabs, etc.). I would simply spoof the user agent and grab the needed nonce for submitting to the next page. I did this all in PHP, driving cURL directly from the command line, and was able to generate forum/news posts, upload pictures, and pretty much spam my own site (which did happen during the testing of this bot).

I'm new to CSRF and working on my own script with hopes of keeping it secure, but using nonces seems kind of redundant this way, doesn't it? It's easy to spoof user agents and referers with just cURL, so if I really wanted to automate requests I could do the same to any other site using this method.

Since I am still learning about CSRF protection: is there any suggested reading on the subject, besides this site and Google searching, that any of you could recommend? Also, are there methods of preventing what I just described? I've done some extensive research into cURL and am pretty handy with it now; I just haven't put effort into preventing forged requests made with it.
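For what it's worth, the standard answer is to bind the nonce to the victim's session rather than to anything a scraper can reuse: a bot that fetches the form only ever receives a token tied to its own session, not to the target user's. A minimal sketch (the helper names are made up, and random_bytes() requires PHP 7; older installs need a polyfill):

```php
<?php
session_start();

// Issue a token bound to this session (hypothetical helper name).
function csrf_token(): string {
    if (empty($_SESSION['csrf_token'])) {
        $_SESSION['csrf_token'] = bin2hex(random_bytes(32));
    }
    return $_SESSION['csrf_token'];
}

// Compare the submitted token against the session copy in constant time.
function csrf_check(string $submitted): bool {
    return isset($_SESSION['csrf_token'])
        && hash_equals($_SESSION['csrf_token'], $submitted);
}

// In the form:  <input type="hidden" name="token" value="<?= csrf_token() ?>">
// On submit:    if (!csrf_check($_POST['token'] ?? '')) { http_response_code(403); exit; }
```

Spoofed user agents and referers don't help the attacker here, because the forged request carries the wrong session cookie and therefore the wrong token.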
Imagemagick Be Used Without Having To Install On Server?
I don't know anything about ImageMagick. I need to convert images that users upload to GIF format, and to resize and optimize those images as well. Before I look into it further, can someone clarify whether ImageMagick is server software or a standalone script? In other words, do I need to have it installed on my server, or can I just upload the files and use the script's commands? I'm referring to ImageMagick for PHP.
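To answer the underlying question: ImageMagick is server software, not a standalone script, so it (or the Imagick PHP extension that wraps it) must be installed on the server before PHP can use it. Assuming the Imagick extension is available, the convert-and-resize step might look like this (filenames are examples):

```php
<?php
// Requires the Imagick PHP extension, which wraps an ImageMagick
// install on the server -- it cannot be replaced by an uploaded script.
$img = new Imagick('upload.jpg');   // example input filename
$img->thumbnailImage(300, 0);       // resize to 300px wide, keep aspect ratio
$img->setImageFormat('gif');        // convert to GIF
$img->stripImage();                 // drop metadata to shrink the file
$img->writeImage('upload.gif');     // example output filename
```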
I am using Windows XP and am really new to the programming world. I was trying to download the Apache server but could not choose between the links below, which I found on the Apache site:
Unix Source: httpd-2.2.9.tar.gz [PGP] [MD5]
Unix Source: httpd-2.2.9.tar.bz2 [PGP] [MD5]
Win32 Source: httpd-2.2.9-win32-src.zip [PGP] [MD5]
Win32 Binary without crypto (no mod_ssl) (MSI Installer): apache_2.2.9-win32-x86-no_ssl-r2.msi [PGP] [MD5]
Win32 Binary including OpenSSL 0.9.8h (MSI Installer): apache_2.2.9-win32-x86-openssl-0.9.8h-r2.msi [PGP] [MD5]
Also, what is a Win32 binary, and why haven't they mentioned Windows XP outright? And what are MD5 and PGP? This is the link for downloading the PHP software: [URL].... It also has many links further down the page.
My company uses Netscape Enterprise Server as its web server. If I want to use PHP, how can I install it? Is the install process the same as on a Linux server? If PHP can't be installed, what programming language should I use instead: CGI, JSP, or something else?
So Apache can read the directory just fine, but it has problems saving uploaded files because permission is denied. The scripting language is PHP. How can I allow file uploads without making everything 777 permission?
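The usual fix is to give ownership of only the upload directory to the user Apache runs as, rather than opening everything to 777. The user name "www-data" and the path below are assumptions; check your own setup (e.g. with `ps aux | grep apache`):

```shell
# On a real server (requires root; user and path are examples):
#   sudo chown www-data:www-data /var/www/site/uploads
#   sudo chmod 755 /var/www/site/uploads
# Demonstration of the permission change on a scratch directory (no root needed):
UPLOAD_DIR="$(mktemp -d)/uploads"
mkdir -p "$UPLOAD_DIR"
chmod 755 "$UPLOAD_DIR"
stat -c '%a' "$UPLOAD_DIR"
```

With ownership set this way, 755 is enough: the owning Apache user can write, and everyone else can only read.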
I was using the dblib PDO driver to connect to MS SQL from Ubuntu, and everything was working fine. When I deployed my code to the stage server, the environment was different: the stage server runs Red Hat, and the dblib driver is not installed there. How do I install the dblib PDO driver for PHP on Red Hat?
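A common route on Red Hat-family systems is to install FreeTDS (which pdo_dblib builds against) plus the PHP package that provides the driver. Exact package names vary by RHEL version and enabled repositories (EPEL/Remi), so treat this as a sketch:

```shell
# Package names are typical but vary by RHEL version and enabled repos.
sudo yum install freetds freetds-devel   # FreeTDS, which dblib is built on
sudo yum install php-mssql               # provides pdo_dblib on many setups
sudo systemctl restart httpd             # reload PHP so the module is picked up
php -m | grep -i dblib                   # verify the driver is now listed
```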