When I run the PHP code below, I get this error:

"XML Parsing Error: junk after document element Warning: DOMDocument::load() [domdocument.load]: I/O warning : failed to load external entity "" in line 5"

I have no idea why the error is thrown. Please help.
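Without seeing the code, a minimal way to pin this down is to let libxml report exactly where parsing breaks. "Junk after document element" usually means a second root element, or stray text or markup after the root element's closing tag. A sketch (not your exact code):

```php
<?php
// Capture libxml's parse errors instead of letting them surface as warnings,
// so you can see the line and message for each problem in the document.
function xmlParseErrors($xml)
{
    libxml_use_internal_errors(true);
    libxml_clear_errors();
    $doc = new DOMDocument();
    $doc->loadXML($xml);                // use $doc->load($url) for a file or URL
    $errors = libxml_get_errors();
    libxml_clear_errors();
    return $errors;
}

// Two root elements trigger the "junk after document element" class of error:
foreach (xmlParseErrors('<a/><b/>') as $e) {
    printf("Line %d: %s\n", $e->line, trim($e->message));
}
```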
I am calling a web service located at an HTTPS URL, and I get a "failed to load external entity" error when calling it. From searching Google, it seems that in PHP 5 the SOAP class will not parse a WSDL file served over a secure HTTPS connection. What is the solution? I don't want to use HTTP instead of HTTPS.
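PHP's SoapClient can fetch an HTTPS WSDL as long as the OpenSSL extension is enabled; without it PHP has no https:// stream wrapper, and "failed to load external entity" is the typical symptom. A sketch, with a placeholder WSDL URL:

```php
<?php
// Assumption: the URL is a placeholder for the real HTTPS WSDL.
// Prerequisite: the OpenSSL extension is loaded (extension=php_openssl.dll
// on Windows), otherwise PHP cannot open https:// streams at all.
ini_set('default_socket_timeout', 5);   // keep a failed fetch from hanging

$context = stream_context_create(array(
    'ssl' => array(
        'verify_peer'      => true,
        'verify_peer_name' => true,
        // 'cafile' => '/path/to/ca-bundle.pem',  // if the server's CA is not trusted by default
    ),
));

if (class_exists('SoapClient')) {
    try {
        $client = new SoapClient('https://example.com/service?wsdl', array(
            'stream_context' => $context,
            'cache_wsdl'     => WSDL_CACHE_NONE,   // rule out a stale cached WSDL
        ));
    } catch (SoapFault $f) {
        // With the placeholder URL this reports the fetch failure; with the
        // real HTTPS WSDL it should construct cleanly.
        echo 'WSDL load failed: ' . $f->getMessage();
    }
}
```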
I was given the URL for a web service, and I need to consume the methods from that service, connecting to it through Hamachi. The thing is, when I use the StrikeIron Analyzer I can reach the web service and see the methods there. In PHP, I create my object...
$soapClient = new SoapClient("http:18.104.22.168:13080/sc62server/PWS/AnswersWM.wsdl", array("soap_version" => SOAP_1_2));
I get an error...
Fatal error: SOAP-ERROR: Parsing WSDL: Couldn't load from 'http:22.214.171.124:13080/sc62server/PWS/AnswersWM.wsdl' : failed to load external entity "http:126.96.36.199:13080/sc62server/PWS/AnswersWM.wsdl" in C:\xampp\htdocs\ITSTK\Modules\Client\index.php
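Look closely at the URL in both the call and the error message: it begins `http:` with no `//` before the host, so it is not a valid URL and libxml cannot load anything from it. A sketch with a well-formed address (the host is a placeholder for the real service):

```php
<?php
// Placeholder host -- substitute the real service address. The key detail
// is the "http://" scheme with both slashes before the host.
$wsdl = 'http://example.com:13080/sc62server/PWS/AnswersWM.wsdl';

ini_set('default_socket_timeout', 5);   // don't hang long if the host is unreachable

if (class_exists('SoapClient')) {
    try {
        $soapClient = new SoapClient($wsdl, array(
            'soap_version'       => SOAP_1_2,
            'connection_timeout' => 5,
        ));
        print_r($soapClient->__getFunctions());   // list the operations the WSDL exposes
    } catch (SoapFault $f) {
        echo 'Could not load WSDL: ' . $f->getMessage();
    }
}
```

Note how `parse_url` sees the two forms: with `http://` it extracts a host; with the bare `http:` scheme the whole remainder is treated as a path and there is no host to connect to.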
Whenever I type a pear command I get the following error:
Failed loading /usr/lib/php5/20060613/xdebug.so: /usr/lib/php5/20060613/xdebug.so: cannot open shared object file: No such file or directory
This started happening after I uninstalled the extension. I uninstalled it because I realized I had already installed it via Ubuntu's package manager. Now I just keep getting this error. I suppose I could install it again with pear, but I would rather just get rid of the error.
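The error means some php.ini (or a file under conf.d) still contains a line loading the extension whose .so file was removed. A sketch of what to look for, assuming the Ubuntu PHP 5 layout shown in the error; remove or comment out the matching line:

```ini
; Check /etc/php5/cli/php.ini, /etc/php5/apache2/php.ini, and files under
; /etc/php5/conf.d/ (e.g. xdebug.ini) for a line like this:
zend_extension=/usr/lib/php5/20060613/xdebug.so
```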
All of a sudden, after everything worked fine for over four months, I get this error when sending external e-mails. When my script sends e-mails internally there is no issue, but sending outside gives me this error. Any ideas on how I can track down the cause?
I have referred to many references but I still get the same error. One of my references is [URL]. Here is my code:
<?php
session_start();
include 'includes/db_connect.php';
include 'phpmailer/class.phpmailer.php';
include 'phpmailer/class.smtp.php';
$title = "Application for Venue Change";
$subject = $title;
$name = $_REQUEST["name"];
[Code]...
Quote: "This part of the error message - 'Language string failed to load' - means that you did not copy the 'language' folder that is part of the PHPMailer script to your server."

Based on that information about the language folder, I also uploaded it to my server (I downloaded PHPMailer and put the files on the server). I only changed class.phpmailer.php and class.smtp.php to configure my:

1. host
2. port
3. username
4. password
My path for PHPMailer is AVC_VT/phpmailer. Did I miss something?
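One thing worth checking: the old class-file PHPMailer looks for language/phpmailer.lang-en.php relative to the path it is given, and that default path may not resolve from where your script runs. You can point it at the folder explicitly with SetLanguage. A sketch against that old API (host and credentials are placeholders; the guard just lets the snippet parse without the library present):

```php
<?php
// Assumes the layout from the post: phpmailer/class.phpmailer.php plus a
// phpmailer/language/ folder uploaded alongside it.
if (file_exists('phpmailer/class.phpmailer.php')) {
    require_once 'phpmailer/class.phpmailer.php';
    require_once 'phpmailer/class.smtp.php';

    $mail = new PHPMailer();
    $mail->IsSMTP();
    $mail->Host     = 'smtp.example.com';   // placeholder
    $mail->Port     = 587;
    $mail->SMTPAuth = true;
    $mail->Username = 'user@example.com';   // placeholder
    $mail->Password = 'secret';             // placeholder

    // "Language string failed to load" means PHPMailer could not find
    // phpmailer.lang-en.php under the language path, so set it explicitly:
    $mail->SetLanguage('en', 'phpmailer/language/');
}
```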
I am having a problem with the initial page load of a site after a refresh. I use Chrome for development, but the problem is evident across all browsers. I get the message "Waiting for mysite.com..." and the loading indicator keeps spinning. This goes on for about 10 seconds, then Chrome gives up and tells me "Unable to load resource 'foobah.png'" in the developer console. The image is definitely at the stated path. Sometimes it loads, sometimes it doesn't. If I remove the reference to this image altogether, the error falls on a different image within the page. It is a regular but intermittent fault.
Does anyone have any experience of this type of page load problem?
Example URL: http://dev.thebeer.co/imgs/UI/inboxIcon.png Chrome Developer Tools Screens:
UPDATE: I have found that the following errors in http error log seem to be linked with the problem. Anyone know what they mean and how to solve them?
I am stumped. The code is exactly the same and they are both hosted on the same Rackspace . . . yet one works fine and the other generates an error:
[domdocument.load]: failed to open stream: HTTP request failed!
The code is fairly simple:
$doc = new DOMDocument();
$page_url = **valid XML feed**
$doc->load($page_url);
So I do not understand what is causing the error. Everything was working fine until yesterday, so it must have been something in one of the (unfortunately very numerous) patches implemented yesterday. I just need an idea of where to start looking. Update: some investigating reveals it is the feed's problem (Indeed's job API); I switched to a different job feed provider and everything is working.
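For future debugging, it helps to separate the two failure modes: the HTTP fetch and the XML parse. If the fetch itself fails, the problem is on the feed's side (timeout, 4xx/5xx, blocked request), which is what switching providers confirmed here. A sketch:

```php
<?php
// Returns a DOMDocument on success, or null if either the HTTP request
// or the XML parse failed -- so you can tell which side broke.
function fetchXml($url)
{
    $xml = @file_get_contents($url);
    if ($xml === false) {
        return null;                    // HTTP request failed
    }
    $doc = new DOMDocument();
    libxml_use_internal_errors(true);
    if (!$doc->loadXML($xml)) {
        return null;                    // fetched OK, but not well-formed XML
    }
    return $doc;
}

// Works with any stream wrapper; a data:// URL makes a self-contained demo:
$doc = fetchXml('data://text/plain,<jobs><job/></jobs>');
echo $doc ? $doc->documentElement->tagName : 'failed';   // prints "jobs"
```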
So I was writing a piece of functionality where I was getting the parent category ids of a category, and I wanted to get the url_key of one of the parents in Magento. I wanted to do this without having to load the category, and found the method getAttributeRawValue, which lives in /app/code/core/Mage/Catalog/Model/Resource/Eav/Mysql4/Abstract.php. Basically, it seems unnecessary to load a whole entity just to get one attribute value for that entity id. It gets me what I need, but I am wondering if it is all that much better than loading the category and reading the attribute. Is loading the category essentially doing the same thing, but doing it for every attribute?
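For comparison, the two approaches look like this in Magento 1 (the category id is a placeholder; the guard just lets the snippet parse outside a bootstrapped Magento environment). getAttributeRawValue issues a single targeted query for the one attribute, while load() runs the full EAV load and hydrates every attribute of the entity, so the raw-value call is indeed the lighter option when you need just one value:

```php
<?php
// Requires a bootstrapped Magento 1 app (app/Mage.php) to actually run.
if (class_exists('Mage')) {
    $categoryId = 42;                              // hypothetical category id
    $storeId    = Mage::app()->getStore()->getId();

    // One targeted query, no full entity load:
    $urlKey = Mage::getResourceModel('catalog/category')
        ->getAttributeRawValue($categoryId, 'url_key', $storeId);

    // Versus loading the whole category (full EAV load, every attribute)
    // just to read a single value:
    $urlKey2 = Mage::getModel('catalog/category')->load($categoryId)->getUrlKey();
}
```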
I am having problems loading an XML file from a URL. It works fine when run locally using XAMPP, but once uploaded to my website (which is hosted by sourceforge.net) the file fails to load. The class I wrote to parse the RSS XML file is posted below.
I have a database that stores items and a link to an image for each item. The images can't be stored locally (there wouldn't be enough space, among other reasons), so I link to images on other servers.

When I want to display the items (with their images), I run a while loop with mysql_fetch_array like this (simplified):

But the image load takes forever, because some of the external images are large, and I only need them 200px wide. Is there a way to shrink the images on the fly, or some other way to speed up the loading?
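One approach, assuming the GD extension is available: download each remote image once, scale it to 200px wide, and cache the thumbnail locally, so later page loads serve the small local copy instead of the large remote original. A sketch of the resize step:

```php
<?php
// Takes raw image bytes, returns JPEG bytes scaled to $targetWidth
// (aspect ratio preserved), or null if the input is not a decodable image.
function thumbnail($srcData, $targetWidth)
{
    $src = @imagecreatefromstring($srcData);
    if ($src === false) {
        return null;
    }
    $w = imagesx($src);
    $h = imagesy($src);
    $targetHeight = (int) round($h * $targetWidth / $w);

    $dst = imagecreatetruecolor($targetWidth, $targetHeight);
    imagecopyresampled($dst, $src, 0, 0, 0, 0, $targetWidth, $targetHeight, $w, $h);

    ob_start();
    imagejpeg($dst, null, 85);          // 85 = quality; tune to taste
    return ob_get_clean();
}

// Usage sketch: $small = thumbnail(file_get_contents($row['image_url']), 200);
// then write $small to a cache file and point the <img> tag at that file.
```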
How can I import a variable from an external file? What I want to do is have a configuration file in which I write all my website settings, and then import those settings into every file, so I can set the website skin and similar things.
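A minimal sketch of the common pattern: one config file returns an array of settings, and every page require()s it. The file name and keys here are examples, not fixed names (the sketch writes the config file to a temp directory just so it is self-contained; in practice you would keep a config.php in your project):

```php
<?php
// Create an example config.php (normally you would just have this file
// in your project):
$configFile = sys_get_temp_dir() . '/site_config_demo.php';
file_put_contents($configFile,
    "<?php\nreturn array(\n    'skin'      => 'blue',\n    'site_name' => 'My Website',\n);\n");

// In any page that needs the settings:
$config = require $configFile;          // normally: require 'config.php';
echo $config['skin'];                   // prints "blue"
```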
How would one load large XML files from an external URL and display the contents as quickly as possible? Right now I'm loading with simplexml_load_file, and it takes about 12 seconds to load the file, which is far too long. I copied the file locally and it loads super fast. Maybe there's a way to read that external XML file and save it locally, so I could just read the local copy.
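A sketch of exactly that "save it locally" idea: refresh the cached copy only when it is older than a time-to-live, so most requests parse the fast local file and only the occasional request pays for the slow remote fetch. The URL, cache path, and TTL are placeholders:

```php
<?php
// Returns a SimpleXMLElement from the local cache, refetching the remote
// feed only when the cache is missing or older than $ttl seconds.
function loadFeedCached($url, $cacheFile, $ttl = 600)
{
    $stale = !file_exists($cacheFile) || (time() - filemtime($cacheFile)) > $ttl;
    if ($stale) {
        $fresh = @file_get_contents($url);   // the slow remote fetch
        if ($fresh !== false) {
            file_put_contents($cacheFile, $fresh);
        }
    }
    if (!file_exists($cacheFile)) {
        return false;                        // no cache and the fetch failed
    }
    return simplexml_load_file($cacheFile);  // the fast local parse
}
```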
I'd like to grab the load time of an external PHP document. That is, have a script load the page contents as if it were just another user behind a browser, and return how long that took in seconds. Right now I can determine that the server is up and get a ping-like response time, but that isn't much use to me. I don't know where to start, or how difficult what I'm asking is.
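One approach, assuming the cURL extension is available: fetch the page the way a browser's first request would, and read the total transfer time from cURL. A sketch (the URL in the usage comment is a placeholder):

```php
<?php
// Returns the total transfer time in seconds for a full GET of $url,
// or null if the request failed.
function pageLoadSeconds($url)
{
    $ch = curl_init($url);
    curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);   // capture the body instead of printing it
    curl_setopt($ch, CURLOPT_FOLLOWLOCATION, true);   // follow redirects like a browser
    curl_setopt($ch, CURLOPT_CONNECTTIMEOUT, 10);
    curl_setopt($ch, CURLOPT_TIMEOUT, 30);
    $body = curl_exec($ch);
    $seconds = curl_getinfo($ch, CURLINFO_TOTAL_TIME);
    curl_close($ch);
    return $body === false ? null : $seconds;
}

// Usage sketch: printf("%.3f s\n", pageLoadSeconds('http://example.com/'));
```

Note the caveat: this measures the HTML document only, not the images, CSS, and scripts a real browser would then fetch, so it is a lower bound on what a user experiences.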
I'm creating a site for a university where we have to include the university's main navigation at the top of the page. Rather than re-create their navigation, I want to load it dynamically. First I thought I could use an iframe and just trim it to the navigation. However, when a link is clicked it opens in that frame, whereas I want it to open in a new window. So my new plan is to pull in the nav with cURL (or another way, if someone has a suggestion). What's the best way to go about this? I'm fairly new to cURL, so I'd love some example code.
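A sketch of the cURL approach: fetch the university's page server-side, then cut out the navigation markup with DOM. The element id "nav" is an assumption; inspect the real page for whatever element actually wraps their navigation:

```php
<?php
// Extract the element with the given id from an HTML string, returning
// its markup (or '' if not found).
function extractNav($html, $id = 'nav')
{
    $doc = new DOMDocument();
    libxml_use_internal_errors(true);   // real-world HTML is rarely valid XML
    $doc->loadHTML($html);
    $nav = $doc->getElementById($id);
    return $nav ? $doc->saveHTML($nav) : '';
}

// Fetch the remote page and pull out the nav (URL is a placeholder):
function fetchNav($url, $id = 'nav')
{
    $ch = curl_init($url);
    curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
    curl_setopt($ch, CURLOPT_FOLLOWLOCATION, true);
    $html = curl_exec($ch);
    curl_close($ch);
    return $html === false ? '' : extractNav($html, $id);
}
```

Two practical notes: relative hrefs in the copied markup will point at your own domain, so rewrite them to absolute URLs (or emit a `<base>` tag), and cache the result rather than re-fetching the university's page on every view.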
I'm currently creating a small application for a website I'm working on. The calendar I'm using is in the CakePHP plugins folder. I have to display events depending on the city or view id, and I want the URL like this.

I also want to load the Event model and not have to load a 'cities' database table. I already have all the information I need in two tables, events and event_locations; I just need to run a condition to get events according to the id. Here is the original link.
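A sketch against CakePHP 1.x-style conventions (the controller, action, and field names are assumptions based on the tables described): load the Event model on demand inside the action and filter by the location id from the URL, with no Cities model or table involved. The guard just lets the snippet parse outside a CakePHP app:

```php
<?php
// Hypothetical controller; requires a CakePHP app (AppController) to run.
if (class_exists('AppController')) {
    class EventsController extends AppController {

        public $uses = array();            // no default model binding

        // e.g. /events/city/5 -- $locationId comes from the URL
        public function city($locationId = null) {
            $this->loadModel('Event');     // attaches $this->Event on demand
            $events = $this->Event->find('all', array(
                'conditions' => array('Event.event_location_id' => $locationId),
            ));
            $this->set('events', $events);
        }
    }
}
```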