Include file from another domain? (php)
Mac Elite
Join Date: Mar 2001
Location: CO
I've got a file on one site that I'd like to access with an include() on another site (on another host, unfortunately).
It seems that both of these hosts disable allow_url_fopen and allow_url_include, and neither allows ini_set() to turn those options back on.
Is there any other workaround within PHP to pull a text file from another domain?
(I've never studied 'scraping' processes, but I know they exist, so some PHP process ought to be able to do this.)
Many thanks for any clarification.
TOMBSTONE: "He's trashed his last preferences"
Clinically Insane
Join Date: Oct 2001
Location: San Diego, CA, USA
I don't think you can if your host has disabled it. You might try asking them to crack the cell door a little bit. Otherwise, if you don't need to process the data with PHP, could an iframe possibly work?
Chuck
___
"Instead of either 'multi-talented' or 'multitalented' use 'bisexual'."
Junior Member
Join Date: Oct 2003
I've had similar issues, Chuckit. Thanks for the good news/bad news.
The iframe idea suggests some workarounds, though with reduced dynamic-data possibilities (I'd have to export all the PHP processing output into static files, I guess?).
And trying to talk the average shared-server host into "cracking the cell door" doesn't sound like much fun [ah, the costs of shared hosting].
(Last edited by macsfromnowon; Oct 27, 2008 at 03:31 AM. Reason: expand)
Clinically Insane
Join Date: Mar 2001
Location: yes
Ideas:
- create a cronjob to automatically rsync/scp the text file to your server periodically, then include it
- file_get_contents() ( http://us3.php.net/manual/en/functio...t-contents.php ) - this will not parse PHP for you, but it will read the HTML in as a string; the file needs to be accessible via the web, though
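A minimal sketch of the file_get_contents() idea, with a local-cache fallback (the URL and file names are placeholders, not anything from this thread). One caveat: reading a remote URL this way still requires allow_url_fopen to be On on the fetching host, which is exactly what the original poster's hosts disallow.

```php
<?php
// Sketch: fetch remote HTML, falling back to a cached copy on failure.
// Requires allow_url_fopen=On for remote URLs; local paths always work.
function fetch_or_cache($url, $cacheFile)
{
    $html = @file_get_contents($url);         // false on failure, warning suppressed
    if ($html !== false) {
        file_put_contents($cacheFile, $html); // refresh the local cache
        return $html;
    }
    // Fetch failed (fopen wrappers off, network down, ...): serve the cache.
    return is_readable($cacheFile) ? file_get_contents($cacheFile) : '';
}

// Usage (placeholder URL):
// echo fetch_or_cache('http://example.com/news.html', 'cache/news.html');
```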
Mac Elite
Join Date: Mar 2001
Location: CO
Thanks guys,
file_get_contents() sounds promising...
That would mean I'd need to export from the db to a text (HTML) file with PHP on a regular basis.
The host suggested I explore cURL (which my host includes in the PHP config).
It sounds like that would let me set up a dynamic web page on the source site (PHP SELECTing from the db), so that *anyone* fetching it gets the resulting HTML with the very latest db content.
Am I grasping the potential of cURL correctly?
Clinically Insane
Join Date: Mar 2001
Location: yes
cURL just fetches content; it can simulate HTTP requests, including sending basic-auth credentials, GET/POST data, etc. cURL is not a PHP interpreter, so it will not query a database itself: it just triggers the remote PHP file, simulating somebody requesting it.
You need to be careful here, though. What happens if you query the database at the precise moment the file is being regenerated? What if you try to write to it at that precise moment?
I would suggest setting up MySQL data replication, if that's an option. If it is, you're set: you'll need access to a MySQL user with replication privileges, and you'll need to open port 3306 on your firewall.
Otherwise, I would suggest a mysqldump cronjob on the remote server, scp'ing the dump file to your server, and regenerating the data there in a copy of the table. The advantage is that mysqldump adds the appropriate locks, which prevents the conflicting-writes problem I described. To prevent the read problem, I would concoct a PHP script that walks a staging copy of the table (populated from the mysqldump file) row by row, updating records that have changed and inserting new ones. That helps keep people from reading your records while they're being regenerated.
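The cURL fetch described above can be sketched like this, assuming the host's PHP has the libcurl extension loaded (the URL is a placeholder):

```php
<?php
// Sketch: fetch a remote page with PHP's cURL extension.
// Returns the page body as a string, or false on a network/HTTP failure.
function fetch_remote($url)
{
    $ch = curl_init($url);
    curl_setopt($ch, CURLOPT_RETURNTRANSFER, true); // return string, don't echo
    curl_setopt($ch, CURLOPT_FOLLOWLOCATION, true); // follow redirects
    curl_setopt($ch, CURLOPT_TIMEOUT, 10);          // don't hang forever
    $html = curl_exec($ch);
    $err  = curl_errno($ch);
    curl_close($ch);
    return $err === 0 ? $html : false;
}

// Usage (placeholder URL):
// $html = fetch_remote('http://example.com/news.php');
// if ($html === false) { /* back out of the update */ }
```

Checking the return value against false is what lets you "back out" of an update when the network is down, as suggested further along in the thread.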
Mac Elite
Join Date: Mar 2001
Location: CO
Thanks, Besson,
I wonder if I need to go *quite* that hard-core. I don't have to guarantee to-the-nanosecond up-to-dateness.
I'm just thinking that if my other site's page uses cURL to fetch a page whose script SELECTs from the db (and generates the content as HTML), then visitors to the cURLing site will see essentially as up-to-date news as visitors to my primary site.
Does that make sense? (i.e., fetching the "latest news" is no more of an issue on the cURLing site than on the main one, even if I happen to be INSERTing a new record at that same moment.)
Clinically Insane
Join Date: Mar 2001
Location: yes
If you don't need up-to-the-second updates, why not just repopulate your table from a dump file generated on the other server? It seems far simpler to me to handle this at a lower level than at the application level: with cURL, any host that serves your site needs the PHP libcurl extension available, you'll probably want code to back out of the update when the cURL fetch fails (i.e. the network is down, or whatever), etc.
On the other server, if you have SSH access and can do a:
Code:
mysqldump -u <yourusername> -p <yourdatabase> <yourtable> > dumpfile.sql
then all you have to do is scp that dump file (dumpfile.sql) to your server and create a cronjob that will:
Code:
mysql -u <yourusername> -p <yourdatabase> < dumpfile.sql
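Strung together, the two commands above plus the scp step could live in a pair of cron entries, one on each server. Hostnames, paths, and the database/table names are placeholders; since -p prompts interactively, an unattended job would read its credentials from a ~/.my.cnf file, and the scp assumes passwordless SSH keys are set up.

```shell
# Remote (source) server crontab: dump the table hourly.
# Credentials come from ~/.my.cnf, so no interactive -p prompt.
0 * * * * mysqldump yourdatabase yourtable > /home/you/dumpfile.sql

# Your (destination) server crontab: pull the dump half an hour later
# and reload it into the staging copy of the table.
30 * * * * scp you@remote.example.com:/home/you/dumpfile.sql /tmp/dumpfile.sql && mysql yourdatabase < /tmp/dumpfile.sql
```

The && ensures the reload only runs if the scp actually succeeded, which is the "back out on failure" behavior mentioned above.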
Mac Elite
Join Date: Mar 2001
Location: CO
I get it now. Very nice solution.
Muchos!
Mac Elite
Join Date: Mar 2001
Location: CO
FYI:
I have also found that I can create a customized .php page on the target site (generated by PHP from the MySQL db) and then have the client page fetch it with the cURL functions (which I'd never explored before).
The request URL can include GET values, which the target page can process when SELECTing records.
Don't know how I lived without it!
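The GET-values variant can be sketched as follows; the page name and parameter names here are hypothetical, not taken from the actual sites in this thread:

```php
<?php
// Sketch: build a target-page URL carrying GET parameters, which the
// remote PHP script can read from $_GET when SELECTing records.
function build_feed_url($base, array $params)
{
    // http_build_query() URL-encodes keys and values safely.
    return $base . '?' . http_build_query($params);
}

$url = build_feed_url('http://example.com/news.php',
                      array('category' => 'mac', 'limit' => 5));
// $url is now http://example.com/news.php?category=mac&limit=5
// ...hand $url to curl_init()/curl_exec() to fetch the generated HTML.
```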