Today I had a problem with a PHP script running as a cron job on one of my sites. Lo and behold, as I went through the directories via FTP, I found several hundred small files, all named somewhat like the PHP script path – here’s an example with details tweaked to protect the guilty:
myscript.php_p=1234
myscript.php_p=1234.1
myscript.php_p=1234.2
...etc
The cron job to run it was this one:
*/15 * * * * wget http://egwebsite.com/myscript.php?p=1234
Which loads the myscript.php web page every 15 minutes, doing its business.
WGET is a handy tool to grab pages via URL rather than direct PHP calls – in this case, I needed to pass a parameter via $_GET (?p=1234), so a direct PHP call was awkward. I also wanted the option to call the file from my browser from time to time, so storing it outside the HTML document directories was not an option either.
However, I had cruft – and the cruft in my root directory was obviously related to this. By default, WGET saves each page it fetches to a file named after the URL, adding .1, .2 and so on when the name already exists – so each of those files was the page output, the same as if I had loaded it in my browser.
What to do?
- The obvious answer in hindsight would be to dig around for the WGET option to discard its output, so it creates no file at all. However, I originally thought the problem was ALL cron jobs, not just ones using WGET. Nonetheless, if you want, here is an option I found online (although I didn’t test it, so caveat emptor), which writes the page to stdout and then trashes it:
wget -qO- 'http://egwebsite.com/myscript.php?p=1234' > /dev/null 2>&1
Note the > /dev/null 2>&1 form rather than bash’s &> – cron runs its jobs through /bin/sh, which doesn’t understand &>. Quoting the URL also protects the ? from the shell.
- Call PHP directly. While I was blaming cron generally, I tried a direct PHP call, like this:
php /home/egwebsitecom/myscript.php >/dev/null 2>/dev/null
On HostGator at least, this worked. One awkward thing: the command line parameters, since $_GET is empty when PHP runs from the command line. To handle that, I ended up creating two scripts: myscript0.php, which calls myscript.php.
In myscript0.php I simply did this:
define("MYKEYVALUE",1);
include 'myscript.php';
and in myscript.php I did this:
if (defined('MYKEYVALUE'))
{
// code to set the GET parameter(s) directly
}
else
{
// code to set parameter(s) from the URL GET variables
}
Now my file could be called directly in the browser, or via PHP in cron, bypassing WGET.
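Put together, the two-script pattern looks something like this – a minimal sketch, where the hard-coded 1234 stands in for whatever value the cron run should use:

```php
<?php
// myscript0.php – the cron wrapper: flag that we're running from cron
define("MYKEYVALUE", 1);

// --- in the real setup the wrapper now does: include 'myscript.php';
// --- the contents of myscript.php follow:

if (defined('MYKEYVALUE'))
{
    $p = 1234;                                        // cron run: set the parameter directly
}
else
{
    $p = isset($_GET['p']) ? (int)$_GET['p'] : 0;     // browser run: read it from the URL
}

// ...the rest of myscript.php uses $p either way
```

The browser URL (myscript.php?p=1234) still works unchanged, since the wrapper – and therefore the constant – only exists on the cron side.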
A few notes:
- While trying these out, I recommend adding an email address in cPanel’s cron settings so you get reports – this is the easiest way to confirm everything is working.
- As well, leave off the null-output redirection for now (the >/dev/null 2>/dev/null), so the result is emailed to you, helping with troubleshooting.
- Finally, while testing, increase the cron frequency to, say, every 1 or 2 minutes for a little while, so you get responses fast – but don’t forget to slow it down after you’re done – servers don’t like a once-a-minute cron job!
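Putting those notes together, a testing crontab might look like this (the address and script path here are placeholders; on cPanel the email is set in the interface, but in a raw crontab the equivalent is the MAILTO variable):

```
MAILTO=me@example.com
# every 2 minutes while testing; no /dev/null, so the output is emailed
*/2 * * * * php /home/egwebsitecom/myscript0.php
```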
Thanks for the article. If you don’t have the privileges to set up cron jobs, a third-party webcron service (like Easycron.com) may be a good option for you.
I have been reading many articles of yours, or I presume they are yours, so I must congratulate you on some of the great information you publish. I am a novice at PHP, but I found an article of yours on the internet where you made a PHP multiple random image script using the following:
<?php
// rotate images randomly but w/o dups on same page – format:
// use 'i=0' to rotate image #0, 'i=1' for the second, etc
// (c) 2004 David Pankhurst – use freely, but please leave in my credit
$images=array( // list of files to rotate – add as needed
"bomb.jpg",
"frown.jpg",
"grim.jpg",
"smile.jpg" );
$total=count($images);
$secondsFixed=10; // seconds to keep list the same
$seedValue=(int)(time()/$secondsFixed);
srand($seedValue);
for ($i=0;$i<$total;++$i) // shuffle list
{
   $r=rand(0,$total-1);
   $temp=$images[$i];
   $images[$i]=$images[$r];
   $images[$r]=$temp;
}
$i=(int)$_GET['i']; // which image on the page (0 for first, 1 for second, etc)
header("Location: ".$images[$i%$total]); // redirect browser to the chosen file
?>
What I am wondering, being new to PHP, is whether there is a way to link the pictures to a web page, so that when the pictures change they can be clicked to take that person to a page associated with the picture? I have tried several ways with your script, but have not yet been able to get results. Keep up the great contributions; great reading all over the net
I answer these questions, with the link code, in this post: https://www.utopiamechanicus.com/article/not-so-random-image-rotation-in-php-for-html-the-sequel/