Search Results

Search found 5564 results on 223 pages for 'scripts'.

Page 126 of 223

  • can server 2008's task scheduler run a php file?

    - by rg89
    I have a Server 2008 64-bit machine with PHP 5 installed via FastCGI. I want to run a .php script every day at 3 AM. I set up a task, and "Last Run Result" says "%1 is not a valid Win32 application". The event properties give more detail on the failure: "Task Scheduler failed to launch action "D:\InetPub\tools\something\build.php" in instance "{88cc01f4-9554-4b8f-9836-34d806337d7f}" of task "\Something". Additional Data: Error Value: 2147942593." Task Category: Action failed to start. Is it possible to run scripts using the Task Scheduler? If not, how should I go about automating the execution of a PHP script? Thanks
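
    That error typically means the scheduler is trying to execute the .php file itself as a program. A minimal sketch of pointing the task at the PHP binary instead, with the script as an argument (the path C:\PHP\php.exe is an assumption; depending on the install the CLI binary may be php.exe or php-cgi.exe):

        schtasks /Create /TN "RunBuildPhp" /SC DAILY /ST 03:00 ^
            /TR "\"C:\PHP\php.exe\" \"D:\InetPub\tools\something\build.php\""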

    Read the article

  • How to reliably run a batch job every 5 seconds?

    - by Benjamin
    I'm building an application where the sending of all notifications (email, SMS, fax) will be asynchronous. The application will write the notifications to the database, and a batch job will read these notifications and send them with the appropriate transport. I first looked at ways to run cron more often than once a minute, and realized this was a bad idea. The batch scripts are written in PHP, and I guess that writing a proper daemon would be quite an overhead (though I'm open to any suggestion, as PHP can run indefinitely as well). What I have in mind is a solution that would:
    - Run the PHP script every 5 seconds
    - Check that the previous run has finished, or abort (never 2 concurrent batches running)
    - Kill the script if it has been alive for more than x minutes (a safeguard in case it hangs)
    - Start with the system (if a reboot occurs)
    Any idea how to do this?
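
    One way to meet these requirements (a sketch under assumptions: the notification script lives at /var/www/app/notify.php, flock and timeout are available, and the wrapper is started once per minute from cron, which also covers reboots):

        #!/bin/bash
        # notify-wrapper.sh -- crontab entry: * * * * * /usr/local/bin/notify-wrapper.sh
        LOCK=/var/run/notify.lock
        SCRIPT=/var/www/app/notify.php

        for i in $(seq 1 12); do
            # flock -n skips this round if the previous run still holds the lock;
            # timeout kills a run that has been alive for more than 5 minutes
            flock -n "$LOCK" timeout 300 php "$SCRIPT"
            sleep 5
        done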

    Read the article

  • What's the best Flash Blocker for the Chrome Browser on a Mac?

    - by Bryan Schuetz
    Looking at the extensions gallery for Chromium, there seem to be a number of flash-blocking extensions available, a couple with very similar names even. I've been using ClickToFlash in Safari and am used to it just flat out working everywhere. Unfortunately, after using FlashBlock by Ruzanow for a bit, I've noticed it gets a bit hinky at times (blocking the flash by collapsing the div so you can't click to enable it, etc.). I have a feeling there may be some other extensions/scripts out there, not listed above, that are better. Ultimately I'd like to find a flash blocker that works as well as ClickToFlash does in Safari.

    Read the article

  • Delete temporary files from batch script in xp

    - by Keith Bentrup
    I'm looking for a good batch script that would quickly find and clean all the known safe temporary folders/files from Windows machines (as many Windows variants as possible), e.g. the Windows temp folder, all users' IE temp folders, etc. I'm fond of UI tools like CCleaner (over Cleanmgr.exe), but when I'm trying to clean several computers quickly and/or with minimal involvement, it would be nice to have a script. Plus, with a script I could chain several scripts together, maybe one to then fire up various antivirus and/or malware detectors. Does anyone have a good one, or can you point to a good resource?
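
    A minimal sketch of the kind of script being asked for, targeting XP-era paths (the locations are assumptions and the deletes are aggressive, so test on a scratch machine first):

        @echo off
        rem Clean the system temp folder and each profile's temp / IE cache on XP
        del /f /s /q "%WINDIR%\Temp\*.*" 2>nul
        for /d %%U in ("%SystemDrive%\Documents and Settings\*") do (
            del /f /s /q "%%U\Local Settings\Temp\*.*" 2>nul
            del /f /s /q "%%U\Local Settings\Temporary Internet Files\*.*" 2>nul
        )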

    Read the article

  • Ubuntu from console/command-line/shell

    - by Xolve
    Early Linux distros, though they required a lot of manual work, were quite good to use from the command line. If the X server didn't start, or you just wanted a shell to work in, they all supported it: the network was configured by init, sound was up and ready, and newly inserted devices were configured with their configuration placed in fstab. There were also small scripts I found on many distros that used windows under X but switched to ncurses on the console. But now all of this needs a GUI with a desktop manager (KDE, GNOME), because the new paradigms (NetworkManager, hal, etc.) require a GUI. So from just the command line you have to be root and edit config files or type long commands; it looks like they believe only geeky admins need that. Is there any way to make this easy in Ubuntu through the shell again?

    Read the article

  • rsync server side limit bandwidth/connection

    - by c2h2
    In a VOIP application, I have up to 3000 clients that rsync audio files from their Linux server daily. The server is placed at a data center (10 Mbps inbound/outbound) and also works as a VOIP SIP server running FreeSWITCH (low ping latency should be ensured). Therefore I would like to have server-side control of rsync that can:
    - Limit total outbound bandwidth.
    - Limit the total number of connections (reject clients while at the maximum number of connections and let them retry after a specific time frame).
    - OPTIONAL: list/kill individual connections.
    Normally I would use ssh + rsync + pem keys with some extra options, but the above requirements are not feasible with simple command lines. Can anyone point me in some direction or show some scripts/tools? I would also probably integrate them and release the result on GitHub. Thanks!
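
    A sketch of the server-side pieces (module name, path and numbers are assumptions): running rsync as a daemon gives a per-module connection cap, and a traffic-shaping rule can cap outbound bandwidth.

        # /etc/rsyncd.conf
        [audio]
            path = /srv/audio
            read only = yes
            max connections = 50            # further clients are refused and can retry later
            lock file = /var/run/rsyncd-audio.lock

        # Cap outbound bandwidth with a token bucket filter (values are examples)
        tc qdisc add dev eth0 root tbf rate 8mbit burst 32kbit latency 400ms

    Note that a root qdisc like this shapes all traffic on the interface, including SIP; in practice the rsync traffic would need to be classified separately so the VOIP latency requirement is not affected.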

    Read the article

  • Is there some way Linux editors can tell the programming language without the file extension?

    - by vfclists
    I am editing some scripts on Linux that don't have language file extensions, and it seems that the editors, namely vi, nano and gedit, are not applying syntax highlighting because the filenames don't use the language extensions. Are there some parameters to be passed, or some setting, that can enable them to recognize the language? Update: After some googling I realize that vim has that ability, at least to do some parsing or check the shebang at the top to determine the language. By default Ubuntu does not install the complete vim package, so after installing it, the shell files are recognized. I don't know about nano or gedit, but vi and its graphical counterpart will do.
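
    For files vim does not detect automatically, two common hints (a sketch; the filetype sh is just an example):

        " inside vim, set the filetype for the current buffer by hand
        :set filetype=sh

    or embed a modeline near the top of the script itself so vim picks it up when the file is opened:

        # vim: set filetype=sh :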

    Read the article

  • Tomcat / Railo stop responding with no error output

    - by andrewdixon
    This is going to sound very vague, and I'm sure it will be voted down for not giving enough information, but I don't really have any to give, as you will see. We have an AWS instance running Amazon Linux, Apache, Tomcat and Railo, and from time to time Tomcat/Railo simply stops responding to requests, with no errors output in the catalina.out file or any of the other log files in the Tomcat logs directory. When I issue the command to restart Tomcat/Railo, the restart script sits there for a while, then says that Tomcat has not responded so it has killed it off, and then it starts up again and everything is fine until it happens again, anything from a couple of minutes to a couple of days later. I have done my best to check other logs on the server but have found no messages at all to indicate why Tomcat/Railo has given up and stopped responding. Can anyone suggest any reason why it might be doing this and/or any other log file(s) that we could check to see what is happening? Thanks. Andrew.
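
    One diagnostic worth capturing the next time it hangs, before restarting (a sketch; the pgrep pattern is an assumption about how the Tomcat process appears in the process list): a thread dump shows what every request thread is blocked on even when nothing is written to the logs.

        PID=$(pgrep -f org.apache.catalina.startup.Bootstrap)
        kill -3 "$PID"                              # dump is appended to catalina.out
        jstack "$PID" > /tmp/tomcat-threads.txt     # alternative, requires the JDK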

    Read the article

  • What happens if I run caspol.exe multiple times?

    - by Maclovin
    Hi there! Caspol.exe is used to modify security policy for the machine policy level, the user policy level, and the enterprise policy level. What I use it for is setting up a trust between the client and an area on some server. I went through the scripts on the server and found an interesting script that sets up full trust via caspol between a client in one zone and an application on the server. That script has been running every day, for every logon, since it was implemented. Can someone tell me the consequences? I guess there are about 500 trusts between the client computer and the server, all of which point to the same thing.
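
    One way to see what has actually accumulated (a sketch, run on an affected client): listing the machine-level code groups shows whether the logon script has been adding a duplicate group each time or merely re-applying the same one.

        caspol -machine -listgroups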

    Read the article

  • if exist !SOMEPATH! not working in batch file

    - by akash
    I have a batch script in which I am using multiple if exist statements; the problem is that all the statements work except one. The following variables are set:

        SETLOCAL ENABLEDELAYEDEXPANSION
        SET basedrive=E:
        SET tfworkspace=!basedrive!\TFS
        SET envdefault=%1
        SET projenv=!envdefault!
        echo subapp=!subapp! subappservice=!subappservice!
        SET tfworkspacepath=!tfworkspace!\!releasebranch!\!app!\!subapp!
        SET tfworkspacepathservice=!tfworkspace!\!releasebranch!\!app!\!subapp!\sourcecode\build\!projenv!

    This statement works:

        if exist "!tfworkspacepath!" (robocopy "!tfworkspacepath!"\sourcecode\messagebroker\ /E /NFL /NJS /NDL /ETA "!basedir!\!messagebroker!" ) else SET /a foldererror=1

    This statement doesn't work; by "does not work" I mean that even though the path does not exist, it still tries to robocopy:

        if exist !tfworkspacepathservice! ( robocopy !tfworkspacepathservice! /E /NFL /NJS /NDL /ETA "!basedir!\!scripts!") else SET /a foldererror =!foldererror!+1

    I am new to batch writing, please guide me.
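
    A debugging sketch (assumptions, not a confirmed fix): echo the expanded path to confirm it is what you expect, and quote it in the test, since an unquoted path containing spaces, or an empty variable leaving a stray backslash, can make if exist evaluate differently than intended. The trailing backslash in the test makes it check specifically for a directory.

        echo tfworkspacepathservice=[!tfworkspacepathservice!]
        if exist "!tfworkspacepathservice!\" (
            robocopy "!tfworkspacepathservice!" "!basedir!\!scripts!" /E /NFL /NJS /NDL /ETA
        ) else (
            SET /a foldererror+=1
        )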

    Read the article

  • server performance metrics report and practicality

    - by Anjesh
    I need to prepare a web server (Apache/PHP) performance report containing important metrics like CPU usage, disk I/O and memory usage on a per-user basis. A couple of domains are hosted on the same server, and they run as separate users using FastCGI. The reason is that sometimes some hosted applications use a lot of CPU, making the server slow for the other applications (running as separate users). I am planning to develop scripts for this, as I can't seem to find any simple utilities for this purpose. The script will take snapshots of the per-user metrics at defined intervals, say every 15 minutes, and record them. Any abnormalities will be reported via email. How practical is that? It would also be interesting to know what else should be recorded.
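
    A sketch of the snapshot idea (paths and the cron interval are assumptions): aggregate CPU and memory by user from ps and append the result to a log every 15 minutes.

        #!/bin/sh
        # crontab entry: */15 * * * * /usr/local/bin/user-metrics.sh >> /var/log/user-metrics.log
        date
        ps -eo user,pcpu,pmem,rss --no-headers | awk '
            { cpu[$1] += $2; mem[$1] += $3; rss[$1] += $4 }
            END { for (u in cpu) printf "%-12s cpu=%5.1f%% mem=%5.1f%% rss=%dKB\n", u, cpu[u], mem[u], rss[u] }'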

    Read the article

  • Magento Onepage Success Conversion Tracking Design Pattern

    - by user1734954
    My intent is to track conversions through multiple channels by inserting third-party JavaScript (for example Google Analytics, Optimizely, PriceGrabber, etc.) into the footer of onepage success. I've accomplished this by adding a block to the footer reference inside of the checkout success node within local.xml, and everything works appropriately. My questions are more about efficiency and extensibility. It occurred to me that it would be better to combine all of the blocks into a single block reference and then use various methods, acting on a single call to the related models, to provide the data needed for insertion into the JavaScript for each of the conversion tracking scripts. Some examples of the common data that conversion tracking may rely on (pseudo): Order ID, Order Total, Order.LineItem.Name (foreach), and so on. Currently, for each of the scripts I've made a call to the appropriate model, passing the customer's last order id as the load value, then calling a get(), assigning the return value to a variable, and iterating through the data to match the values with the expectations of the given third-party service. All of the data should be pulled once when checkout is complete; each third-party service may expect different data in different formats. Here is an example of one of the conversion tracking template files which loads in the footer of checkout success:

        <?php
        $order = Mage::getModel('sales/order')->loadByIncrementId(Mage::getSingleton('checkout/session')->getLastRealOrderId());
        $amount = number_format($order->getGrandTotal(), 2);
        $customer = Mage::helper('customer')->getCustomer()->getData();
        ?>
        <script type="text/javascript">
        popup_email = '<?php echo($customer['email']); ?>';
        popup_order_number = '<?php echo $this->getOrderId() ?>';
        </script>
        <!-- PriceGrabber Merchant Evaluation Code -->
        <script type="text/javascript" charset="UTF-8" src="https://www.pricegrabber.com/rating_merchrevpopjs.php?retid=<something>"></script>
        <noscript><a href="http://www.pricegrabber.com/rating_merchrev.php?retid=<something>" target=_blank>
        <img src="https://images.pricegrabber.com/images/mr_noprize.jpg" border="0" width="272" height="238" alt="Merchant Evaluation"></a></noscript>
        <!-- End PriceGrabber Code -->

    Having just a single piece of code like this is not that big of a deal, but we are doing similar things with a number of different third-party services. PriceGrabber is one of the simpler examples; a more sophisticated tracking service expects a comma-separated list of all of the product names, ids, prices, categories, order id, etc. I would like to make it all more manageable, so my idea is to do the following:
    - combine all of the template files into a single file
    - develop a helper class or library to deliver the data to the conversion template
    Goals include extensibility, minimal model calls and minimal method calls. The questions:
    1. Is a Mage helper the best route to take?
    2. Is there any design pattern you may recommend for the "helper" class?
    3. Why would the design pattern you've chosen be best for this instance?
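
    A minimal sketch of the kind of helper being described (the module name, file path and method names are assumptions; the Magento calls themselves mirror the ones already used in the template above): load the last order once, cache it, and expose one array that every tracking template reads from.

        <?php
        // app/code/local/Mycompany/Tracking/Helper/Data.php (hypothetical module)
        class Mycompany_Tracking_Helper_Data extends Mage_Core_Helper_Abstract
        {
            protected $_order;

            // Load the last order once and reuse it for every tracking script
            public function getLastOrder()
            {
                if ($this->_order === null) {
                    $orderId = Mage::getSingleton('checkout/session')->getLastRealOrderId();
                    $this->_order = Mage::getModel('sales/order')->loadByIncrementId($orderId);
                }
                return $this->_order;
            }

            // Common values the various conversion scripts need, pulled in one pass
            public function getConversionData()
            {
                $order = $this->getLastOrder();
                $items = array();
                foreach ($order->getAllVisibleItems() as $item) {
                    $items[] = array(
                        'name'  => $item->getName(),
                        'sku'   => $item->getSku(),
                        'price' => $item->getPrice(),
                        'qty'   => $item->getQtyOrdered(),
                    );
                }
                return array(
                    'order_id' => $order->getIncrementId(),
                    'total'    => number_format($order->getGrandTotal(), 2),
                    'email'    => $order->getCustomerEmail(),
                    'items'    => $items,
                );
            }
        }

    Each template would then call the helper once for the shared array and format it however its third-party service expects.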

    Read the article

  • Managing service passwords with Puppet

    - by Jeff Ferland
    I'm setting up my Bacula configuration in Puppet. One thing I want to do is ensure that each password field is different. My current thought is to hash the hostname with a secret value, which would ensure that each file daemon has a unique password, and that password can be written to both the director configuration and the file server. I definitely don't want to use one universal password, as that would permit anybody who might compromise one machine to get access to any machine through Bacula. Is there another way to do this other than using a hash function to generate the passwords? Clarification: this is NOT about user accounts for services. This is about the authentication tokens (to use another term) in the client / server files. Example snippet:

        Director {                             # define myself
          Name = <%= hostname %>-dir
          QueryFile = "/etc/bacula/scripts/query.sql"
          WorkingDirectory = "/var/lib/bacula"
          PidDirectory = "/var/run/bacula"
          Maximum Concurrent Jobs = 3
          Password = "<%= somePasswordFunction %>"     # Console password
          Messages = Daemon
        }
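
    A sketch of the hostname-plus-secret idea expressed directly in the ERB template (the secret string and how it is stored are assumptions; in practice it would live outside version control):

        Password = "<%= require 'digest/sha1'; Digest::SHA1.hexdigest(hostname + 'some-long-site-secret') %>"

    Because the same expression can be rendered in both the director template and each client's file-daemon template, the two sides always agree on the token without any per-host password ever being stored.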

    Read the article

  • Changing wallpaper depending on time of day via script or batch file?

    - by Patrick
    I want to have two different wallpapers that change according to the time of day: the day one shown from 6:00 until 22:00, and the night one from 22:00 until 6:00. I didn't find a program that can do this reliably after standby, so I thought it should be easy to achieve with the Task Scheduler running a script. Now the question is not only how to write such a script, but also whether the time check should live in the script or in the Task Scheduler triggers; I'm not sure what would work better with long periods of the PC being in standby. I tried a few scripts from similar questions already and hoped I could modify them to my needs, but they didn't work at all. Anyone able to help me? TIA.
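
    A sketch of a script that does the time check itself, so the Task Scheduler only needs simple triggers (at 6:00, at 22:00, and on resume/unlock); the wallpaper paths are assumptions.

        # set-wallpaper.ps1
        $hour = (Get-Date).Hour
        $wallpaper = if ($hour -ge 6 -and $hour -lt 22) { 'C:\Wallpapers\day.jpg' } else { 'C:\Wallpapers\night.jpg' }

        # Apply it immediately via SystemParametersInfo (20 = SPI_SETDESKWALLPAPER,
        # 3 = SPIF_UPDATEINIFILE | SPIF_SENDCHANGE)
        $sig = '[DllImport("user32.dll", SetLastError = true)] public static extern bool SystemParametersInfo(int uAction, int uParam, string lpvParam, int fuWinIni);'
        Add-Type -MemberDefinition $sig -Name NativeMethods -Namespace Wallpaper
        [Wallpaper.NativeMethods]::SystemParametersInfo(20, 0, $wallpaper, 3) | Out-Null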

    Read the article

  • Change Win7 taskbar position (overriding GPO, Registry Editor, Admin. Rights)

    - by diegocavazos53
    I run the computer center of my faculty, and the problem is that users manage to change the Windows 7 taskbar position. I don't really know how they do this, since I have applied many group policies that are specific to the taskbar (like locking it). I have also disallowed users from entering new registry keys or executing the command prompt (or employing scripts). They have regular user rights, and many Windows 7 tweaking programs need administrator rights to make changes to the GUI. So in other words: the taskbar is locked, there is a policy that sets its position to the bottom of the screen, users can't see the Control Panel, can't add registry keys, can't use the command prompt, and don't have admin rights. How do they keep moving the taskbar to the top of the screen? Any ideas would be greatly appreciated. Thank you.

    Read the article

  • Mac failing (failed?) hard drive - is all hope lost?

    - by Daniel
    It's a 500 GB Seagate laptop hard drive that came with my MacBook Pro, Apple partition format. Already replaced, and I now have it external, connected via a SATA/USB adapter. I'm trying to get just a few files that I worked on while out of town when it crashed (and thus did not have my Time Machine backup drive). The drive will not mount, but OS X Disk Utility detects it and can read the capacity, model number, and even the name of the partition, which leads me to believe all hope may not be lost. Failed attempts so far:
    - Disk Utility verify+repair says the drive cannot be repaired and that I should back up immediately (lovely)
    - DiskWarrior says it cannot rebuild the directory due to hardware failure
    - Data Rescue quick & deep scans immediately failed
    - PhotoRec says "error reading sector" for every sector (at least for the few minutes I let it run before closing it to explore other options)
    What else can I try here? Again, I'm just looking for a few small files (Python scripts, to be specific) - not a full recovery.

    Read the article

  • how to set global PATH on OS X?

    - by lajos
    I'd like to append to the global PATH variable on OS X so that all user shells and GUI applications get the same PATH environment. I know I can append to the path in shell startup scripts, but those settings are not inherited by GUI applications. The only way I found so far is to redefine the PATH environment variable in /etc/launchd.conf: setenv PATH /usr/bin:/bin:/usr/sbin:/sbin:/my/path. I couldn't figure out a way to actually append to PATH in launchd.conf. I'm a bit worried about this method, but so far this is the only thing that works. Does anyone know of a better way?
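
    For shells, a common alternative (a sketch; /my/path is the example directory from the question): path_helper assembles PATH from /etc/paths plus every file in /etc/paths.d, so dropping a one-line file there appends a directory for all login shells without redefining the base PATH.

        echo "/my/path" | sudo tee /etc/paths.d/mypath

    Note this covers shells started via login; GUI applications launched by launchd still take their environment from launchd, so the launchd.conf approach may remain necessary for them.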

    Read the article

  • Locked out by changing file permissions

    - by Valeriy
    I just locked my root account (and all other accounts, if it matters) completely out of RHEL 5.4 by changing the permissions on every file to 400. Now I get "Permission denied" on any command that I try to run, including chmod itself. Any idea on how to recover? The only access I have to the server is via terminal or SSH. (If anyone cares how it happened: I was running a hardening script, and one of the lines was supposed to change permissions on some config files in the /etc directory. It had a couple of variables that had not been set, so the command essentially evaluated to chmod -R 0400 /*. Ouch! This is surely a great lesson on checking scripts even more carefully in the future, but what can I do now?)
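
    One recovery path worth considering (a sketch with strong assumptions: the machine can be booted from RHEL rescue media with the root filesystem mounted and chrooted, and only package-owned files get fixed this way; locally created files and scripts still need manual attention):

        # from the rescue environment, after chroot /mnt/sysimage
        for p in $(rpm -qa); do
            rpm --setperms "$p"
            rpm --setugids "$p"
        done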

    Read the article

  • My PowerShell script won't save a file when run using Task Scheduler; do I need to specify a specific argument?

    - by EGr
    I have a script that downloads a temporary Excel file, copies parts of it to a new file, and saves it to a specific location on the network. The problem I'm having is that the new file is never created/saved. If I run the script locally (through cmd.exe, PowerShell, or PowerShell ISE), it WILL save the file locally or to the network. If I try running the script via a schedule or on-demand via Task Scheduler, the temporary file is created, but the final document is never created or saved. Is there a specific argument I need to pass, or anything I could be doing wrong? This is the command I'm currently using: powershell.exe -file C:\path\to\my\powershell\script\thescript.ps1. Since it uses environment variables, and other variables relative to the script's position, I also set "Start in" to C:\path\to\my\powershell\script\
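
    A couple of things worth ruling out (assumptions, not a confirmed fix): the scheduled task runs without your interactive profile and possibly under a different execution policy, and its account needs its own rights to the network share. A fuller command line for the task action might look like:

        powershell.exe -NoProfile -ExecutionPolicy Bypass -File "C:\path\to\my\powershell\script\thescript.ps1"

    Running the task as a named account with "Run whether user is logged on or not", rather than as SYSTEM, also matters when the script writes to a network location, since SYSTEM generally has no rights on remote shares.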

    Read the article

  • Control scheduled tasks execution

    - by SJuan76
    We are a small shop. I am mainly a programmer, but since I am the only one willing to risk managing our servers, the task has fallen to me (yet it is still a secondary function, so I cannot give it too much time). Over the course of years we have needed to create a decent number of .bat scripts that run as scheduled tasks on our servers (dump DB servers, SVN servers, copy files, etc.). Manually checking that every one of them has run OK is a time-consuming task. I could get them to send an email on completion, but then I would get swarmed by lots of emails each morning. If I set them up to only e-mail on failure, I might miss the instances where the error causes the task to abort (or even not to start). Are there other alternatives? We are currently using Windows 2003 R2, but we are thinking of adding some Linux servers soon, so a cross-platform solution would be best.
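
    One low-tech pattern that covers the "silent failure" case (a sketch; paths and job names are assumptions): have every wrapper append a timestamped result line to one shared log, so a single check — or a small script that alerts when an expected entry is missing or non-zero — catches both jobs that failed and jobs that never started.

        @echo off
        rem wrapper for one job; schedule this instead of the job itself
        call C:\jobs\dump_db.bat
        echo %date% %time% dump_db exit=%errorlevel% >> \\fileserver\joblogs\results.log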

    Read the article

  • Log with iptables which user is delivering email to port 25

    - by Maus
    Because we got blacklisted on CBL, I set up the following firewall rules with iptables:

        #!/bin/bash
        iptables -A OUTPUT -d 127.0.0.1 -p tcp -m tcp --dport 25 -j ACCEPT
        iptables -A OUTPUT -p tcp -m tcp --dport 25 -m owner --gid-owner mail -j ACCEPT
        iptables -A OUTPUT -p tcp -m tcp --dport 25 -m owner --uid-owner root -j ACCEPT
        iptables -A OUTPUT -p tcp -m tcp --dport 25 -m owner --uid-owner Debian-exim -j ACCEPT
        iptables -A OUTPUT -p tcp -m limit --limit 15/minute -m tcp --dport 25 -j LOG --log-prefix "LOCAL_DROPPED_SPAM"
        iptables -A OUTPUT -p tcp -m tcp --dport 25 -j REJECT --reject-with icmp-port-unreachable

    I'm not able to connect to port 25 from localhost as any user other than root or a mail group member, so it seems to work. Still, some questions remain:
    - How effective do you rate this rule set at preventing spam coming from bad PHP scripts hosted on the server?
    - Is there a way to block ports 25 and 587 within the same statement?
    - Is the usage of /usr/sbin/sendmail also limited or blocked by this rule set?
    - Is there a way to log the username of all other attempts that try to deliver to port 25?
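
    A sketch for two of those points (rule details are assumptions to adapt to the existing set): the multiport match covers 25 and 587 in one rule, and the LOG target's --log-uid option records the UID of the local process that generated the packet, which identifies the offending user or script.

        iptables -A OUTPUT -p tcp -m multiport --dports 25,587 -m limit --limit 15/minute \
            -j LOG --log-prefix "LOCAL_DROPPED_SPAM " --log-uid
        iptables -A OUTPUT -p tcp -m multiport --dports 25,587 -j REJECT --reject-with icmp-port-unreachable

    As for /usr/sbin/sendmail: on a typical Debian exim setup it is not blocked by these rules, because the MTA (running as root or Debian-exim) is what ultimately opens the port 25 connection, and those owners are accepted above.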

    Read the article

  • Script to Add an IMAP account to Outlook 2010

    - by francisswest
    I'm looking for a script that allows me to add an IMAP account to an already existing email profile. All of our users have an Exchange account as their main email account. We have several shared IMAP accounts that these users need access to. Rather than go to each machine and add each account they need access to, over and over, I'm hoping there is a script of some nature where I can plug in the needed info (username, password, server, security settings, etc.). I then plan to run said script remotely on each machine using PowerShell. I have found a couple of scripts in my searches, but nothing recent, and the majority of them revolve around creating profiles, not adding accounts to an already existing profile. Thoughts?

    Read the article

  • I'd like to archive files from Ubuntu to Windows between two computers on a shared home network

    - by Wabbitseason
    I have an old laptop running Ubuntu 9.10 which I use as a LAMP environment for web development, and I have a comfortable, powerful desktop computer with Windows 7 installed on it. These two are connected to a home router so both can access the internet. I have been able to set up Samba so I can mount my Apache home directory, making it accessible from Windows and mapped as a network drive. What I'd like to do is access some Windows folders from Linux so I could automatically create backups (with cron scripts) of my work to physically different locations on the Windows box. Perhaps at a later time I'd set up a local Subversion repository, but I'd love to keep backups of that on the Windows drives too. Using Ubuntu's Places/Network menu I can see my desktop, but I'm unable to log in to it despite having created the correct username and password on Windows. All I get is the following error message: "Unable to mount location. Failed to retrieve share list from server." What could be misconfigured?
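
    For the cron-driven backups, mounting the Windows share directly with CIFS is often simpler than going through the GNOME network browser; a sketch (host name, share name and credentials are placeholders, and on Ubuntu 9.10 the mount helper lives in the smbfs package):

        sudo apt-get install smbfs
        sudo mkdir -p /mnt/win-backup
        sudo mount -t cifs //DESKTOP-PC/Backups /mnt/win-backup -o username=winuser,password=secret,iocharset=utf8

    Once mounted (or listed in /etc/fstab), a cron job can rsync or cp the work directory into /mnt/win-backup like any local path.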

    Read the article

  • nginx: Disallow Access to a Folder, except some subfolders

    - by user68202
    How is it possible to deny access to a folder, but exempt some of its subfolders from the deny? I tried something like this (in this order):

        # this subfolder shouldn't be denied, and PHP scripts inside should be executable
        location ~ /data/public {
            allow all;
        }

        # this folder contains many subfolders that should be denied from public access
        location ~ /data {
            deny all;
            return 404;
        }

    ... which doesn't work correctly. Files inside the /data/public folder are accessible (all others in /data are denied, as they should be), but PHP files are no longer executed in the /data/public folder (if I don't add these restrictions, the PHP files are executable). What is wrong? How can I do this correctly? I think there's a better way to do it. It would be very nice if anyone could help me with this :).
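
    A likely explanation and a sketch of one fix (the fastcgi_pass value is an assumption; use whatever the existing PHP location block passes to): regex locations are matched in config order and only the first match applies, so requests for /data/public/*.php stop at the "allow all" block and never reach the generic PHP handler. Adding a PHP-aware location for the public folder ahead of the other two restores execution:

        location ~ ^/data/public/.+\.php$ {
            include fastcgi_params;
            fastcgi_param SCRIPT_FILENAME $document_root$fastcgi_script_name;
            fastcgi_pass unix:/var/run/php5-fpm.sock;   # assumption: match your existing PHP handler
        }

        location ~ ^/data/public/ {
            allow all;
        }

        location ~ ^/data/ {
            deny all;
            return 404;
        }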

    Read the article

  • Own website fails to load first time

    - by AmazingDreams
    I have a website running on a VPS, and every time I first try to load the website, the connection times out. If I press "try again", it loads directly. I'm not sure whether this is a DNS issue or a server issue. As far as I know, everything is set up correctly. It has also been doing this from the moment I got this server and set up my domain name, and that's about two to three months ago. You may take a look here: http://www.wegotcha.nl/ As you can see, at this moment it's just an image; there are no scripts running in the background or anything. The only error Apache gives me is that favicon.ico cannot be found. It's an Apache web server running on Ubuntu 12.04.1 (newest version), and I update all packages almost every day (apt-get update && apt-get upgrade). I am merely an amateur in the area of web servers, so any help will be appreciated. :)
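
    One way to separate the DNS question from the server question (a sketch to run from an outside machine) is to time the individual phases of a request: a large name-lookup time on the first attempt that shrinks on the retry points at DNS, while a slow connect or total time points at the server or network.

        dig www.wegotcha.nl
        curl -o /dev/null -s -w 'dns=%{time_namelookup}s connect=%{time_connect}s total=%{time_total}s\n' http://www.wegotcha.nl/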

    Read the article
