Search Results

Search found 9128 results on 366 pages for 'execute immediate'.

Page 12 of 366

  • Task Scheduler not able to execute .vbs scripts successfully

    - by Django Reinhardt
    Apologies if this has a really obvious answer! We have several daily tasks we run via a .vbs script on our server (through the Task Scheduler), and for months it has been fine, but recently we've hit a problem. The .vbs scripts stopped successfully executing (always timing out)... but could still be executed manually with no problems(!). Not knowing any good reason why the Task Scheduler should start having problems, we thought we'd try a little "creative thinking" and run the .vbs another way: via a .bat file executed by the Task Scheduler. Again we hit weird issues, but with a little more debugging information this time around. The .bat file run by Task Scheduler is nothing more than:

        CScript "C:\location\script.vbs" > Log.txt

    But after an attempt to run it, the Task Scheduler fails with the following error:

        0x1: An incorrect function was called or an unknown function was called.

    The Log.txt (as output from the .bat file above) says:

        CScript Error: Initialization of the Windows Script Host failed. (Not enough storage is available to process this command.)

    But get this: the .bat file executes perfectly (vbs script and all) if it's run with a double click! There's only a problem when it's run by Task Scheduler. What the hell? We're running Windows Server 2008 R2 (x64), and yes, the Task Scheduler's results are the same whether the user is logged in or not. Also, the user that can run the scripts successfully manually is the same user that runs the scripts in Task Scheduler. Thanks for any help with this weird problem!

    Read the article

  • cron doesn't execute its commands

    - by Silvio Keller
    I created my own small server with Debian. Last night I updated it. It produced an error while generating the initrd and it didn't boot. Today I booted from another filesystem and did dpkg --configure -a with chroot. I also checked the filesystem. Now everything should be OK, but cron doesn't work :-( It is the same /etc/crontab file, but it doesn't work. I reinstalled cron and tried many things. Is there a way to see cron's log? I only read about rsyslog, but I have not installed rsyslog, because the server is based on a minimal system (FreeAgent Dockstar). Does anyone have an idea? Best regards, Silvio Keller

    Update: There is no file /var/log/syslog, and dpkg -l | grep syslog gives me no output, so I think syslog is not installed. It is only a minimal system. cron -l gives:

        cron: can't lock /var/run/crond.pid, otherpid may be 687: Resource temporarily unavailable

    So I stopped cron with /etc/init.d/cron stop and executed cron -l again; this gives no output. At this moment I tried to start cron with /etc/init.d/cron start:

        Starting periodic command scheduler: cron failed!

    But there's no additional error info... I see there's now a process called cron -l running in the background. If I stop it, /etc/init.d/cron start works:

        Starting periodic command scheduler: cron.

    I used the crontab file /etc/crontab; this always worked for me, until I updated my kernel and the initrd. The file's content is:

        SHELL=/bin/sh
        PATH=/usr/local/sbin:/usr/local/bin:/sbin:/bin:/usr/sbin:/usr/bin
        # m h dom mon dow user command
        17 * * * *   root  cd / && run-parts --report /etc/cron.hourly
        25 6 * * *   root  test -x /usr/sbin/anacron || ( cd / && run-parts --report /etc/cron.daily )
        47 6 * * 7   root  test -x /usr/sbin/anacron || ( cd / && run-parts --report /etc/cron.weekly )
        52 6 1 * *   root  test -x /usr/sbin/anacron || ( cd / && run-parts --report /etc/cron.monthly )
        00 5 * * *   root  dummy
        23 45 * * 7  root  dummy
        00 * * * *   root  dummy
        */1 * * * *  root  dummy
        00 1 * * *   root  dummy
        00 4 * * *   root  dummy
        */5 * * * *  root  dummy
        #00 */10 * * * root dummy
        01 0 * * *   root  dummy
        00 5 * * *   root  dummy
        00 4 * * *   root  dummy
        #

    If I start crontab -e it creates a new file /tmp/crontab.vn87tv/crontab, which is unfortunately on a tmpfs and which also doesn't work. Thanks & best regards
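
    For illustration, one low-footprint way to see what cron is doing on a minimal system without syslog is to redirect a job's output straight to a file from the crontab line itself; a minimal sketch, assuming a hypothetical /usr/local/bin/dummy and a writable /var/log:

        # hypothetical test entry in /etc/crontab: capture the job's stdout and stderr
        */5 * * * * root /usr/local/bin/dummy >> /var/log/cron-dummy.log 2>&1

    If the log file never appears, cron is not running the job at all; if it appears but contains errors, the job itself is failing.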

    Read the article

  • Redmine with Apache 2 + Passenger nightmare --- site is up and available, but Redmine doesn't execute

    - by CptSupermrkt
    I was determined to figure this out myself, but I've been at it for a total of more than 10 hours, and I just can't figure this out. First, let me detail my environment (which I cannot change):

        Server version: Apache/2.2.15 (Unix)
        Ruby version: ruby 1.9.3p448
        Rails version: Rails 4.0.1
        Passenger version: Phusion Passenger version 4.0.5
        Redmine version: 2.3.3

    I have followed the Redmine instructions all the way through the test webserver to check that installation was successful, with this command:

        ruby script/rails server webrick -e production

    The roadblock I cannot overcome is getting Apache and Passenger to interpret and properly serve Redmine. I have searched pretty much every possible link within the first 10 pages or so of Google results. Everywhere I go I come across conflicting/contradicting/outdated information. We have a "weird" setup with Apache (which I inherited and cannot change). Redmine needs to be served through SSL, but Apache already has another website it's serving through SSL, called Twiki. By "weird", what I mean is that our file structure is entirely different from all the tutorials out there on this version of Apache, which have directories like "available-sites" and such. Here are the abbreviated versions of some of our config files.

    /etc/httpd/conf/httpd.conf (the global configuration file --- note that NO VirtualHost is defined here):

        ServerRoot "/etc/httpd"
        ...
        LoadModule passenger_module /usr/local/pkg/ruby/1.9.3-p448/lib/ruby/gems/1.9.1/gems/passenger-4.0.5/libout/apache2/mod_passenger.so
        PassengerRoot /usr/local/pkg/ruby/1.9.3-p448/lib/ruby/gems/1.9.1/gems/passenger-4.0.5
        PassengerDefaultRuby /usr/local/pkg/ruby/1.9.3-p448/bin/ruby
        Include conf.d/*.conf
        ...
        User apache
        Group apache
        ...
        DocumentRoot "/var/www/html"

    So just to clarify, the above httpd.conf file does NOT have a VirtualHost section.

    /etc/httpd/conf.d/ssl.conf (defines the VirtualHost for SSL):

        Listen 443
        <VirtualHost _default_:443>
            SSLEngine on
            ...
            SSLCertificateFile /etc/pki/tls/certs/localhost.crt
        </VirtualHost>

    /etc/httpd/conf.d/twiki.conf (this works just fine --- note this does NOT define a VirtualHost):

        ScriptAlias /twiki/bin/ "/var/www/twiki/bin/"
        Alias /twiki/ "/var/www/twiki/"
        <Directory "/var/www/twiki/bin">
            AllowOverride None
            Order Deny,Allow
            Deny from all
            AuthType Basic
            AuthName "our team"
            AuthBasicProvider ldap
            ...a lot of ldap and authorization stuff
            Options ExecCGI FollowSymLinks
            SetHandler cgi-script
        </Directory>

    /etc/httpd/conf.d/redmine.conf:

        Alias /redmine/ "/var/www/redmine/public/"
        <Directory "/var/www/redmine/public">
            Options Indexes ExecCGI FollowSymLinks
            Order allow,deny
            Allow from all
            AllowOverride all
        </Directory>

    The amazing thing is that this doesn't completely NOT work: I can successfully open up https://someserver/redmine/ with SSL, and the https://someserver/twiki/ site remains unaffected. This tells me that it IS possible to have two separate sites up with one SSL configuration, so I don't think that's the problem. The problem is that it opens up to the file index. I can navigate around my Redmine file structure, but no code ever gets executed. For example, there is a file included with Redmine called dispatch.fcgi in the public folder. https://someserver/redmine/dispatch.fcgi opens, but just as plain text code in the browser. As I understand it, in the case of using Passenger, CGI and FastCGI stuff is irrelevant/unused.

    Read the article

  • Task Scheduler not able to execute .vbs successfully

    - by Django Reinhardt
    Hi there, got this weird problem, which will hopefully have an obvious solution for some enlightened soul: We have several daily tasks we run via a .vbs script on our server (through the Task Scheduler), and for months it has been fine, but recently we've hit a problem. The .vbs script stopped successfully executing... but oddly it worked fine when run manually! The error given in these circumstances was always "Timeout". We thought we'd try a little creative thinking and run the .vbs another way: via a .bat file. Again we hit weird issues, but with a little more debugging information this time around. The .bat file is nothing more than:

        CScript "C:\location\script.vbs" > Log.txt

    But the Task Scheduler fails with the following error:

        0x1: An incorrect function was called or an unknown function was called.

    The Log.txt file says:

        CScript Error: Initialization of the Windows Script Host failed. (Not enough storage is available to process this command.)

    But get this: the .bat file executes perfectly (vbs script and all) if it's executed with a double click! There's only a problem when it's run by Task Scheduler. What the hell? We're running Windows Server 2008 R2 (x64), and yes, the Task Scheduler's results are the same whether the user is logged in or not. Also, the user that can run the scripts successfully manually is the same user that runs the scripts in Task Scheduler. Thanks for any help with this weird problem!

    Read the article

  • Is it possible for root to execute a command as non-root

    - by adnan kamili
    I am the root user, and suppose I want to run an application as another user. Is it possible without switching to that user? Something like:

        # google-chrome user=abc

    I am actually executing a CLI program as a non-root user. I have set the sticky bit and I am using setuid, so the program runs with root privileges. Now I am using system() within the program to invoke a GUI app, but I don't want to run it as root, so I want to temporarily drop root privileges only for that call.
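
    For illustration, from a root shell the usual way to run a single command as another user is sudo -u or su -c; a minimal sketch, assuming a hypothetical user abc:

        # run one command as user abc without switching to an interactive session
        sudo -u abc google-chrome
        # or, if sudo is not available:
        su abc -c 'google-chrome'

    The system() part of the question is a separate concern: inside the C program the effective UID would need to be dropped (for example with seteuid()) before the system() call and restored afterwards.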

    Read the article

  • Can IIS7 execute php files?

    - by Roman
    I have IIS7 on my computer. It can display HTML files, but if I put a PHP program in the corresponding folder and then try to open it with the Chrome browser, I do not see the expected page. Instead Chrome "downloads" this file. Is it because IIS7 cannot display PHP files?

    Read the article

  • procmail doesn't execute php script

    - by Phliplip
    Hi, I have set up a Kannel SMS gateway on my FreeBSD 7.2 box - the service works great. I'm now trying to set up an email2sms feature. For this I have created a system user called kannel, and all mails are forwarded to this user. In the home dir of kannel I have the following files:

        -rw-r--r--  1 kannel  kannel    81B 17 jan 09:50 .procmailrc
        lrwxr-x---  1 root    kannel    58B 14 jan 13:24 email2sms.php@ -> some-what-some-where
        -rw-rw-rw-  1 root    kannel   5,8K 17 jan 09:52 log.email2sms
        -rw-------  1 kannel  kannel   1,3K 17 jan 09:50 procmail.log
        -rw-r-----  1 root    kannel   606B 14 jan 13:28 rawmail.txt

    The file email2sms.php is a symlink to a PHP script (a Zend Framework application) that takes the email from STDIN and uses Zend Framework to parse the mail into an object. It then does an HTTP request to the SMS gateway. The PHP script works. Content of .procmailrc:

        LOGFILE=$HOME/procmail.log
        VERBOSE=yes
        :0
        | php email2sms.php >> log.email2sms

    From the last sent email I have this in procmail.log:

        procmail: [97744] Mon Jan 17 09:50:40 2011
        procmail: [97744] Mon Jan 17 09:50:40 2011
        procmail: Assigning "LASTFOLDER= php email2sms.php >> log.email2sms"
        procmail: Executing " php email2sms.php >> log.email2sms"
        procmail: Notified comsat: "kannel@:/home/user/kannel/ php email2sms.php >> log.email2sms"
        From [email protected] Mon Jan 17 09:50:40 2011
         Subject: asdf as
          Folder: php email2sms.php >> log.email2sms          2600

    But there is no new output to log.email2sms, and the script should output the subject of the email. If I sudo as the kannel user and pipe a file with a raw email to the script, it executes just fine:

        [root@webserver /home/user/kannel]# sudo -u kannel cat rawmail.txt | php email2sms.php >> log.email2sms

    And the command outputs to log.email2sms as desired. Any ideas, guys?
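
    For illustration, one way to reproduce what procmail itself does with the message (rather than invoking the PHP script directly) is to feed the raw mail through procmail with the same rcfile; a minimal sketch, assuming rawmail.txt is a complete raw message:

        # replay delivery through procmail as the kannel user, using the same recipe
        sudo -u kannel procmail /home/user/kannel/.procmailrc < /home/user/kannel/rawmail.txt

    With VERBOSE=yes this logs each step to procmail.log, which helps show whether the recipe or the environment (PATH, working directory for the relative "php email2sms.php") differs from the manual test.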

    Read the article

  • SQL server agent job to execute SSIS package fails, package succeeds if run manually

    - by growse
    I've got an SSIS package installed on a SQL server (SQL Server 2012). It's fairly simple and just fetches data from a remote data source and adds it into a local table. The remote connection string is using SQL Server authentication, while the local connection is using Windows auth. The remote connection password is protected, and the package was imported setting the protection level to "Rely on server storage and roles for access control".

    If I run the SSIS package manually, it works. If I run it from the command line using dtexec, it works. If I use runas to switch to the domain account that the SQL Server Agent is running under, and then run the package using dtexec, it works. If I create a SQL Agent job with a single step to run the package, it fails, providing very little detail as to what's going on. I'm guessing it's not able to get the password to log into the remote SQL server, because it fails very quickly. Also, if I tick 'log to table' and view the resulting file, I get the following:

        Description: ADO NET Source has failed to acquire the connection {0D8F2CD4-A763-4AEB-8B52-B8FAE0621ED3} with the following error message: "Login failed for user 'username'.".

    If I try to add the password to the connection string manually under data sources in the job step dialog, it refuses to save it, always seeming to remove the 'password' bit of the connection string. I thought that SQL Server Agent jobs always ran under the context of the account which the SQL Server Agent is running under. This account is a sysadmin on the local SQL server, and the package works using dtexec under that account, so why would it fail when trying to run as an agent job?

    Read the article

  • Apache suEXEC: execute scripts on a per-user-directory basis?

    - by Blame
    I am looking for a way to run a script as different users. I don't want to hardcode the users in the config... and I found some information that it should be possible for the user to go to, let's say:

        http://localhost/~user1/myscript.cgi

    and have the script get executed as user 'user1'. Does anybody know if that is possible? If not, do I have to make a new vhost config for every user? Thanks a lot! Greets, Kodak
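
    For illustration, suEXEC's per-user-directory behaviour is tied to a compile-time userdir suffix, which can be checked from a shell; a minimal sketch (the binary's path varies by distribution):

        # print suexec's compile-time options; look for AP_USERDIR_SUFFIX (e.g. "public_html")
        /usr/sbin/suexec -V

    If a userdir suffix is listed there, suEXEC can run CGI scripts under http://localhost/~user1/ as the owner of that user directory; otherwise a per-user configuration would be needed.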

    Read the article

  • permission denied when trying to execute a binary I burned to a CD-R

    - by user16654
    On an Ubuntu Karmic machine, I burned a CD from the command prompt using:

        cdrecord -v speed=16 dev=0,1,0 /FPS.iso

    The CD now contains an executable and some files. I tested the CD by loading it onto another machine (Red Hat 5.3), and when I try to run the program I get the following message:

        bash: ./FPS1_1: Permission denied

    I can open other files like text documents (the executable also comes with shared libraries). I realized I had burned the CD as root, so I burned another one as another user, but I still got the same problem. How can I remove this permission problem, or what is the problem? P.S. the image was in / if that helps.
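
    For illustration, two quick checks from a shell on the machine that refuses to run the binary; a minimal sketch, assuming the disc is mounted at /media/cdrom (the actual mount point may differ):

        # a 'noexec' mount option blocks execution regardless of file permissions
        mount | grep cdrom
        # see what permissions the file ended up with (plain ISO9660 without Rock Ridge does not preserve the original Unix modes)
        ls -l /media/cdrom/FPS1_1
        # copying the binary off the disc and restoring the execute bit sidesteps both issues
        cp /media/cdrom/FPS1_1 /tmp/ && chmod +x /tmp/FPS1_1 && /tmp/FPS1_1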

    Read the article

  • can't execute scripts compiled with shc

    - by serilain
    I'm trying to use SHC to compile a shell script so that I can set the SUID bit on it and obfuscate what it's doing (I'm attempting to have it run as part of all new users' .bashrc). As a test, I wrote a script that's simply:

        #!/bin/bash
        env

    And compiled it using:

        shc -r -f script.sh

    However, when I try to run the resulting script by simply doing ./script.sh.x, even after setting it to 777 (just for testing purposes), I get "Operation not permitted; killed" unless I run it as sudo (which I don't want to have to do). Am I running afoul of some Ubuntu permissions that won't let me run binaries created by shc? Thanks!
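
    For illustration, the build-and-test sequence described above, written out as shell commands; a minimal sketch that only restates the question's own steps (file names as in the question):

        shc -r -f script.sh        # -r makes a relaxed, redistributable binary; produces script.sh.x and script.sh.x.c
        chmod 4755 script.sh.x     # suid bit plus normal read/execute permissions
        ./script.sh.x

    Whether the kernel honours the suid bit here also depends on the filesystem not being mounted nosuid, which is worth ruling out separately from shc itself.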

    Read the article

  • Execute remote shell commands on Windows XP Embedded

    - by BartD
    The following situation: We have Windows XP Embedded clients that have all admin shares disabled and only have read-only shares (for security reasons). What we want to do is run remote shell (DOS) commands on these machines. At first we looked at PsExec & BeyondExec applications (and all sorts of variants), but all of them rely on having at least an admin$ share, which is disabled on our systems. Telnet is not secure enough, and neither are RSHD servers. So we looked at the next obvious solution: an SSH server. We also prefer an open-source or freeware solution that is still maintained. I looked at freeSSH server for Windows, but that didn't run stable. I tried installing copSSH, WinSSH & openSSH for Windows, but none of these applications seem to work on Windows XP Embedded. The services can either not be installed or cannot be started. I don't know why; some kind of dependency that is missing. So are there any other solutions out there? I don't mind having to do some kind of agent installation locally on each system, as long as the size of the software is small enough. Can someone suggest some alternatives to what I've already mentioned? Thank you very much.

    Read the article

  • Using MRTG's threshold feature to execute a php script

    - by Dan Fried
    I've set up MRTG using the online manual and the only online tutorial I found on the subject of thresholds, and the threshold just isn't firing. In my mrtg.cfg file, the relevant lines are:

        ThreshDir: /path/to/mrtg/thresh
        ThreshMaxI[performance]: 1
        ThreshMaxO[performance]: 1
        ThreshProgI[performance]: /path/to/mrtg/scripts/alert.php
        ThreshProgO[performance]: /path/to/mrtg/scripts/alert.php

    The paths are right, because if I enter the paths wrong I get an error when executing mrtg. websitePerformance checks how long it takes to download the homepage, in milliseconds, so it should be exceeding the max every time. alert.php works fine when invoked directly from the shell, and when I point to a nonexistent script MRTG tells me the script is not executable. No error messages are being generated that I can find, and the thresh directory is always empty. Why isn't the threshold being triggered by results that are greater than 1? Anyone have any suggestions?
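
    For illustration, since the "not executable" error above suggests MRTG runs the threshold program directly rather than through an interpreter, it is worth confirming the script works when invoked by path alone; a minimal sketch (the shebang path is an assumption, adjust to the local PHP binary):

        head -1 /path/to/mrtg/scripts/alert.php    # expect a shebang such as #!/usr/bin/php
        chmod +x /path/to/mrtg/scripts/alert.php
        /path/to/mrtg/scripts/alert.php            # does it run without being prefixed with "php"?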

    Read the article

  • Execute build task in Hudson with root privileges

    - by jensendarren
    I have a build script which executes apt-get and therefore requires root privileges. What is the best way to run this script in Hudson? Currently the only solution I have found that works is to add an entry to the sudoers file for the user hudson like so:

        hudson ALL=(ALL) NOPASSWD:ALL

    However, although my build script now runs without error in Hudson, I am not entirely comfortable with this solution. Is there a better way?
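
    For illustration, one common middle ground is to keep sudo but scope the rule to the single command the build needs instead of NOPASSWD:ALL; a minimal sketch, assuming apt-get is the only privileged step (edit sudoers with visudo):

        # sudoers entry: allow the hudson user to run only apt-get as root, without a password
        hudson ALL=(root) NOPASSWD: /usr/bin/apt-get

        # the build script then invokes it through sudo
        sudo /usr/bin/apt-get install -y some-package

    This keeps the hudson account from running arbitrary commands as root, though it is still only as safe as what apt-get itself can be made to do.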

    Read the article

  • unable to properly execute binaries from PHP

    - by Lowgain
    I was building an app on a SUSE box and had a binary called create_group, for instance, which had the suid bit set and allowed my PHP app to call exec('create_group grpname'); and create a new group (there are others for users, etc). The binary was a small C program that calls setuid(0) and then runs the group-creation stuff. This worked perfectly on the SUSE box. I recently moved my project to Ubuntu and everything works fine except these binaries. I can run them from the shell and they work OK, but when I get the PHP app to run them it just does nothing. Is there anything Ubuntu would be doing differently that I'm missing?
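
    For illustration, it can help to separate "the binary fails under the web server's account" from "PHP never runs it at all"; a minimal sketch, assuming Apache on Ubuntu runs as www-data and the binary sits in the app's working directory:

        # try the exact command as the web server user
        sudo -u www-data ./create_group grpname
        # confirm the suid bit and root ownership survived the move
        ls -l create_group

    In PHP, capturing the exit code and output narrows it down further, e.g. exec('./create_group grpname 2>&1', $out, $rc); and logging $rc and $out.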

    Read the article

  • pf does not execute UDP port-specific block rule

    - by seaquest
    The traffic I want to block can be sniffed with tcpdump as below:

        19:16:22.391164 IP 95.95.95.95.2036 > 10.10.10.10.443: UDP, length 8192

    So I wanted to write a rule to block any traffic with UDP destination port 443:

        block drop quick on igb3 inet proto udp to any port 443

    The traffic does not match and does not get blocked. However, it matches and gets blocked if I write the rule as below:

        block drop quick on igb3 inet proto udp to 10.10.10.10

    Do you have any remarks? I am using pf on FreeBSD.
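
    For illustration, pf can show whether the rule was loaded as expected and whether packets are matching it; a minimal sketch:

        # show the loaded ruleset as pf parsed it (ports may be displayed by name, e.g. https for 443)
        pfctl -sr
        # verbose listing with per-rule evaluation and match counters
        pfctl -vsr

    If the counters on the block rule stay at zero while the traffic keeps flowing, the packets are not matching that rule's criteria (interface, direction, or port) as pf sees them.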

    Read the article

  • Load multiple commands to execute in AutoCAD command

    - by JNL
    I am trying to load the .arx file for a particular program once the user clicks on the customized tab in AutoCAD. The requirements are:

        1. When the user clicks on the customized tab, it loads the .arx file.
        2. It runs the program.
        3. It unloads the .arx file.

    I achieved point 2 with acedRegCmds-(); I tried getting points 1 and 3 working by using the acedArxLoad(_T("C:\Example\Example.arx")); command, but it did not work. Any help with the same would be highly appreciated. Also, just curious: should loading multiple commands in LISP work for this?

    Read the article

  • How do I allow users to execute commands via ssh without allocating a pseudo-terminal

    - by Dani El
    I need to allow users to run a limited set of commands, but not to allow them to create interactive sessions, just like GitHub does: if you try to ssh without a command, it greets you and closes the session. I can achieve this by using ForceCommand some-script, but inside some-script I then need to eval the user's input. Perhaps there is some other NoTTY-like option in sshd_config?

    UPDATE: I'm looking for a pure SSH / Bash solution, not Perl/Python/etc. hacks.
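
    For illustration, the ForceCommand approach is usually paired with the SSH_ORIGINAL_COMMAND environment variable that sshd sets for the forced script, whitelisting commands rather than eval-ing them; a minimal bash sketch (the allowed command names are placeholders):

        #!/bin/bash
        # some-script: run only a fixed set of commands, never an interactive shell
        case "$SSH_ORIGINAL_COMMAND" in
            "git-upload-pack "*|"git-receive-pack "*)
                exec $SSH_ORIGINAL_COMMAND ;;       # hand off to the allowed command (naive word splitting)
            "")
                echo "Hi! Interactive sessions are not allowed."; exit 1 ;;
            *)
                echo "Command not permitted."; exit 1 ;;
        esac

    Combining this with "PermitTTY no" in sshd_config (or the no-pty option in authorized_keys) also prevents pseudo-terminal allocation.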

    Read the article

  • Permission denied when trying to execute a binary burned to a CD-R

    - by user16654
    On an Ubuntu 9.10 (Karmic Koala) machine, I burned a CD from the command prompt using:

        cdrecord -v speed=16 dev=0,1,0 /FPS.iso

    The CD now contains an executable and some files. I tested the CD by loading it onto another machine (Red Hat 5.3) and when I try to run the program I get the following message:

        bash: ./FPS1_1: Permission denied

    I can open other files like text documents (the executable also comes with shared libraries). I realized I had burned the CD as root so I burned another one as another user, but I still have the same problem. How can I remove this permission or what is the problem? P.S. the image was in / if that helps.

    Read the article

  • Linux: Tool to monitor every process and executed command, i.e. monitor what's happening at the moment

    - by Bevor
    Hello, due to a freeze problem with my Ubuntu 10.10 (it is not isolatable), I thought about logging everything the kernel executes to a file somehow, so that the next time a freeze occurs I can see what happened last and not lose valuable information. I found acct, but this is obviously not what I'm looking for; it just logs user commands and such. I need something which logs at a much "deeper" level. The best would be some kind of script which records every interrupt. Does anybody know a tool like that?
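
    For illustration, one established way to record every program execution (deeper than acct's per-command accounting) is the kernel audit subsystem; a minimal sketch, assuming the auditd package is installed:

        # log every execve() system call on a 64-bit system, tagged for easy searching
        sudo auditctl -a always,exit -F arch=b64 -S execve -k exec_log
        # later, review what was executed, with timestamps
        sudo ausearch -k exec_log

    This captures executions rather than interrupts, and whether anything useful reaches the disk before a hard freeze is a separate question.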

    Read the article

  • EXECUTE master.dbo.xp_delete_file folder permission issue

    - by Alex
    I'm trying to run a Maintenance Cleanup Task to remove .bak files older than 2 days (simple enough). I've been trying all varieties of .bak, BAK, .*., and editing the path, but the files are still not getting removed, even though I receive a "job succeeded" log message. I'm now at the point where I believe it's a folder permission issue. How do I make sure my SA has the proper permissions to remove files from a folder? Thanks.

    Read the article

  • Execute a program by double-clicking the taskbar and/or the desktop, Windows 7

    - by JoePerkins
    It would be so useful, and not very difficult, because in fact I have Directory Opus installed (a very powerful Windows Explorer alternative) and it does exactly that, but only by double-clicking the desktop, not the taskbar. Similar options with middle-click would also be nice, such as scrolling the taskbar (maybe cycling directly through the Alt-Tab window). 7 Task Tweaker is close to this, actually, but it doesn't do what I would like.

    Read the article
