Search Results

Search found 25651 results on 1027 pages for 'shell script'.


  • scp using a password on the command line

    - by spierepf
    I am trying to write a script that will deploy a build created on my desktop machine (Windows/Cygwin) to a machine in my test environment (Linux). I would like to use scp to copy the build to the target machine. The only account on the target machine is root, and I cannot create a special user for this task. The root user is unable to log in using an ssh key (I suspect this is configured on the ssh server, but I do not know which configuration options control it). At any rate, I cannot change the configuration of the ssh server. My desktop machine uses Cygwin, and I have ssh installed. What I need is the command-line-fu that will allow me to put the password on the command line. I am aware of the dangers of having a plaintext password in a shell script, but that is not a concern here.
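    One common way to do this is sshpass, which feeds the password to scp non-interactively; a minimal sketch, assuming sshpass is available under Cygwin (it may need to be installed or built separately) and using placeholder filenames, hostname and password:

        # feed the root password to scp non-interactively (all values are placeholders)
        sshpass -p 'SuperSecret' scp -o StrictHostKeyChecking=no \
            build.tar.gz root@test-server:/opt/deploy/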

    Read the article

  • Massive 404 attack with non-existent URLs. How to prevent this?

    - by tattvamasi
    The problem is a whole load of 404 errors, as reported by Google Webmaster Tools, for pages and queries that have never existed. One of them is viewtopic.php, and I've also noticed a scary number of attempts to check whether the site is a WordPress site (wp_admin) and probes for the cPanel login. I block TRACE already, and the server is equipped with some defense against scanning/hacking. However, this doesn't seem to stop it. The referrer is, according to Google Webmaster Tools, totally.me. I have looked for a solution to stop this, because it certainly isn't good for the real users, let alone the SEO concerns. I am using the Perishable Press mini blacklist (found here), a standard referrer blocker (for porn, herbal and casino sites), and even some software to protect the site (XSS blocking, SQL injection, etc.). The server is using other measures as well, so one would assume that the site is safe (hopefully), but it isn't ending. Does anybody else have the same problem, or am I the only one seeing this? Is it what I think, i.e., some sort of attack? Is there a way to fix it, or better, prevent this useless waste of resources?

    EDIT I've never used a question to thank people for their answers before, and I hope this is acceptable. Thank you all for your insightful replies, which helped me find my way out of this. I followed everyone's suggestions and implemented the following: a honeypot; a script that logs suspect URLs hitting the custom 404 page and emails me the user agent/IP while returning a standard 404 header; and a script that rewards legitimate users on that same custom 404 page, in case they end up clicking one of those URLs. In less than 24 hours I was able to isolate some suspect IPs, all listed in Spamhaus. All the IPs logged so far belong to spam VPS hosting companies. Thank you all again; I would have accepted all answers if I could.
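    One way to do that kind of isolation without any custom code is to pull the offending client IPs straight out of the web server's access log; a rough sketch, assuming a combined-format Apache log at a placeholder path and example URL patterns:

        # count the client IPs probing for URLs that have never existed on this site
        # (log path and URL patterns below are placeholders/examples)
        grep -E 'wp-admin|wp-login|viewtopic\.php|cpanel' /var/log/httpd/access_log \
            | awk '{print $1}' | sort | uniq -c | sort -rn | head -20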

    Read the article

  • Automate new AD user's home folder creation and permission setup

    - by vn.
    I know that if we set up a base folder or a profile path in the Profile tab of an AD user, we can copy that user and the folder creation and permission setup will be automated. My problem is that not all my users have a roaming profile, and the home folder mapping is done through GPO. When I copy from these users, the home folder isn't created automatically, and I have to create it manually and change the permissions and ownership on that folder on the file server. What should I do? A script may be nice, but it would have to be run every time a new user is created, and I don't think we can link a script to AD user creation, can we? I'd like to avoid any manual steps and keep my GPO as it is. Using a W2008R2 DC with W7 client boxes. Thanks.

    Read the article

  • What DNS server to use for dynamic load-balancing of website?

    - by Marki555
    I will have 2 servers in different datacenters (different countries) and I want to use DNS load balancing, mainly for high availability of the website hosted on those 2 servers. It is just an ad-tracking site, which records each hit in a local database and returns a few lines of HTML. I want to return 2 A records each time because of DNS pinning in browsers (if one server fails, the browser will try the second A record, which it has already cached). Both servers will also act as DNS servers for redundancy. Now comes my proposed solution: I will use BIND and have both servers as masters for that zone. On each server there will be a script running which periodically tests the availability (HTTP) of both servers and removes an IP from DNS in case of failure. Now the questions :)

    1) Is BIND suitable for this solution? I think BIND performance is good and it is easy to manipulate the zone file via script. And as I will modify the zone only in case of failure/maintenance, the modifications (and thus BIND reloads) won't be frequent.
    2) I plan to use a TTL of 5 minutes. The website will have about 1000-3000 req/s, but from distinct clients (each IP makes only 1-3 requests), so I think the DNS load won't be too high. I suppose their ISPs will cache the responses for those 5 minutes. Is there any reason to lower the TTL even more?
    3) Is my master-master approach good? Or should I make one of the servers a master and the other one a slave? Right now each server can monitor both itself and the other one. If only the web service fails, both DNS nodes will notice it. If the whole server fails, then the remaining DNS node will notice it, and the failed node will not answer DNS queries anyway.
    4) Is it a big issue when one NS server does not respond to queries? If yes, I can add a third DNS server, so at any time at least 2 of them would accept queries...
    5) Should I rewrite the zone file via script, or just use dynamic DNS updates (for example via the nsupdate utility)?
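    To make question 5 concrete, a health-check script driving dynamic updates could look roughly like this minimal sketch, assuming BIND is configured to allow updates signed with a TSIG key; the zone, names, key path and IPs below are placeholders:

        #!/bin/bash
        # check the peer over HTTP and pull its A record out of the zone if it stops answering
        PEER_IP="192.0.2.20"          # placeholder peer address
        NAME="www.example.com."       # placeholder record name
        KEY="/etc/bind/ddns.key"      # placeholder TSIG key file

        if ! curl -fsS -m 5 "http://$PEER_IP/" > /dev/null; then
            nsupdate -k "$KEY" <<EOF
        server 127.0.0.1
        zone example.com
        update delete $NAME A $PEER_IP
        send
        EOF
        fi

    The reverse step (re-adding the record with "update add" once the peer recovers) follows the same pattern.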

    Read the article

  • How do I allow a (local) user to start/stop services with a scheduled task?

    - by Mulmoth
    Hi, on a Windows 2008 R2 server I have two small .cmd scripts to start/stop a certain service. They look like this:

        net start MyService

    and

        net stop MyService

    I want to execute these scripts via a scheduled task, and I thought it would be best to create a local user for this job. The user is not a member of the Administrators group, but the scripts fail with exit code 2. When I log on with this local user and try to execute these scripts on the command line, I see a message like (maybe not exactly translated from German to English):

        Error code 5: Access denied

    It doesn't matter whether I start the command line as Administrator or not. How can this local user gain the rights to do the job?

    Read the article

  • How to Read XML and Generate SQL Insert

    - by hackerkatt
    I am trying to write a VBScript to read an XML file (downloaded daily) and insert the information into an MSSQL DB. The content of the XML is a list of CDRs (Call Data Records). I need to parse the file and insert the CDRs into a table. I'm a Ruby, Perl, PHP, JavaScript, SQL, ... programmer, but I've really never written any VBScript. I've done some googling and found a number of examples on how to generate XML from a SQL query, but not the reverse. Any help/suggestions would be greatly appreciated. Thank you!
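    If a shell-based approach is acceptable instead of VBScript, xmlstarlet can turn the XML into INSERT statements directly; a rough sketch, where the <cdr>/<caller>/<callee>/<duration> element names and the cdrs table are assumptions about the file layout, not its real schema:

        # emit one INSERT per <cdr> element (element, file and table names are assumed)
        xmlstarlet sel -t -m '//cdr' \
            -o "INSERT INTO cdrs (caller, callee, duration) VALUES ('" \
            -v caller -o "', '" -v callee -o "', " -v duration -o ");" -n \
            cdrs.xml > insert_cdrs.sql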

    Read the article

  • Don't run cron job if already running

    - by webnoob
    Hi All, I know this question has been asked already, but I either didn't understand the answer or it didn't apply to me. I have a PHP script that I am calling every minute, using cPanel to set up the cron job. The nature of the script means that it could overrun just past the minute, so I need to know how to stop the next run from starting if the previous one hasn't completed. I have a VPS running CentOS 5.5 and have access to WHM and cPanel. I have never used Linux before (I only got the server yesterday), so I have no idea what I am doing and would appreciate some help if possible. If I need to provide more information please let me know (I don't know what info you would need at the moment). Thanks.
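    A common way to guarantee a single running instance is to wrap the command in flock, which simply skips the run if the previous one still holds the lock; a minimal sketch, assuming the flock utility from util-linux is available and using placeholder paths:

        # -n: don't wait for the lock, just exit if the previous run is still going
        * * * * * flock -n /tmp/my-php-job.lock php /home/user/public_html/myscript.php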

    Read the article

  • Cron job failing to back up a Postgres database

    - by user705142
    I'm unsure what's going on here: I've got a backup script which runs fine under root. It produces a 300kb database dump in the proper directory. When running it as a cron job with exactly the same command however, an empty gzip file appears with nothing in it. The cron log shows no error, just that the command has been run. This is the script:

        #! /bin/bash
        DIR="/opt/backup"
        YMD=$(date "+%Y-%m-%d")
        su -c "pg_dump -U postgres mydatabasename | gzip -6 > "$DIR/database_backup.$YMD.gz" " postgres

        # delete backup files older than 60 days
        OLD=$(find $DIR -type d -mtime +60)
        if [ -n "$OLD" ] ; then
            echo deleting old backup files: $OLD
            echo $OLD | xargs rm -rfv
        fi

    And the cron job:

        01 10 * * * root sh /opt/daily_backup_script.sh

    It produces a database_backup file, just an empty one. Anyone know what's going on here?
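    A frequent culprit when something works interactively but not from cron is cron's minimal environment (restricted PATH, no tty), so it can help to use absolute paths and capture stderr somewhere visible; a sketch of the same dump step under those assumptions (the pg_dump path and log file are placeholders):

        # run the dump with an explicit binary path and log any errors from the
        # whole pipeline, since cron won't show them otherwise
        {
            su - postgres -c "/usr/bin/pg_dump mydatabasename" \
                | gzip -6 > "$DIR/database_backup.$YMD.gz"
        } 2>> /var/log/pg_backup_cron.log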

    Read the article

  • rsnapshot stats

    - by Obscur Moirage
    I'd like to retrieve the following stats from rsnapshot: files synced, added files, modified files, deleted files. Is there a feature to retrieve these in rsnapshot, or is there another product that can do it?

    EDIT: As requested, I'll try to show that I'm not just asking what I want to do without any research. I wasn't able to locate any rsnapshot feature that does this; maybe I'm searching in the wrong direction. So I've built a not-very-pretty script, called each time before rsnapshot is run. This Perl script stores each file's MD5, in order to compare the backup file structures between rsnapshot updates. I'm pretty sure it's worthless to show that code here. I think that keeping an eye on what changes on a server, for example, is a useful feature. So I'm asking. @pauska Most of the time I try to find an answer myself, which is not the case here. Thanks
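    Since rsnapshot keeps plain directory trees, one low-tech way to get those counts is to diff the two most recent snapshots with an itemized rsync dry run; a rough sketch, assuming snapshots live under a placeholder /backup path with the usual daily.N layout:

        # compare the newest snapshot (daily.0) against the previous one (daily.1):
        # >f+++++++++ = added since daily.1, other >f lines = modified, *deleting = removed
        rsync -rin --delete /backup/daily.0/ /backup/daily.1/ | awk '
            /^>f\+\+\+/        { added++ }
            /^>f/ && !/\+\+\+/ { modified++ }
            /^\*deleting/      { deleted++ }
            END { printf "added=%d modified=%d deleted=%d\n", added+0, modified+0, deleted+0 }'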

    Read the article

  • Better logging for cronjob output

    - by Stefan Lasiewski
    I am looking for a better way to log cron jobs. Most cron jobs tend to spam email or the console, get ignored, or create yet another logfile. In this case, I have a Nagios NSCA script which sends data to a central Nagios server. This send_nsca script also prints a single status line to STDOUT, indicating success or failure.

        0 * * * * root /usr/local/nagios/sbin/nsca_check_disk

    This emails the following message to root@localhost, which is then forwarded to my team of sysadmins. Spam.

        forwarded nsca_check_disk: 1 data packet(s) sent to host successfully.

    I'm looking for a log method which:

    Doesn't spam the messages to email or the console
    Doesn't create yet another crufty logfile which requires cleanup months or years later
    Captures the log information somewhere, so it can be viewed later if desired
    Works on most Unixes
    Fits into an existing log infrastructure
    Uses common syslog conventions like 'facility'

    Some of these are third-party scripts, and don't always do logging internally.
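    One approach that ticks most of those boxes is piping the job's output into logger(1), which hands it to syslog with a tag and facility and lets the existing log infrastructure decide where it ends up; a minimal sketch:

        # send both stdout and stderr to syslog under the cron facility
        0 * * * * root /usr/local/nagios/sbin/nsca_check_disk 2>&1 | logger -t nsca_check_disk -p cron.info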

    Read the article

  • Is it okay to use a language that isn't supported by your company for some tasks?

    - by systempuntoout
    I work for a company that supports several languages: COBOL, VB6, C# and Java. I use those languages for my primary work, but I often find myself coding minor programs (e.g. scripts) in Python because I've found it to be the best tool for that type of task. For example: an analyst gives me a complex CSV file to populate some DB tables, so I would use Python to parse it and create a DB script. What's the problem? The main problem I see is that a few parts of these quick & dirty scripts are slowly gaining importance, and: my company does not support Python; they're not version controlled (I back them up in another way); my coworkers do not know Python. The analysts have even started referencing them in email ("launch the script that exports..."), so they are needed more often than I initially thought. I should add that these scripts are just utilities that are not part of the main project; they simply help to get trivial tasks done in less time. For my own small tasks they help a lot. In short, if I won the lottery or were in an accident, my coworkers would need to keep the project alive without those scripts; they would spend more time fixing CSV errors by hand, for example. Is this a common scenario? Am I doing something wrong? What should I do?

    Read the article

  • crontab still sending emails even with > /dev/null

    - by user2344668
    I have a crontab (root) that runs a script with output redirected to /dev/null, but I always get the emails whenever it runs. I only want to receive error emails.

        # Rackspace driveclient update (12pm MST)
        0 12 * * * /root/scripts/driveclient-update > /dev/null

    The only way I can get it to stop is to use > /dev/null 2>&1, but then I won't get error emails. This is happening on three different CentOS servers, two are 6.3 and one is 6.4. NOTE: I have read over and over that > /dev/null is supposed to send stdout there and prevent the email if there is nothing but stdout from the script, so it works for at least some people; I cannot figure out why it is not working on these servers. Here's an example of where /dev/null is supposed to work: http://www.alphadevx.com/a/384-Suppressing-Cron-Job-Email-Notifications
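    One likely explanation is that the script writes its progress to stderr rather than stdout. If so, a wrapper such as chronic from the moreutils package (assuming it can be installed on these CentOS boxes) gives the wanted "error mails only" behaviour, since it swallows all output unless the command exits non-zero; a sketch:

        # Rackspace driveclient update (12pm MST); mail is sent only on failure
        0 12 * * * chronic /root/scripts/driveclient-update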

    Read the article

  • Apache+FastCGI Timeout Error: "has failed to remain running for 30 seconds given 3 attempts, its restart interval has been backed off to 600 seconds"

    - by Sadjad Fouladi
    I've recently installed mod_fastcgi and Apache 2.2. I have a simple CGI script as below (test.fcgi):

        #!/bin/sh
        echo sadjad

    But when I invoke mysite.com/test.fcgi I see "Internal Server Error" after a short period of time. The error.log file shows this error message:

        [Tue Jan 31 22:23:57 2006] [warn] FastCGI: (dynamic) server "~/public_html/oaduluth/dispatch.fcgi" has failed to remain running for 30 seconds given 3 attempts, its restart interval has been backed off to 600 seconds

    This is my .htaccess file:

        AddHandler fastcgi-script .fcgi
        RewriteEngine On
        RewriteCond %{REQUEST_FILENAME} !-f
        RewriteRule ^(.*)$ django.fcgi/$1 [QSA,L]

    What could the problem be? Is it my .htaccess file?

    Read the article

  • How to configure crontab with a condition statement for checks

    - by chz
    We'd like to monitor the NAS storage mounted on a Linux box, and we only want to be notified via mail when the usage exceeds a certain number, say 80. We have only seen examples in Linux books where shell scripts are called at certain times. How do we write the crontab entry so that it only mails us if usage exceeds 80?

    The usual example is:

        2 2 * * * /home/someUser/script.sh 2>&1 | mail [email protected]

    We are looking for a solution like the one below:

        2 2 * * * if [ someNumber > "80" ] ; then /home/someUser/script.sh | mail [email protected]

    Sincerely
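    The usual pattern is to put the threshold test inside the script itself and let cron call it unconditionally, so mail only goes out when the check trips; a minimal sketch with a placeholder mount point, script name and address:

        #!/bin/bash
        # mail only when the NAS mount is more than THRESHOLD percent full
        THRESHOLD=80
        MOUNT=/mnt/nas                      # placeholder mount point
        USAGE=$(df -P "$MOUNT" | awk 'NR==2 { sub("%", "", $5); print $5 }')

        if [ "$USAGE" -gt "$THRESHOLD" ]; then
            echo "NAS usage on $MOUNT is at ${USAGE}%" \
                | mail -s "NAS usage alert" admin@example.com
        fi

    Cron then runs the check unconditionally, e.g. 2 2 * * * /home/someUser/check_nas.sh (the script name is a placeholder), and nothing is mailed while usage stays under the threshold.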

    Read the article

  • Repeat the csv header twice without "Append" (PowerShell 1.0)

    - by Mark
    I have prepared a PowerShell script to export a list of system users in CSV format. The script can output the user list with Export-Csv with a single header row (the header row at the top). However, my requirement is to repeat the header row twice in the file. This would be easy to achieve in PowerShell 3.0 with "Append" (e.g. $header | Out-File $filepath -Append), but our server environment is running PowerShell 1.0, so I cannot do that. Is there any workaround? I cannot manually add it myself. Thank you.

    Read the article

  • How can I upload many files to Cloud Files in Rackspace faster?

    - by andy kim
    I have a lot of image files, about a million in a single directory, that I want to upload to Rackspace Cloud Files in the fastest and most efficient way. The python-cloudfiles script I'm using to upload them is very slow because it uploads one file per connection, and I'd like to know about different approaches or Python script code. I think tarring the files and extracting the archive on the other side would be a better way, but Cloud Files does not support this. Does anyone know any other way?
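    One generic way to speed this up, whatever the upload tool, is to run several uploads in parallel; a rough sketch, where upload_batch.py stands in for a hypothetical wrapper around python-cloudfiles that uploads the files named on its command line (it is not part of the library):

        # feed the file list to 8 parallel workers, 50 files per invocation
        # (upload_batch.py is a placeholder for your own python-cloudfiles wrapper)
        find /path/to/images -type f -print0 \
            | xargs -0 -n 50 -P 8 python upload_batch.py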

    Read the article

  • Hack/software to easily move screen mapping of a wacom tablet?

    - by Riyaah
    I would like to move/manipulate the mapping of my Wacom tablet to my screen as easily as with Inklet: http://tenonedesign.com/inklet.php Is anyone aware of any software or hack to achieve this on Mac OS X? If not, I'm thinking about trying to write a script that centers the mapping on the current cursor position, and then binding that to a button on my tablet. Combined with a script for toggling between full-screen and part-of-screen mapping, that should be enough. My idea so far is to edit the .prefs file, since it's plain XML, and then restart the tablet driver. But any pointers relating to this would be most welcome, since my knowledge of this kind of thing is rudimentary.

    Read the article

  • Receiving a notification on a Cluster Group Failover

    - by Diego
    Hi all, I have several Windows clusters set up and I need to keep track of failovers. I'd need to receive a notification of some sort whenever a group fails over. I've seen some examples around, but I can't rely on the approach of just sending a message whenever the resources are stopped/restarted, as it would generate too many false alarms. In a few words, I need to be notified if and only if the group really fails over. I was thinking that probably the best way is monitoring the System event log, but, if possible, I'd prefer not having to write a script/program from scratch for this. Is there any script/product that already does this? Thanks.

    Read the article

  • Linux installation CD environment

    - by haw3d
    I recently made a custom Linux system for a special need of mine. It's on my HDD, but I want to create a CD to install it. A few days ago I found a live CD and created an install script for it, but my HDD died in a power failure and I can't find that live CD again. My install script is based on restoring a tar.gz backup. My requirements are: based on glibc (not uclibc), and it must recognize all my devices. Do you have any suggestions? Excuse my bad English.
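    For reference, the restore step such an install script typically performs from a live CD shell might look like this very rough sketch; the device names, mount point and backup path are all placeholders:

        # format and mount the target, unpack the tar.gz backup, then make it bootable
        mkfs.ext4 /dev/sda1
        mkdir -p /mnt/target
        mount /dev/sda1 /mnt/target
        tar -xzpf /media/backup/system-backup.tar.gz -C /mnt/target
        grub-install --boot-directory=/mnt/target/boot /dev/sda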

    Read the article

  • Using socat to exec php cli

    - by RoyHB
    There are multiple client programs that periodically connect to a port on my server and send a single line of text. When a connection to the port is made I need to start a PHP CLI script that processes the data. There may be many of the remote scripts running/connecting at more or less the same time, so I think it would be best if socat forked a process for each connection to run the script. I've gotten socat to do most of what I need, using the command:

        socat tcp-l:myport,fork exec:mypath/socatTest.php

    I can read the input on php://stdin, so all is good. The problem is that the process doesn't seem to fork, so if a second external program sends data while another is doing the same, it gets a connection refused error. Where have I gone wrong?
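    For comparison, the long-form invocation with an explicit php interpreter and the reuseaddr option (so the listener can rebind immediately after a restart) would look like the sketch below; it is a variant worth testing rather than a guaranteed fix:

        # fork a child per connection and run the script through the php binary
        socat TCP-LISTEN:myport,fork,reuseaddr EXEC:"php mypath/socatTest.php"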

    Read the article

  • Running 'dd' command at startup?

    - by Usman Ajmal
    Hi, I have set a script to run at Linux startup. The script contains the following line of code:

        dd if=/dev/sda2 of=/dev/sda5 2> result.txt

    Now, when my Linux desktop appears, result.txt contains:

        dd: opening '/dev/sda2': Permission denied

    If I prefix the dd command with sudo as:

        sudo dd if=/dev/sda2 of=/dev/sda5 2> result.txt

    result.txt contains:

        sudo: no tty present and no askpass program specified

    Is there a way I can get around this problem? What I want is to copy the 2nd partition to the 5th when a user logs in, no matter whether they are root, an admin, a desktop user or an unprivileged user. Thanks a lot as always.
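    One way around both errors is a sudoers rule (added with visudo) that allows exactly this command with no password and no tty requirement; a minimal sketch, where the users group is a placeholder for whoever logs in and the dd path may differ on your system:

        # let members of the users group run only this dd command, password-free
        Defaults:%users !requiretty
        %users ALL=(root) NOPASSWD: /bin/dd if=/dev/sda2 of=/dev/sda5

    The startup script would then call sudo /bin/dd if=/dev/sda2 of=/dev/sda5 2> result.txt as before.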

    Read the article

  • Why does the "file" command get confused on .py files?

    - by pythonic metaphor
    I have several Python modules that I've written. Randomly, I ran file on this directory, and I was really surprised by what I saw. Here's the resulting count of what it thought the files were:

         1 ASCII Java program text, with very long lines
         1 a /bin/env python script text executable
         1 a python script text executable
         2 ASCII C++ program text
         4 ASCII English text
        18 ASCII Java program text

    That's strange! Any idea what's going on, or why it seems to think Python modules are very often Java files? I'm using CentOS 5.2.

    Read the article

  • rsync assigns deny permission

    - by user773478
    Currently a script is used to copy files using rsync (version 2.6.9, protocol version 29) from Linux/Unix servers to a W2K3 server, using a very basic command such as "rsync -v source_server::share_name/file_name /cygdrive///file_name". The script then makes a copy of this downloaded file for other purposes. This is part of a larger middleware system that is being moved to new hardware on W2K8R2. The second part, making a copy of the file, does not work with the more recent rsync client version 3.0.7, protocol version 30 (which shows up as cwRsync in Add/Remove Programs). The reason is that rsync assigns special permissions to the file that include deny entries. The user (service account) which downloads the file is in the local admin group. The file can be copied elsewhere using rsync, and it can be deleted, but it cannot be opened or copied locally by the same user because the deny permission takes precedence.
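    With the 3.0.x cwRsync client, a workaround often suggested for these unwanted ACLs is to force plain permissions on the destination with --chmod; a sketch of the existing command with that option added (the destination below is a placeholder standing in for the real /cygdrive path):

        # force simple read/write permissions on the received file so no deny
        # entries are applied (destination path is a placeholder)
        rsync -v --chmod=ugo=rwX source_server::share_name/file_name /cygdrive/c/data/file_name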

    Read the article

  • Open Terminal Here, as Root (OS X)

    - by cwd
    There is a pretty awesome AppleScript called "Open Terminal Here" ( http://www.entropy.ch/software/applescript/ ) which you can add to your Finder toolbar and click when you want to launch a terminal console set to that directory. Sometimes I need to be root, so I end up starting Terminal, doing something like sudo -i, and then I have to change back to the previous directory because the sudo command lands me in /var/root. I'm using sudo -i because I like it to load things like aliases and the bash profile. The script is AppleScript, and here's the important part of how it works:

        ...
        set cmd to "cd " & quoted form of the_path & " && echo $'\\ec'"
        ...
        tell application "Terminal"
            activate
            do script with command cmd

    How do I get this to load as root?

    Read the article

  • Open application in background without losing current window focus. Fedora 17, Gnome 3

    - by Ishan
    I'm running a script in the background which loads an image with feh, depending on which application is currently in focus. However, whenever the script opens the image, window focus is lost to feh. I was able to circumvent this by using xdotool to switch back to the application that was originally in focus, but this introduces a short, annoying period during which the focus switches from feh back to the application. My question is this: is there any way to launch feh in the background such that window focus is NOT lost? System: Fedora 17, Gnome 3, Bash. Thanks a ton!
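    For reference, the xdotool workaround described above, flicker and all, amounts to something like this sketch (the image path is a placeholder):

        # remember the focused window, show the image, then hand focus straight back
        ACTIVE=$(xdotool getactivewindow)
        feh /path/to/overlay.png &
        sleep 0.2    # give feh a moment to map its window
        xdotool windowactivate "$ACTIVE"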

    Read the article
