Search Results

Search found 5695 results on 228 pages for 'logoff scripts'.

Page 122/228 | < Previous Page | 118 119 120 121 122 123 124 125 126 127 128 129  | Next Page >

  • Linux: multiple network connections - 3G/4G / Wi-Fi / LAN / etc.; how can I set a preferred network connection to use?

    - by Alex
    I've been looking at how to set up a laptop that has multiple network interfaces. The problem is that when all the connections are active (3G, Wi-Fi and LAN all connected), I would like it to default to the LAN. I would like to assign a "weight" or "priority" to each connection, so that if the LAN is unplugged it falls back to Wi-Fi, if that is in range and working, and otherwise switches to the 3G dongle. I've been looking around and I can see that the "metric" counter for route isn't being used by recent kernels. I thought that would let me set the preferred gateway/connection, but according to the man page (man route, OUTPUT section): "Metric: The 'distance' to the target (usually counted in hops). It is not used by recent kernels, but may be needed by routing daemons." So I'm confused: are there any scripts, apps or anything else that can detect the active network connections and, by way of configuration, send my default-gateway traffic through whichever interface is active and alive? (One metric-based approach is sketched after this entry.)

    Read the article
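
    Despite the caveat in that man page, the kernel does still compare metrics when several default routes exist, so one low-tech approach is to give each interface's default route a different metric and let the lowest-metric route whose interface is up win. A minimal sketch; the interface names and gateway addresses below are assumptions that have to be adapted to the actual setup:

        # preferred order: LAN, then Wi-Fi, then the 3G dongle (example values)
        ip route replace default via 192.168.1.1 dev eth0  metric 100
        ip route replace default via 192.168.1.1 dev wlan0 metric 200
        ip route replace default via 10.64.64.64 dev ppp0  metric 300

    When the LAN link goes down its route disappears and the Wi-Fi route becomes the cheapest remaining default. NetworkManager and similar tools can set such metrics automatically when a connection comes up; the commands above only illustrate the mechanism.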

  • One server, two APC UPSes on redundant power supplies: how to trigger shutdown?

    - by Falken
    I have a server racked with its redundant power supplies plugged into two APC Smart-UPS 3000 XLM units. Each UPS is connected to a different mains power source. Two instances of apcupsd are running, each one connected to its own UPS. They can both detect when their UPS is on battery, and each UPS can then trigger a shutdown of the server. The question is: how do I NOT shut down if ONLY ONE UPS runs out of battery? Note: the Smart-UPS 3000 XLM has a "Power Sync" function that lets it connect to its peer and detect its status, but when I pulled the plug on one of them, the shutdown order was sent anyway. I'm thinking about modifying the shutdown scripts to check with apcaccess whether the other UPS is down (something along the lines of the sketch after this entry). Any experience with this would be appreciated!

    Read the article
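
    The check the asker describes would live in the event scripts that apcupsd's apccontrol runs. A rough sketch, assuming the second apcupsd instance serves its network information port on localhost:3552 and that this file is the doshutdown hook of the first instance; the exit-99 convention (telling apccontrol to skip its default action) should be verified against the apcupsd documentation for the installed version:

        #!/bin/sh
        # hypothetical /etc/apcupsd/doshutdown: cancel the shutdown while the peer UPS is still online
        if apcaccess status localhost:3552 | grep -q '^STATUS.*ONLINE'; then
            exit 99    # peer UPS still has mains power, so ask apccontrol not to shut down
        fi
        exit 0         # peer is on battery or unreachable, let the shutdown proceed

    The same script, pointed at the other instance's port, would be installed for the second apcupsd as well.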

  • Missing eth0 configuration file

    - by Godric Seer
    I have two servers, both running Scientific Linux 6, on the same network. Since I want SSH access to both of them, I want to give them both static IPs so I can set up port forwarding and not worry about how my router assigns local IPs. I found that I need to edit the configuration file /etc/network-scripts/ifcfg-eth0, however that file does not exist. The network card works fine, and I am able to ssh in as long as I log into the router and find the local IP. Can I simply create my own configuration file, or did I miss some step in configuring the system that I need to complete? (A sample file is sketched after this entry.)

    Read the article
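
    On Red Hat style systems such as Scientific Linux 6 the per-interface files normally live under /etc/sysconfig/network-scripts/, and creating ifcfg-eth0 by hand is fine if the installer did not. A minimal static configuration; the addresses are placeholders that must match the local network:

        # /etc/sysconfig/network-scripts/ifcfg-eth0  (example values)
        DEVICE=eth0
        ONBOOT=yes
        BOOTPROTO=static
        IPADDR=192.168.1.50
        NETMASK=255.255.255.0
        GATEWAY=192.168.1.1

    After saving the file, service network restart (or ifdown eth0; ifup eth0) should apply it.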

  • Xubuntu: vsftpd couldn't restart

    - by Fara
    # sudo /etc/init.d/vsftpd restart
    Rather than invoking init scripts through /etc/init.d, use the service(8) utility, e.g. service vsftpd restart
    Since the script you are attempting to invoke has been converted to an Upstart job, you may also use the stop(8) and then start(8) utilities, e.g. stop vsftpd ; start vsftpd. The restart(8) utility is also available.
    vsftpd start/running, process 3237

    Then I tried this:

    # service vsftpd start
    vsftpd start/running, process 3275
    # service vsftpd stop
    stop: Unknown instance:
    # service vsftpd restart
    stop: Unknown instance:
    vsftpd start/running, process 3315
    # sudo service vsftpd restart
    stop: Unknown instance:
    vsftpd start/running, process 3358

    I can't get vsftpd restarted; whenever I try, the above happens. How do I restart it? Please advise. (Some diagnostic steps are sketched after this entry.)

    Read the article
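
    "stop: Unknown instance" usually means Upstart believes the job is not running, which often happens when the daemon starts and then exits immediately (for example because of a configuration problem such as not running in standalone mode). A few things worth checking, as a sketch rather than a definitive recipe:

        sudo status vsftpd                   # what Upstart thinks the job state is
        grep '^listen' /etc/vsftpd.conf      # standalone mode needs listen=YES (or listen_ipv6=YES)
        sudo stop vsftpd 2>/dev/null; sudo start vsftpd
        tail -n 50 /var/log/syslog           # look for why vsftpd exits right after "start/running"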

  • Fourier transform software

    - by CFP
    Hello everyone! After spending a lot of time searching for this, I thought that some SuperUser gurus might know the answer :) I'm searching for an open-source application to compute an FFT that could: (1) import a list of points from a text file (in any format, I could write conversion scripts if needed), for example 0,1; 1,2; 4,5, and (2) compute the associated discrete transform and output the list of coefficients. Ideally, it would also display the plot and the associated Fourier decomposition on the same graph to allow comparison, but this is not absolutely needed. It can be on either Windows or Linux/Unix. Can you think of a solution? (A short NumPy sketch follows this entry.) Thanks, CFP.

    Read the article
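
    One free route that covers the "read points from a text file and print the coefficients" part is a few lines of Python with NumPy (plus matplotlib if the comparison plot is wanted). A minimal sketch, assuming input such as 0,1; 1,2; 4,5 in a file and uniformly spaced samples; the file name and format handling are illustrative only:

        # fft_points.py -- read "x,y; x,y; ..." pairs and print the DFT coefficients
        import sys
        import numpy as np

        text = open(sys.argv[1]).read()                      # e.g. "0,1; 1,2; 4,5"
        pairs = [p for p in text.replace('\n', ';').split(';') if p.strip()]
        y = np.array([float(p.split(',')[1]) for p in pairs])

        coeffs = np.fft.fft(y)                               # discrete Fourier transform of the y values
        for k, c in enumerate(coeffs):
            print(k, c.real, c.imag)

    Run as: python fft_points.py points.txt. Only the y values are used; non-uniformly spaced x values would need resampling first.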

  • Can Acrobat 11 be made to do OCR using multiple CPU cores?

    - by tarcman.
    OCR processing takes time, and using multiple CPU cores would speed it up. Acrobat 10 was not a multithreaded application; how about Acrobat 11? Does 11 do OCR using multiple CPU cores by default (if available)? If not, are there any workarounds, e.g. scripting, to help make Acrobat 11 do OCR using multiple CPU cores, either through Acrobat's built-in scripting language or using external scripts that launch multiple single-threaded instances of Acrobat and direct them in parallel at parts of the processing job? Note: this question is not too localized (not limited to a specific moment in time) because (1) Adobe does not release new major Acrobat versions very often (Acrobat 10 was released two years ago) and (2) Adobe Acrobat is a widely used application.

    Read the article

  • jQuery AJAX and PHP arrays

    - by sea_1987
    Hi there, I am trying to submit some data to a PHP script; however, the PHP script expects the data to arrive in a specific format, like this example, which is a dump of the POST:

    Array ( [save] => Add to shortlist [cv_file] => Array ( [849709537] => Y [849709616] => Y [849709633] => Y ) )

    The current process is that a user selects the products they want using checkboxes and then clicks a submit button, which fires the PHP script. The HTML looks like this:

    <div class="row">
      <ul>
        <li class="drag_check ui-draggable">
          <input type="checkbox" id="inp_cv_849709537" name="cv_file[849709537]" class="cv_choice" value="Y">
        </li>
        <li class="id"><a href="/search/cv/849709537">849709537</a></li>
        <div class="disp">
          <li class="location">Huddersfield</li>
          <li class="status"> Not currently working </li>
          <li class="education">other</li>
          <li class="role"> Temporary </li>
          <li class="salary">£100,000 or more</li>
          <div class="s">&nbsp;</div>
        </div>
      </ul>
      <dl>
        <dt>Current Role</dt>
        <dd>Developer </dd>
        <dt>Sectors</dt><dt> </dt><dd> Energy &amp; Utilities, Healthcare, Hospitality &amp; Travel, Installation &amp; Maintenance, Installation &amp; Maintenance </dd>
        <dt>About Me</dt><dt> </dt><dd></dd>
      </dl>
      <div class="s"></div>
    </div>

    I now need to use AJAX instead, but I still have to send the data to PHP in the format it expects. Here is what I have so far:

    $('#addshortlist').click(function() {
        var datastring = ui.draggable.children().attr('name') + "=" + ui.draggable.children().val() + "&save=Add to shortlist";
        alert(datastring);
        $.ajax({
            type: 'POST',
            url: '/search',
            data: ui.draggable.children().attr('name') + "=" + ui.draggable.children().val() + "&save=Add to shortlist",
            success: function() {
                alert("Success" + datastring);
            },
            error: function() {
                alert("Fail" + datastring);
            }
        });
        return false;
    });

    I would really appreciate any help. (An alternative way to build the request data is sketched after this entry.)

    Read the article
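
    Since the PHP side only needs the cv_file[...] checkbox fields plus the save field, one hedged alternative (assuming the checkboxes carry the cv_choice class as in the markup above) is to let jQuery build the query string from the checked inputs instead of concatenating it by hand:

        $('#addshortlist').click(function () {
            // serialize every checked cv_file[...] checkbox, then append the submit value
            var datastring = $('input.cv_choice:checked').serialize() +
                             '&save=' + encodeURIComponent('Add to shortlist');
            $.ajax({
                type: 'POST',
                url: '/search',
                data: datastring,
                success: function () { alert('Success: ' + datastring); },
                error:   function () { alert('Fail: ' + datastring); }
            });
            return false;
        });

    jQuery's .serialize() produces name=value pairs in exactly the cv_file[849709537]=Y form that PHP turns back into a nested array.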

  • Protect me from this Perl SYN flood script [closed]

    - by Luka
    Possible duplicate: How to best defend against a "slowloris" DOS attack against an Apache web server? Like many people here, I was interested in hacking for a period of time, using Perl scripts. CSF protects me from every Perl script that can do damage, but not from this one: http://pastebin.com/CfRiSVkQ. It's a SYN flood script. When I attack my dedicated server from another dedicated server with a 100 Mbps link, CSF detects the attack and always blocks the attacker's address, but I am still flooded and the sites go down. I get an email from CSF, yet the attack keeps damaging the sites, and I have to restart httpd and CSF before the sites come back online. (Some generic SYN-flood mitigations are sketched after this entry.)

    Read the article
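
    Independently of CSF, the usual first mitigations for a SYN flood are kernel SYN cookies plus a rate limit on new SYN packets. A sketch with purely illustrative numbers (the right limits depend on legitimate traffic), and with the caveat that if the flood saturates the link itself, filtering on the server cannot help and the provider has to filter upstream:

        # enable SYN cookies so the half-open connection queue cannot be exhausted
        sysctl -w net.ipv4.tcp_syncookies=1

        # rate-limit incoming SYNs (example figures)
        iptables -A INPUT -p tcp --syn -m limit --limit 50/s --limit-burst 100 -j ACCEPT
        iptables -A INPUT -p tcp --syn -j DROP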

  • How do I configure custom routes when an interface is configured?

    - by ManicDee
    Other Superuser questions have addressed the issue of adding custom routes so that, for example, the various subnets of a corporate network are reached through one interface while the Internet is accessed through another. So, assuming that I have a script to add specific routes when en0 is configured, and a separate script to add specific routes when en1 is configured, is there some way I can trigger those scripts to run automatically when Mac OS X/Darwin starts and configures those interfaces? Back in my Linux days, it was possible to add an option in /etc/network/interfaces along the lines of:

    iface eth0 inet dhcp
        up /usr/local/sbin/eth0-routes-up

    Is there something similar for Mac OS X?

    Read the article

  • PowerShell and long-running external tools?

    - by leeand00
    I'm trying to compact an MS Access database using JetComp.exe from a PowerShell script. Here are the operative lines:

        # 4. Run JetComp
        LogWrite("Begin: Running JetComp")
        .\JETCOMP.EXE -src: $srcDB -dest: $dstDB | Out-Null   # Run this command and wait for it to finish...
        IfErrorExit("Error Compacting Database")
        LogWrite("End: Running JetComp")

    JETCOMP.EXE seems to complete long before it is actually finished, and $dstDB ends up being smaller than the compact should even make it. Initially ($srcDB) it's about 1.8 GB, and by the time the command finishes the destination is about 300,000 KB (about 0.29 GB); that's a long way off from 1.8 GB, which, when compacted manually, ends up at about 1.6 GB. Is there some sort of timeout I don't know about in PowerShell scripts? P.S. I know that when running JETCOMP.EXE manually, the system often reports it as "not responding" even though it's actually getting the job done, and waiting long enough allows it to complete. (A Start-Process variant is sketched after this entry.)

    Read the article
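
    There is no hidden timeout in PowerShell itself, but piping an external tool to Out-Null is an indirect way of waiting for it. A more explicit variant is Start-Process with -Wait, which blocks until the process object actually exits; this sketch reuses the script's own LogWrite/IfErrorExit helpers and the same (assumed) $srcDB/$dstDB variables:

        # 4. Run JetComp and block until the process really exits
        LogWrite("Begin: Running JetComp")
        $p = Start-Process -FilePath ".\JETCOMP.EXE" `
                           -ArgumentList "-src: $srcDB", "-dest: $dstDB" `
                           -NoNewWindow -Wait -PassThru
        if ($p.ExitCode -ne 0) { IfErrorExit("Error Compacting Database") }
        LogWrite("End: Running JetComp")

    If the tool still finishes suspiciously early, it is also worth checking whether JETCOMP.EXE accepts "-src:" and the path as two separate arguments, because that is how PowerShell passes them when there is a space after the colon.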

  • launchd: execute a command when folder contents are modified or changed

    - by ThomasReggi
    For the past two days I've been trying to get a launchd plist to execute a script (a "program") when the contents of a folder on my Desktop are modified or changed. I've gone through tons of configuration settings and have tried Users/me/Desktop/folderinquestion both with and without a trailing slash. The script executes only when something is added or removed; it doesn't notice when files or subdirectories are updated. launchd is really my last hope for getting this to work; I've already exhausted Folder Actions, bash scripts, uninstallable Linux methods, etc. I have used Lingon to create my plists and have followed a YouTube tutorial. Any help would be greatly appreciated. This is what I have right now, and as I said it does not fire when the folder is modified or changed. IDEA: I'm thinking about creating two separate plists that reference each other: one plist can watch the folder for additions and subtractions, and when one occurs it can create another plist that watches every file in the folder; this could also work recursively to take subdirectories into account.

    Read the article

  • Getting XAMPP to work with multiple versions of PHP

    - by Pennf0lio
    How can I install XAMPP so that it works with different versions of PHP? I use XAMPP because some of my scripts are buggy when run under WAMP, and I use WAMP because it supports different versions of PHP. Now I would like to streamline down to just XAMPP so that my web development is easier to manage. Is it possible to configure XAMPP to work with more than one version of PHP, or is this something I have to look for in an alternative solution? Note: I'm running Windows 7.

    Read the article

  • Perl throwing 403 errors!

    - by Jamie
    When I first installed Perl in my WAMP setup, it worked fine. Then, after installing ASP.NET, it began throwing 403 errors. Here's my ASP.NET config:

        # Load asp.net module
        LoadModule aspdotnet_module "modules/mod_aspdotnet.so"
        # Set asp.net extensions
        AddHandler asp.net asp asax ascx ashx asmx aspx axd config cs csproj licx rem resources resx soap vb vbproj vsdisco webinfo
        # Mount application
        AspNetMount /asp "c:/users/jam/sites/asp"
        # ASP directory alias
        Alias /asp "c:/users/jam/sites/asp"
        # Directory setup
        <Directory "c:/users/jam/sites/asp">
            # Options
            Options Indexes FollowSymLinks Includes +ExecCGI
            # Permissions
            Order allow,deny
            Allow from all
            # Default pages
            DirectoryIndex index.aspx index.htm
        </Directory>
        # aspnet_client files
        AliasMatch /aspnet_client/system_web/(\d+)_(\d+)_(\d+)_(\d+)/(.*) "C:/Windows/Microsoft.NET/Framework/v$1.$2.$3/ASP.NETClientFiles/$4"
        # Allow ASP.net scripts to be executed in the temp folder
        <Directory "C:/Windows/Microsoft.NET/Framework/v*/ASP.NETClientFiles">
            Options FollowSymLinks
            Order allow,deny
            Allow from all
        </Directory>

    (A basic CGI block to compare against is sketched after this entry.) Also, what are the code tags for this site?

    Read the article
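
    The configuration in the question only covers the ASP.NET handler; a 403 for Perl usually means the CGI side has lost its ExecCGI/handler mapping or its directory permissions. One thing worth comparing against the pre-ASP.NET setup is a conventional Apache 2.2 CGI block; the paths here are assumptions:

        ScriptAlias /cgi-bin/ "c:/users/jam/sites/cgi-bin/"
        <Directory "c:/users/jam/sites/cgi-bin/">
            Options +ExecCGI
            AddHandler cgi-script .pl .cgi
            Order allow,deny
            Allow from all
        </Directory>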

  • Python server does not execute PHP script: permission denied

    - by krisvandenbergh
    I am trying to execute a PHP file through a Python server. However, I get the following error:

        File "/System/Library/Frameworks/Python.framework/Versions/2.6/lib/python2.6/CGIHTTPServer.py", line 255, in run_cgi
            os.execve(scriptfile, args, os.environ)
        OSError: [Errno 13] Permission denied

    The Python server itself is running, though. What have I done so far? I chmod'ed all directories recursively with chmod -R a+x (I know this is not secure, but it's just for testing purposes) for both the Python installation directories and my scripts, and I tried to find out whether the Python server is running as root with ps aux | grep py. I am out of ideas. What else could be going wrong? Thanks for the feedback. (Two things to check are sketched after this entry.)

    Read the article
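
    os.execve() failing with EACCES means the kernel refused to execute the script file itself, so the two usual suspects are the execute bit on the .php file (not just on the directories) and its first line, which must point at a real CGI-capable interpreter, because CGIHTTPServer simply execs the file. A sketch with assumed paths:

        chmod +x cgi-bin/test.php        # the script itself must be executable
        head -1 cgi-bin/test.php         # expect an interpreter line such as: #!/usr/bin/php-cgi
        which php-cgi                    # confirm that interpreter actually exists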

  • Workaround for starting zabbix-agent on Ubuntu: update-rc.d LSB mismatch

    - by bryan kennedy
    I am trying to run the zabbix-agent (1.8.1) on boot on Ubuntu (Lucid, 10.04). Zabbix is installed just fine, and it starts fine manually with /etc/init.d/zabbix-agent start. However, it doesn't start on boot, because when I run sudo update-rc.d zabbix-agent default I get:

        update-rc.d: warning: zabbix-agent start runlevel arguments (none) do not match LSB Default-Start values (2 3 4 5)
        update-rc.d: warning: zabbix-agent stop runlevel arguments (none) do not match LSB Default-Stop values (0 1 6)

    After googling around I found some cryptic information about this possibly being a bug in Zabbix's startup scripts, but I can't find a workaround. I'm trying to understand how the update-rc.d system works, but I'm not getting very far. How can I modify this setup so that zabbix-agent starts at boot? (One possible invocation is sketched after this entry.)

    Read the article
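
    Those lines are warnings that the runlevel arguments do not match the script's LSB header (and note that the keyword is normally spelled "defaults", with an s). One hedged workaround is to remove any existing links and re-register the script with explicit runlevels matching the header; the syntax below is the Lucid-era form and the sequence numbers are only examples:

        sudo update-rc.d -f zabbix-agent remove
        sudo update-rc.d zabbix-agent start 20 2 3 4 5 . stop 80 0 1 6 .
        ls /etc/rc2.d/ | grep zabbix    # an S20zabbix-agent link should now exist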

  • How can I suppress /etc/issue without losing error messages?

    - by Andy
    Is it possible to tell the ssh client not to print the contents of /etc/issue to stdout when connecting to a remote host, but still to print any other diagnostic (e.g. error) messages? Either using ssh -q or putting LogLevel quiet in ~/.ssh/config suppresses the /etc/issue printing, but also turns off error messages. I've tried touching ~/.hushlogin as well; that stops /etc/motd being printed, but doesn't affect /etc/issue. The most obvious solution is just to remove /etc/issue, but company policy dictates that the file be there with dire warnings about unauthorised access. This is non-negotiable. Unfortunately, I've got a bunch of scripts that run across quite a few hosts via ssh, and the log files are (a) very large and (b) full of legalese. Since quite a lot of stuff runs unattended, I don't want to lose any error messages that are printed. (A possible LogLevel setting is sketched after this entry.)

    Read the article
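
    Between the default level and quiet there is an intermediate client log level that is often enough here: LogLevel error keeps error messages but, on the OpenSSH versions this sketch assumes, also suppresses the pre-authentication banner built from /etc/issue. Worth testing against the hosts in question, either as ssh -o LogLevel=error host or per host in the client config:

        # ~/.ssh/config -- host names are placeholders; interactive sessions elsewhere keep their banner
        Host batchhost1 batchhost2
            LogLevel error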

  • Is using Capistrano for user maintenance tasks on university lab feasible?

    - by danielkza
    I've been looking around for tools to replace some legacy scripts for creating and maintaining accounts in a university computer-lab ecosystem consisting of things like: LDAP and Kerberos for authentication; user home storage and web pages; entries in an SQL database; printing quotas; mailing lists; etc. I'd also like to automate machine and VM membership for Kerberos and Puppet if possible. I've found Capistrano, and while the basic principle of running tasks on remote hosts through SSH seems to fit, and the DSL in Ruby looks quite nice, most of the documentation I've found is about application deployment, not generic tasks. I'm also not aware of any good way to parameterize tasks so that I can pass in the user information for account creation. Is there something about Capistrano I am missing, or is it not the correct tool for this job? Are there any more useful alternatives?

    Read the article

  • What is the difference between "su --command" and "su --session-command"?

    - by oliver
    Running # su - oliver --command bash gives a shell but also prints the warning "bash: no job control in this shell", and indeed Ctrl+Z and fg/bg don't work in that shell. Running # su - oliver --session-command bash gives a shell without printing the warning, and job control does work. The suggestion to use --session-command comes from "Starting a shell from scripts using su results in 'no job control in this shell'", which states that "[a security fix for su] changed the behavior of the -c option and disables job control inside the called shell". But I still don't quite understand this. When should one use --command, and when should one use --session-command? Is --command (aka -c) more secure? Or should one always use --session-command, with --command just left in for backwards compatibility? FWIW, I'm using RHEL 6.4.

    Read the article

  • WAMP and XAMPP PHP versioning

    - by Pennf0lio
    Hi, I need advice on installing XAMPP with different PHP versions. How can I install XAMPP so that it supports different versions of PHP? Before I formatted my computer I used both WAMP and XAMPP for specific purposes: I used XAMPP because some of my scripts are buggy when run under WAMP, and I used WAMP because it supports different versions of PHP. Now I want that support for multiple PHP versions added to XAMPP, so that I only have to install one package for my web development and it is easier for me to manage. If this is not possible, can you suggest an alternative solution? Thanks. Note: I'm running Windows 7.

    Read the article

  • Log of cron actions on OS X

    - by Doug Harris
    Does the cron that comes with OS X log its actions anywhere? I'm not looking for the output of any particular cron job, but rather a log of what cron itself is doing. On a couple of Linux machines I've checked, there's /var/log/cron, which has contents like:

        Apr 26 11:00:01 localhost crond[27755]: (root) CMD (/root/bin/mysql-backup)
        Apr 26 11:01:01 localhost crond[27892]: (root) CMD (run-parts /etc/cron.hourly)
        Apr 26 11:07:01 localhost crond[28138]: (root) CMD (/usr/local/bin/python /home/user1/scripts/pythonscript.py)
        Apr 26 11:18:18 localhost crontab[28921]: (user2) LIST (user2)
        Apr 26 11:18:22 localhost crontab[28929]: (user2) BEGIN EDIT (user2)
        Apr 26 11:18:59 localhost crontab[28929]: (user2) REPLACE (user2)

    This shows when jobs ran, when users viewed or edited crontabs, and so on. I haven't found anything like it on my Snow Leopard machine. (A couple of syslog queries are sketched after this entry.)

    Read the article
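
    There is no /var/log/cron on OS X by default; cron logs through syslog/ASL, so whether (and where) its messages are kept depends on the ASL/syslog configuration. Two things worth trying, with the caveat that the Sender name below is an assumption best confirmed with a test job:

        # search the plain-text system log
        grep -i cron /var/log/system.log

        # or query the ASL store directly
        syslog -k Sender cron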

  • Rebuilding /etc/rc?.d/ links

    - by timday
    A regular filesystem check on a Debian Lenny system triggered an fsck, and that nuked a handful of links in the /etc/rc?.d hierarchy (unfortunately I didn't keep a list). The system seems to boot and run normally, but I'm worried it's storing up trouble for the future. Is there an easy (fairly automatic) way of rebuilding this piece of the system? As I understand it, the links are generally manipulated by package postinst scripts using update-rc.d (and I haven't made any changes from the installed defaults). Without any better ideas, my plan is one of: (1) diff a listing against another, similar system to identify which packages need their links repairing, or (2) wait until the system is upgraded to Squeeze (hopefully not too long :^) and assume the mass package upgrade will restore all the missing links. (A per-service repair is sketched after this entry.)

    Read the article
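
    Once the affected services are identified (the diff-against-a-similar-machine idea is good for that), each one can be re-registered from its init script's LSB header. A sketch assuming a service named foo and Lenny-era update-rc.d behaviour, where existing links must be removed before new ones are created:

        # list the links on each machine and diff the two listings
        ls /etc/rc?.d/ > links-$(hostname).txt

        # then, for each service whose links are missing:
        update-rc.d -f foo remove
        update-rc.d foo defaults

    Note that "defaults" uses standard sequence numbers, which may differ from the ones the package originally chose; reinstalling the affected package (so its postinst re-runs update-rc.d) is the more faithful repair.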

  • test of ICMP block

    - by Marcos
    In my bash scripts I have been using something like:

        until fping -u google.com; do
            echo "$0[$$] Network/DNS down?? $(date)" 1>&2 && sleep $(($RANDOM%(1 + ++trynum * 1) +1)).222
        done

    to test for online connectivity. It halts in place, sleeping for growing random intervals, until it can ping google.com again. Problem: at some sites ICMP pings are blocked altogether while web pages are still reachable. What's a short way to test for this general case? Based on that test I would switch over to an HTTP-based test, such as the exit status of curl -s google.com >/dev/null, if that is a good one. (A curl-based variant is sketched after this entry.)

    Read the article
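
    A curl exit status works well for this, since it exercises DNS plus TCP/HTTP rather than ICMP. The sketch below mirrors the original loop; the flags are standard curl options, while the 10-second cap and the simplified sleep policy are arbitrary choices:

        until curl -fs --head --max-time 10 http://www.google.com/ >/dev/null 2>&1; do
            echo "$0[$$] Network/DNS down?? $(date)" 1>&2
            sleep $(( RANDOM % 30 + 5 ))
        done

    The -f flag makes HTTP-level failures (4xx/5xx) count as "down" as well, and --head keeps the transfer tiny.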

  • Easy Deployment Split Tunnel VPN Connection

    - by Joey Harris
    I was wondering if anybody could offer some insight into how I can mass-deploy VPN connection settings that support split tunneling. It has to work on both Mac and Windows systems, though if a script is used it can obviously be two separate scripts for the two platforms. I will be setting up a Windows server with a file server and an Exchange server, and clients will access the file server through the VPN because we will have sensitive data. I don't want the server's network to be bogged down with the clients' normal internet traffic, so I need some way to set up split tunneling on the clients without them having to type in a few commands every time to set up the static routes. I've looked at the Cisco VPN client, but I want to try to stick with Windows RRAS and avoid buying a Cisco VPN endpoint. I'm basically looking for a good VPN client that supports split tunneling and mass deployment.

    Read the article

  • Arch Linux terminal (via ssh) + Dropbox, but sync only a single selected folder

    - by Norfeldt
    Sorry for the weird title... I'm (still) quite new to Linux and am using ssh to reach an Arch Linux device that has no screen output options, so everything has to be done in the terminal (not my strong suit). I use the Linux device to play around with Python (which is quite fun). Now I would like to sync my script folder with Dropbox. Since I don't have enough space to sync all my Dropbox files to the device, I would like to know how I can set it up so that it syncs only the folder I choose. For the time being I have not installed Dropbox, because I'm afraid it will immediately begin to sync all my Dropbox folders onto the device. BONUS INFO: I have already created a folder in my Dropbox that contains some Python scripts I would like to have synced with the Linux device. (The selective-sync commands are sketched after this entry.)

    Read the article
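
    The headless Dropbox daemon can be driven entirely from the terminal with the dropbox.py helper script (packaged as dropbox-cli on some distributions), and its exclude subcommand gives selective sync: everything not excluded is synced, so the trick is to exclude the folders that are not wanted on the device. A sketch with placeholder folder names; the subcommands should be verified against dropbox.py help on the installed version:

        dropbox.py status                                         # confirm the daemon is running
        dropbox.py exclude list                                   # show what is currently excluded
        dropbox.py exclude add ~/Dropbox/Photos ~/Dropbox/Work    # stop syncing these (placeholders)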

  • Xen and HyperVM build question on OS templates

    - by Levi De Haan
    I recently built a server with HyperVM and Xen. I know Xen from the command line, but HyperVM ties into our WHMCS, so it's a requirement. My question is this: when I build a new OS template, my partition table is gone, and I know why, but I was wondering whether anyone has built anything in HyperVM for adding partition tables, so I don't have to reinvent the wheel :). I can do it from the command line in the created VM with fdisk, and I have tracked down the creation scripts for HyperVM, but I am unsure whether these insert directly into the machine, as it looks like a lot of what it does is externalized and is for Xen to assign things like the IP address, etc. As an aside: when I modify the .cnf file to change the boot device from cdrom to disk on Windows, booting through HyperVM overwrites my setting again, which is frustrating as heck. I've been trying to track down where in the code it does this; has anyone else had this problem, and if so, how did you fix it?

    Read the article
