Search Results

Search found 29619 results on 1185 pages for 'external script'.


  • Jumping Login Box after LightDM Multiple Monitor workaround

    - by Tom Gamon
    So I used this workaround to sort out my resolution at the login screen when using multiple monitors with LightDM:

        #!/bin/bash
        XCOM0=`xrandr -q | grep 'VGA1 connected'`
        XCOM1=`xrandr --output LVDS1 --primary --auto --output VGA1 --auto --right-of LVDS1`
        XCOM2=`xrandr --output LVDS1 --primary --auto`
        # if the external monitor is connected, then we tell XRANDR to set up an extended desktop
        if [ -n "$XCOM0" ] || [ ! "$XCOM0" = "" ]; then
            echo $XCOM1
        # if the external monitor is disconnected, then we tell XRANDR to output only to the laptop screen
        else
            echo $XCOM2
        fi
        exit 0;

    Found here: How to force Multiple Monitors correct resolutions for LightDM? It works great. However, now when I am on my login screen, the login box seems to jump between the two displays. Any advice as to how I could make it stay on one display? Thanks
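
    One approach worth trying (a sketch, not a confirmed fix): let the greeter drive only the laptop panel, so the login box has a single display to live on, and leave the extended desktop to the session. This reuses the LVDS1/VGA1 output names from the script above and would be wired in via display-setup-script= in /etc/lightdm/lightdm.conf:

        #!/bin/bash
        # greeter-only layout: keep the login screen on the laptop panel
        if xrandr -q | grep -q 'VGA1 connected'; then
            xrandr --output LVDS1 --primary --auto --output VGA1 --off
        else
            xrandr --output LVDS1 --primary --auto
        fi
        exit 0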

    Read the article

  • Do all domains on the same shared hosting server have the same IP or ID

    - by silow
    Here's what I've got: siteA.com and siteB.com are hosted on HostGator, under the same account on a shared hosting server (not VPS or dedicated). script.php is an external script that both of these sites access. I noticed that when siteA.com or siteB.com accesses the outside script.php, the script identifies them both as 1a.12.12ab.static.theplanet.com (apparently because HostGator uses theplanet.com servers). The fact that they're identified by the same value isn't surprising, because after all they're hosted under the same account, /home/user123/public_html. What I'm wondering about is other websites hosted on the same shared hosting server but under other accounts: websites under another developer's control that just happen to share the same hardware (hosting server). Do they also have the exact same identifier, 1a.12.12ab.static.theplanet.com, or does that change by account?
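
    The identifier in question is just reverse DNS on the server's outbound IP, so it can be checked directly. A quick sketch with dig (the IP below is a placeholder for whatever the A records actually return):

        # both domains on the same shared box typically resolve to one IP
        dig +short siteA.com
        dig +short siteB.com

        # the PTR record for that IP is what script.php sees after a reverse lookup
        dig +short -x 174.120.1.1

    Unless an account has its own dedicated IP, every account on the box makes outbound connections from the same shared IP, so other developers' sites would generally show the same identifier.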

    Read the article

  • Throttling bandwidth on a per group basis

    - by Robreylen
    I am wondering if it is possible to create a bandwidth shaping/throttling script that shapes traffic based on user group. That is, if user1 and user2 are in group1, they get 1 Mb/s download and 1 Mb/s upload, while user3 and user4 in group2 get 256 kb/s download and 256 kb/s upload. I've read a bit about this and found some iptables and tc implementations of a per-user solution, but I have not seen anything for a user group. Hopefully it can be implemented simply, in the form of custom iptables rules and a script running tc or the like. Here is a script I was looking into that does a system-wide throttle: http://atmail.com/kb/2009/throttling-bandwidth/ I assume per-group throttling is possible, since per-user throttling is. Thanks for any info you can provide for this question.
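
    Per-group shaping is a small variation on the per-user iptables owner match. A minimal sketch, assuming outbound shaping on eth0 and existing system groups group1/group2; note the owner match only sees locally generated packets, so this covers the upload side:

        # HTB tree: 1 Mbit/s for group1, 256 kbit/s for group2
        tc qdisc add dev eth0 root handle 1: htb
        tc class add dev eth0 parent 1: classid 1:10 htb rate 1mbit ceil 1mbit
        tc class add dev eth0 parent 1: classid 1:20 htb rate 256kbit ceil 256kbit

        # mark traffic by the owning group...
        iptables -t mangle -A OUTPUT -m owner --gid-owner group1 -j MARK --set-mark 10
        iptables -t mangle -A OUTPUT -m owner --gid-owner group2 -j MARK --set-mark 20

        # ...and steer each mark into its class
        tc filter add dev eth0 parent 1: protocol ip handle 10 fw flowid 1:10
        tc filter add dev eth0 parent 1: protocol ip handle 20 fw flowid 1:20

    Download shaping would additionally need ingress policing or an IFB device on top of this.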

    Read the article

  • How to get the Host value inside ~/.ssh/config

    - by iconoclast
    Within a ~/.ssh/config or ssh_config file, %h will give you the HostName value, but how do you get the Host ("alias") value? Why would I want to do that? Well, here's an example:

        Host some_host_alias
            HostName 1.2.3.4
            User my_user_name
            PasswordAuthentication no
            IdentityFile ~/.ssh/some_host_alias.rsa.id
            LocalCommand some_script.sh %h   # <---- this is the critical line

    If I pass %h to the script, then it uses 1.2.3.4, which fails to give the script all the options it needs to connect to that machine. I need to pass some_host_alias, but I can't find the % token for that. (And: yes! I'm aware of the risk of recursion. That's solved inside the script.) UPDATE: Kenster pointed out that I could just hard-code the Host value as an argument to the script. Of course this will work in the example I gave, but it won't work if I'm using pattern matching for the Host.
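
    For what it's worth, newer OpenSSH releases document a %n token ("the original remote hostname, as given on the command line") among the tokens expanded in LocalCommand; if your version supports it, the critical line becomes:

        LocalCommand some_script.sh %n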

    Read the article

  • Is it possible to keep only one Database for both web and desktop applications?

    - by B4NZ41
    I'm running into trouble with my business model; let me explain. For a year and a few months I have been developing software for the food industry, more exactly software for: delivery, takeaway, table reservation, POS, accounts payable and receivable, printing (receipts), kitchen order monitors, customer order control, and the fiscal area. I have separated the software into two main areas: a web area, and a desktop area (used by admins only) that is installed locally.

    1 - Web area (basically does the following):
        Shows a catalog with the products
        Customers make orders
        Customers pay for the orders
        etc., as mentioned above

    2 - Desktop area:
        Manage orders
        Manage customers
        Manage suppliers
        Manage accounts payable and receivable
        etc., as mentioned above

    The web area is hosted on an online web server (scripts and database are online). The desktop area is hosted locally on a Linux machine with a local database and local script files. My question is: is it possible to keep only one database for both applications? If YES, what is the best approach? My technical environment follows:

        Database: I currently have two databases working and I would love to keep only one.
        Operating system: Linux (kernel 2.6.x and above) or Windows (XP and above)
        Web area: Apache, PHP, Python, JavaScript, shell script and MySQL
        Desktop area: PHP-GTK2, Apache, PHP, MySQL and shell script
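
    If the desktop machines can reach the web host, the usual approach is to keep only the online MySQL database and point the desktop area at it over the network. A minimal sketch, assuming MySQL 5.x on the web host; shopdb, the account name and the IP are placeholders:

        # /etc/mysql/my.cnf on the web host, [mysqld] section:
        # listen on all interfaces instead of localhost only
        bind-address = 0.0.0.0

        # then create a restricted account for the desktop machine
        mysql -u root -p -e "GRANT SELECT, INSERT, UPDATE, DELETE ON shopdb.*
            TO 'desktop'@'203.0.113.5' IDENTIFIED BY 'secret';"

    The trade-off: the desktop area (and therefore the POS) stops working whenever the internet link is down, which is worth weighing before dropping the local database.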

    Read the article

  • How do I get debuild to put the binary in /usr/bin?

    - by SammySP
    I have recently been trying to package a small Python utility to put on my PPA, and I've almost got it to work, but I'm having problems making the package install the binary (a chmod +x Python script) under /usr/bin. Instead it installs under /. I have this directory structure - http://db.tt/0KhIYQL. My package Makefile is like so:

        TARGET=usr/bin/txtrevise

        make:
                chmod +x $(TARGET)

        install:
                cp -r $(TARGET) $(DESTDIR)

    I've used $(DESTDIR), as I understand it, to place the file under the debian subdir when debuild is run. I have the txtrevise script, my executable, under the usr/bin folder in the root of my package. I also have the Makefile and usr/bin/txtrevise in my tarball: txtrevise_1.1.original.tar.gz. However, when I build this and look inside the Debian package, txtrevise is always at the root of the package instead of under usr/bin, and will be installed to / instead of /usr/bin. How can I get debuild to put the script in the right place? Thanks. Any help would be greatly appreciated. I'm stumped.
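
    The install rule above copies the file to the top of $(DESTDIR) because $(DESTDIR) is only a path prefix; the usr/bin part has to be recreated underneath it. A sketch of a corrected rule (assuming the usual debhelper setup, where $(DESTDIR) points at debian/<package>; recipe lines must be indented with a tab):

        install:
                mkdir -p $(DESTDIR)/usr/bin
                install -m 0755 usr/bin/txtrevise $(DESTDIR)/usr/bin/txtrevise

    Using install -m 0755 also makes the chmod +x step in the make target unnecessary.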

    Read the article

  • How to share/access a partition from VMware XP on Ubuntu

    - by chr
    I am a beginner with Ubuntu. Here is my problem: I have Ubuntu installed on my external HDD, and I am running XP through VMware on Ubuntu because my internal disk is dead at the moment. The external HDD has an ext4 partition (37 GB) and 2 NTFS partitions (36 GB and 220 GB). My question is: how can I access the 220 GB (or 36 GB) NTFS partition from XP in VMware? I have already searched for similar posts, but had no luck solving my problem. Thank you in advance. Regards

    Read the article

  • Automate setup of constrained kerberos delegation in AD

    - by Grhm
    I have a web app that uses some backend servers (UNC, HTTP and SQL). To get this working I need to configure ServicePrincipalNames for the account running the IIS app pool, and then allow Kerberos delegation to the backend services. I know how to configure this through the "Delegation" tab of the AD Users and Computers tool. However, the application is going to be deployed to a number of Active Directory environments. Configuring delegation manually has proved to be error-prone, and debugging the issues misconfiguration causes is time-consuming. I'd like to create an installation script or program that can do this for me. Does anyone know how to script or programmatically set constrained delegation within AD? Failing that, how can I script reading the allowed services for a user, to validate that it has been set up correctly?

    Read the article

  • init.d service died

    - by jerluc
    Adapting some code from a Linux forum, I've added a service script to /etc/init.d on my Ubuntu Natty server to start/stop/restart node.js. It literally was working the first day I made it, but then today, after viewing my website this morning, the server threw a 404, and upon further inspection the node.js process was gone. So I went to start the service again, only this time node.js didn't start at all, and ever since I haven't been able to get my service script working. Below is the entire script:

        #!/bin/sh
        #
        # Node Server Startup
        #
        case "$1" in
        start)
            echo -n "Starting node: "
            daemon node /usr/local/www/server.js
            echo
            touch /var/lock/subsys/node
            ;;
        stop)
            echo -n "Shutting down node: "
            killall node
            echo
            rm -f /var/lock/subsys/node
            rm -f /var/run/node.pid
            ;;
        status)
            status node
            ;;
        restart)
            $0 stop
            $0 start
            ;;
        reload)
            echo -n "Reloading node: "
            killall node -HUP
            echo
            ;;
        *)
            echo "Usage: $0 {start|stop|restart|reload|status}"
            exit 1
        esac
        exit 0

    Thanks for any help!
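
    One thing worth checking (an observation, not a confirmed diagnosis): daemon and status are shell functions from Red Hat's /etc/init.d/functions and don't exist as commands on Ubuntu, so the start and status branches can fail even when the rest of the script behaves. The Debian/Ubuntu-native equivalent is start-stop-daemon; a sketch of the start branch, reusing the paths above:

        start-stop-daemon --start --background --make-pidfile \
            --pidfile /var/run/node.pid \
            --exec /usr/local/bin/node -- /usr/local/www/server.js

    (--exec needs the full path to the node binary; adjust it if "which node" says otherwise.)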

    Read the article

  • Installing 12.04 through Update Manager on an XP/Ubuntu dual-boot

    - by Madeline Mcormick
    I currently have a dual-boot system running XP Pro SP3 and Ubuntu 10.04 LTS. I decided to upgrade to 12.04 using the Update Manager over the network, NOT the ISO CD version. Now that I am in the middle of the 12.04 installation, I have this immense fear that this upgrade from Update Manager over the network may affect my Win XP OS and render it unbootable. I tried backing up files while it was upgrading, but Ubuntu does not recognize any external media, like my external HDD. What should I do?

    Read the article

  • how to check if something is in the Torque queue?

    - by kloop
    I want to re-run some jobs that terminated prematurely under Torque. These jobs are run through .job scripts (using qsub). However, I don't want to re-run a job which is already in the queue. Given a script filename, how can I know whether it is already in Torque's queue (using qstat?) or not? I prefer to do it programmatically, of course, so any one-liner that searches for a given script name would be great. I will note that I can grep submit_args in qstat -f, but I can't get it to display the whole script name when it is too long. This is crucial. EDIT: I managed to solve it using the following command:

        qstat -x | perl -pi -e 's/\<\//\n/g' | grep job$ | grep -v submit_args | perl -pi -e 's/Job_Id\>\<Job_Name\>//'

    This works because all my scripts end in the string "job".
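
    On the line-wrapping problem with qstat -f: Torque folds long values onto continuation lines that begin with a tab, so joining those lines first makes the whole submit_args value greppable. A sketch (myscript.job is a placeholder):

        qstat -f | sed ':a;N;$!ba;s/\n\t//g' \
            | grep -q 'submit_args = .*myscript\.job' && echo "already queued"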

    Read the article

  • who deleted my files?

    - by akalter
    I have some Linux servers. On two of them we have MySQL, with a daily backup on both machines, but the backup scripts are different. I looked at both scripts: in one of them I see a "delete older files" step, but on the other machine old backups are being deleted even though the script contains no such step. I am trying to discover what deletes my files. I want to use the same script on both machines, because the script with the deletion step also copies the files to another server, and I want that to happen on both servers. Any idea what is deleting my older backups? Thank you!
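
    If nothing in root's (or the mysql user's) crontab explains the deletions, an audit watch on the backup directory will name the responsible process the next time it happens. A sketch, assuming auditd is installed; the path is a placeholder:

        # log any write/delete activity under the backup directory
        auditctl -w /var/backups/mysql -p wa -k backup-delete

        # later: show which executable, user and PID triggered the rule
        ausearch -k backup-delete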

    Read the article

  • Windows, Apache and MSSQL Authentication

    - by user1114330
    I have a create-database script written in Perl. I remember it working just fine on another machine. A couple of years later, on a Vista machine, I am trying to use it again and it keeps failing. The main difference is that now I am using Apache instead of IIS. In the script, the IUSR account is granted permissions, as it needs to write to the database as part of another program. IIS has been uninstalled on this machine, but the IUSR account still exists, and NT AUTHORITY\IUSR is also shown in the logins drop-down in MSSQL (2012). The machine is running Vista Home Edition. However, when running the script I get errors saying that NT AUTHORITY\IUSR cannot be found. I also tried COMPUTERNAME\IUSR just for the heck of it, and of course it was not found. I also tried IUSR alone, and for some reason that user isn't being "found" either. Any ideas?

    Read the article

  • Run a service after networking is ready on Ubuntu?

    - by TK Kocheran
    I'm trying to start a service that depends on networking being started whenever the computer is rebooted. I have a few questions: Is this easily possible from an /etc/init.d script? I have tried creating a script there (conforming to the standards), but I'm really doubtful that it's even running on boot, let alone working; when I test it manually, it works. I've seen the new Upstart service, but as far as how that actually works, I'm completely in the dark. How can I make a script that runs on boot after networking has been started? If I could run it after connecting to a wireless network, even better :)
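
    Since this release of Ubuntu boots with Upstart, the dependency can be expressed directly in a job file. A minimal sketch (the job name and command are placeholders):

        # /etc/init/myservice.conf
        description "start my service once a network interface is up"

        # fires when any interface other than loopback comes up,
        # which also covers a wireless interface once it associates
        start on net-device-up IFACE!=lo
        stop on runlevel [016]

        exec /usr/local/bin/myservice

    Once the file is in place it can be driven with "start myservice" / "status myservice".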

    Read the article

  • File type actions for ruby scripts

    - by Kovags
    Hello, I just installed the Ruby interpreter and created the file test.rb. In Folder Options, I created the rb file type and an action called Run, and assigned the application:

        C:\Ruby192\bin\ruby.exe "%1"

    So it's possible to get into the Windows XP command line and run the script simply by doing this:

        C:\>test.rb

    But when I need to send parameters to the script, I can't simply do the following:

        C:\>test.rb parameter1 parameter2

    I have to do the following instead:

        C:\Ruby192\bin\ruby.exe c:\test.rb parameter1 parameter2

    I just noticed that I'm able to edit the action the following way to pass more parameters:

        C:\Ruby192\bin\ruby.exe "%1" "%2" "%3"

    That allows me to give 2 parameters to the script, but in some cases I need to pass a handful of parameters, and it doesn't seem right to append "%5" "%6" "%7" ad nauseam. What's the canonical way to do it?
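
    File-type actions also accept %*, which expands to all remaining arguments, so the action can be written once without enumerating placeholders. A sketch of the action string:

        C:\Ruby192\bin\ruby.exe "%1" %*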

    Read the article

  • Kill child process when the parent exits

    - by kolypto
    I'm preparing a script for Docker, which allows only one top-level process; that process should receive the signals so we can stop it. Therefore I have a script like this: one application writes to syslog (a bash loop in this sample), and the other one just prints it.

        #! /usr/bin/env bash
        set -eu
        tail -f /var/log/syslog &
        exec bash -c 'while true ; do logger aaaaaaaaaaaaaaaaaaa ; sleep 1 ; done'

    Almost solved: when the top-level bash process gets SIGTERM, it exits, but tail -f continues to run. How do I instruct tail -f to exit when the parent process exits? E.g. it should also get the signal. Note: I can't use bash traps, since the exec on the last line replaces the process completely.
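
    GNU tail can watch a PID and exit on its own when that process dies, which sidesteps signal delivery entirely. Because exec keeps the same PID, $$ still identifies the top-level process after the replacement. A sketch (assumes GNU coreutils tail):

        #! /usr/bin/env bash
        set -eu
        # tail exits as soon as the process with this PID is gone;
        # the exec below replaces the shell but keeps the same PID ($$)
        tail --pid=$$ -f /var/log/syslog &
        exec bash -c 'while true ; do logger aaaaaaaaaaaaaaaaaaa ; sleep 1 ; done'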

    Read the article

  • Ubuntu 11.10 USB 3.0 HDD

    - by Chazm
    I have a problem with my external HDD (WD My Book Essential 1TB) on a USB 3.0 port. I'm using a dual-boot setup with Windows 7 and Ubuntu 11.10, both 64-bit. While I'm running Windows and rebooting back into Windows, everything works well. When I switch to Ubuntu, everything works great as well. But after the first reboot out of Ubuntu, neither Windows nor Ubuntu mounts the external drive, and I have to power-cycle the device manually. I suspect the problem is with unmounting the device on shutdown in Ubuntu. The issue concerns only USB 3.0; when I plug the same device into a USB 2.0 port, the problem doesn't occur. Has anyone hit the same problem, or have a clue what might be going wrong?

    Read the article

  • Where to apply these akonadi settings for mysql-server?

    - by piedro
    I am using an external MySQL server to work with Akonadi (on KDE 4). I have solved all other problems with it, but I still have to apply some settings from Akonadi's mysql-global.conf file to the server. For example, this is suggested:

        # wait 365d before dropping the DB connection (default: 8h)
        wait_timeout=31536000

    So I tried to change this setting via the MySQL console, but it isn't reflected anywhere in the /etc/mysql/my.cnf file (and /etc/akonadi/mysql-global.conf seems to have no effect on the MySQL server either!). My question: where do I put these (or similar) settings to apply them, globally I guess, so that Akonadi won't drop the connection to an external server?
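
    A note on why the console change never appears in the file: SET GLOBAL only alters the running server and is lost on restart, while my.cnf is read once at startup. For a persistent setting, the value belongs in the server's own config on the external machine (a sketch):

        # /etc/mysql/my.cnf on the external MySQL server
        [mysqld]
        # keep idle connections for a year instead of the 8h default
        wait_timeout = 31536000

    A restart then applies it; a one-off "SET GLOBAL wait_timeout=31536000;" covers the running instance in the meantime.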

    Read the article

  • How can I pull data from PeopleSoft on demand?

    - by trpt4him
    I work in IT at a university, and I'm working with about 5 different departments to develop a new process for students to apply to a specific school within the university (not the university as a whole). We're using a web-based college application vendor and adding the applicant questions for the school itself to the main university application. Currently the main application feeds into PeopleSoft. The IT staff here is building a new table to hold just our school's applicant data. I want to be able to access that data from PeopleSoft for use in external applications, but our IT staff doesn't really seem to understand what I'm requesting; they simply tell me I can have access to the PS query tools. The problem is, I don't want to run just ad hoc queries: I want to be able to connect from outside PeopleSoft and show current data within the external app. I am unable to find documentation or get a clear answer to my question. Does PeopleSoft support access via a web services API or anything similar, and does that sound like the right direction for me to take?

    Read the article

  • mouse pad freezes

    - by abrewmeister
    I have an Acer Aspire running Netbook Remix 11.10. The mouse pad freezes up completely and I must use an external mouse; if I reboot the computer, the mouse pad will work again. This seems to happen in the middle of using either Firefox or, a few times, Thunderbird. Is anyone else having this problem? An external mouse works, but scrolling is choppy, or it scrolls part of the way down a page and then jumps to the top again. I don't recall having this problem right after upgrading to 11.10, so I think it is related to a later update. My Linux skills are limited.

    Read the article

  • Need the ability to set configuration options using a single method that works across multiple server configurations.

    - by JMC Creative
    I'm trying to set post_max_size and upload_max_filesize in a specific directory for a web application I'm building. I've tried the following in a .htaccess file in the script directory (upload.php is the script that needs the special configuration):

        <Files upload.php>
            php_value upload_max_filesize 9998M
            php_value post_max_size 9999M
        </Files>

    That doesn't work at all. I've tried it without the script-name specificity, where the only thing in the .htaccess file is:

        php_value upload_max_filesize 9998M
        php_value post_max_size 9999M

    This works on my PC-based XAMPP server, but throws a "500 Misconfiguration Error" on my production server. I've also tried creating a php.ini file in the directory with:

        post_max_size = 9999M
        upload_max_filesize = 9998M

    But this also doesn't always work. And lastly, using the following in the PHP script doesn't work either, supposedly because the settings have already been applied by the time the parser reaches the line (?):

        <?php
        ini_set('post_max_size','9999M');
        ini_set('upload_max_filesize','9998M');
        ?>
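
    One pattern that explains these symptoms (an assumption worth verifying, not a diagnosis): php_value lines are only understood when PHP runs as an Apache module; under CGI/FastCGI they cause exactly this kind of 500 error, and a per-directory php.ini (or .user.ini on PHP 5.3+) is the supported mechanism instead. A throwaway script shows which case each server is:

        <?php
        // sapi.php -- browse to this file on each server:
        //   "apache2handler"  => php_value in .htaccess works
        //   "cgi-fcgi"/"cgi"  => use a per-directory php.ini / .user.ini instead
        echo php_sapi_name();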

    Read the article

  • Dual monitor with different resolutions problem

    - by Mackemint
    I'm on an LG R405 with a PM965 chipset and an NVIDIA GeForce Go 8400M GS graphics card. I'm trying to use an external monitor (BenQ SenseEye 3 G2222HDL) through the VGA out port. All was fine when installing, except that the screen layouts were flip-flopped (the computer's default setup put the external monitor on the right instead of the left). When I moved the secondary monitor to its real position in the setup, I started getting strange problems: I lose large portions of both screens and have to reboot. If I plug the monitor in after rebooting, I get more problems than when booting up with the monitor already plugged in, but in both cases I can barely see what I'm doing. Please advise! Regards /M

    Read the article

  • Copy any file with a specific file extension in subfolders into a folder

    - by Onyxius
    I found a script on here that uses 7-Zip to extract all the archives in all the sub-folders of a specific folder, putting each one in its own folder (the script is below). What I need is to add to it, or maybe use another script, so I can specify where those files should go instead of putting them in their own folder within the source folder. I don't know how to do this and hope someone can help. Thanks for the help.

        @echo on
        FOR /D /r %%F in ("*") DO (
            pushd %CD%
            cd %%F
            FOR %%X in (*.rar *.zip *.tar) DO (
                "C:\Program Files\7-zip\7z.exe" x -o"%%~nX" "%%X"
            )
            popd
        )
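
    A small change to the same script points 7-Zip's -o switch at a fixed destination instead of a folder beside each archive (a sketch; D:\extracted is a placeholder):

        @echo on
        FOR /D /r %%F in ("*") DO (
            pushd %CD%
            cd %%F
            FOR %%X in (*.rar *.zip *.tar) DO (
                REM each archive still gets its own folder, but under the fixed destination
                "C:\Program Files\7-zip\7z.exe" x -o"D:\extracted\%%~nX" "%%X"
            )
            popd
        )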

    Read the article

  • Run a specific command from a directory

    - by Cameron Kilgore
    I have a bash script where I need to run an init utility within a directory that has a configuration file defined in it. I don't think it's possible to explicitly tell the utility to use the file as an argument, so what I need to do is go to the directory with the config file and then run the command. I have some logic in place, but it's not working; the utility never runs. Is there any way I can tell the script to go to this directory and then run the command?

        cd /var/www/testing-dev.example.co
        eval "standardprofile"
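
    Assuming standardprofile is the utility and it is on $PATH (the eval adds nothing here), a subshell keeps the directory change local to this one step (a sketch):

        #!/bin/bash
        # run the utility from the config file's directory without
        # disturbing the rest of the script's working directory
        (
            cd /var/www/testing-dev.example.co || exit 1
            standardprofile
        )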

    Read the article

  • Remote location's status "Disconnected", how to keep it connected (OK)? [net-use-command]

    - by AZ
    I have a script that keeps running and, under some scenarios, needs to contact a server (\\us-sign). From time to time, if this server goes uncontacted for a while, the next time my script needs it, it will ask for my credentials. What I found is that after this happens, net use shows the server as disconnected. If I type "net use \\us-sign", it asks for neither user nor password, which makes me believe my credentials for the server are still "valid"; nonetheless, its status remains "Disconnected". This script is supposed to help us automate some procedures, but the need to keep a watch on it, should it request credentials, kind of defeats the purpose. How can I keep its status "OK" no matter how long the server goes uncontacted?
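
    Two hedged possibilities, depending on what you control. SMB servers drop idle sessions after an autodisconnect timeout (15 minutes by default), which can be disabled on the server side; failing that, a scheduled task that touches the share keeps the session warm (the share name below is a placeholder):

        REM on the server, if you administer it: never drop idle sessions
        net config server /autodisconnect:-1

        REM on the client: touch the share every 10 minutes to keep the session alive
        schtasks /create /sc minute /mo 10 /tn KeepUsSign /tr "cmd /c dir \\us-sign\share >nul"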

    Read the article
