Search Results

Search found 5682 results on 228 pages for 'lord of scripts'.

Page 130/228

  • Remastering Knoppix: which version and what method should I use?

    - by Stan_
    I'd like to remaster Knoppix (mainly add and configure some software). I downloaded the newest version (KNOPPIX-ADRIANE_V6.2CD-2009-11-18-EN.iso), but later I read that it ships with some other window manager as default, not KDE... and I want KDE on my remaster. Is KDE included on that ISO but just not the default, or is it not included at all? If it's not there, which Knoppix version should I get for my remaster? My other question: I've seen some remastering scripts (with menus, etc.) on the Knoppix forums; do any of those work with the version I have, or with the version I should get if I need KDE?

  • Apache rewrite module, 404 not found

    - by Eneroth3
    I've been having some problems with rewriting directory-styled addresses into query strings for my PHP scripts. Here's the code:

        RewriteRule ^(\w+)/?(\w+)?/?(\w+)?/?$ /index.php?section=$1&category=$2&subcategory=$3 [QSA]

    This line works perfectly fine on both my local WAMP and LAMP servers, and on my friend's LAMP server. However, on the web host I've been using (freehostia) I only get a 404 error when trying to browse a "directory" that isn't really there (it's supposed to be generated by PHP). I've tried contacting their support, but they only say 3rd-party applications aren't their job. I know RewriteEngine is turned on because some basic redirect attempts have worked. Perhaps this line of code could be better written? It's quite important that extra queries are appended, and it would be nice (but not necessary) if the last slash could be left out. Any help is appreciated :)
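
    A commonly suggested variant is sketched below - a minimal sketch, assuming the .htaccess sits in the web root. It skips requests for files and directories that really exist (some hosts 404 before per-directory rules fire otherwise) and makes the second and third segments, and the trailing slash, genuinely optional:

        RewriteEngine On
        RewriteBase /
        # leave real files and directories alone
        RewriteCond %{REQUEST_FILENAME} !-f
        RewriteCond %{REQUEST_FILENAME} !-d
        RewriteRule ^(\w+)(?:/(\w+))?(?:/(\w+))?/?$ index.php?section=$1&category=$2&subcategory=$3 [QSA,L]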

  • Always-true RewriteCond

    - by Matt
    My university provides a public_html directory in each student's Linux home directory so that each student can have a webpage. I want to put all my PHP scripts into that directory and place the index in a sub-directory called webroot. I'm trying to work out a way to have an .htaccess file in public_html that redirects ALL requests in that folder. There's lots of advice on redirecting any file that doesn't exist, but I want to redirect regardless of whether a file exists. Can I use something like RewriteCond TRUE?
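
    An always-true RewriteCond isn't actually needed: a bare RewriteRule already matches every request unconditionally. A minimal sketch (assuming .htaccess lives in public_html and index.php sits in webroot/; the one condition only keeps already-rewritten requests from looping, since the rewritten path matches the pattern again):

        RewriteEngine On
        # don't rewrite what has already been sent into webroot/
        RewriteCond %{REQUEST_URI} !/webroot/
        RewriteRule ^(.*)$ webroot/$1 [L]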

  • Excel chart won't update, based on calculated cells

    - by samJL
    I have an Excel document (2007) with a chart (Clustered Column) that gets its Data Series from cells containing calculated values. The calculated values never change directly, but only as a result of other cells in the sheet changing. When I change other cells in the sheet, the Data Series cells are recalculated and show new values - but the Chart based on this Data Series refuses to update automatically. I can get the Chart to update by saving/closing, toggling one of the settings (such as reversing the x/y axis and then putting it back), or by re-selecting the Data Series. Every solution I have found online doesn't work: I have Calculation set to automatic; Ctrl+Alt+F9 updates everything fine, EXCEPT the chart; I have recreated the chart several times, and on different computers; and I have tried VBA calls like:

        Application.Calculate
        Application.CalculateFull
        Application.CalculateFullRebuild
        ActiveWorkbook.RefreshAll
        DoEvents

    None of these update or refresh the chart. I do notice that if I type actual numbers over my Data Series instead of calculations, it will update the chart - it's as if Excel doesn't want to recognize changes in the calculations. Has anyone experienced this before, or do you know what I might do to fix the problem? Thank you
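
    A workaround sometimes suggested for this symptom is to force each series to re-read its (recalculated) source cells by re-assigning its formula to itself - a sketch, not a guaranteed fix; sheet and chart references are generic:

        Sub ForceChartUpdate()
            Dim co As ChartObject, s As Series
            For Each co In ActiveSheet.ChartObjects
                For Each s In co.Chart.SeriesCollection
                    s.Formula = s.Formula   ' no-op assignment that makes Excel redraw the series
                Next s
            Next co
        End Sub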

  • Active Directory Profile Slows down machine

    - by boburob
    I have a strange issue with an Active Directory profile. When the user logs onto a machine with his profile, the whole machine becomes incredibly slow and unresponsive, with programs hanging and taking an age to load anything. If I log into the same machine with any other profile, nothing happens. I took a look at his original profile - startup programs, login scripts, etc. - and could not see what could be causing this. The machine is not running out of memory or CPU, nothing strange is appearing in the event log, and I can see nothing running under his profile which might cause this. So I created the user a new profile to test this, and exactly the same thing happens on the first login. The only thing which would have been carried across is the security groups the user is assigned to, yet I have other users with the same groups who do not experience these issues, so I am now at a complete loss on where to go next!

  • Configuring Linux Network

    - by Reiler
    Hi, I'm working on some software that runs on a CentOS 5.xx installation. Our customers aren't allowed to log in to Linux; everything is done from Windows applications developed by us. So we have built a frontend for the user to configure the network setup: static/DHCP, IP address, gateway, DNS, hostname. Right now I let the user enter the information in the Windows app and then write it on the Linux server like this:

        /etc/resolv.conf                          - nameserver
        /etc/sysconfig/network                    - gateway and hostname
        /etc/sysconfig/network-scripts/ifcfg-eth0 - IP address, netmask, BOOTPROTO (dhcp or static)

    I also (after some time) found out that I was unable to send mail unless I wrote "127.0.0.1 Hostname" into /etc/hosts. All this seems to work, but is there a better/easier way to do this? Also, I read the network configuration back in nearly the same way, but if I use DHCP I miss some information, for instance the IP address. I know that I can get some information from the command line (ifconfig), but I don't get, for instance, the hostname, gateway and DNS. Is there a command-line tool that will display this?
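
    For the read-back side, a sketch of commands that report the live values even under DHCP (all standard on CentOS 5; eth0 is assumed from the setup above):

        hostname                                      # current hostname
        ip route show | awk '/^default/ {print $3}'   # default gateway
        ip -4 addr show eth0                          # address and netmask actually in use
        grep '^nameserver' /etc/resolv.conf           # DNS servers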

  • "killed" message from cron.daily, but not when run from command line

    - by Dan Stahlke
    On Fedora 17, I put a file into /etc/cron.daily with the following contents:

        cd /
        su dstahlke /home/dstahlke/bin/anacron-daily.sh
        exit 0

    For some reason, I get a mail every day that just says /etc/cron.daily/dstahlke-daily: ...killed. I tried with and without the exit 0 line above (I noticed that some system scripts have that and others don't; I'm not sure of its purpose). Running /etc/cron.daily/dstahlke-daily from the command line as root produces no ...killed message. Other than the message, everything seems to work fine. Putting set -x in the above script, as well as in the /home/dstahlke/bin/anacron-daily.sh script, shows that the ...killed message happens just after the latter script terminates (or perhaps just after the su command finishes). What causes the ...killed message? Or is there a more acceptable way to have anacron run a user script daily? I figured that putting this in /etc/cron.daily would help the system coordinate all of the daily tasks, rather than potentially running my task concurrently with the system tasks.
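
    Whether it cures the ...killed depends on what is actually killing the session, but one change worth trying is handing the command to su with -c (the documented way to run a single command as another user) rather than passing the script as a shell argument - a sketch using the same paths as above; runuser(1) is an alternative built for use from scripts:

        #!/bin/sh
        cd /
        su - dstahlke -c /home/dstahlke/bin/anacron-daily.sh
        exit 0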

  • How do you persist installed software & configurations on an Amazon EC2 instance?

    - by Richard
    I've gotten a base Debian AMI up and running, and now I need to know the best way to maintain it. I've run the updates (aptitude update/upgrade) and installed/configured my software (Apache, Ruby, etc.), but if I reboot the instance or start a new one, I'll have to do all this work over again. How do you persist these types of things over a reboot? Do you build a new AMI every time you adjust some tiny piece of the system? Or is there some way to feed it a script on startup that configures it in "real-time"? I know I could go all the way with a Reductive Labs Puppet-style setup, but that's a bit too much for my needs right now (1-2 servers). Any best practices on this? Update: I found a bit of information on using User-Data to run scripts at instance boot time.
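
    Along the lines of that update: if the AMI executes user-data at boot (many public images do, via a boot hook), the script handed to the instance at launch can redo the configuration every time. A sketch with made-up package choices and a hypothetical bucket URL:

        #!/bin/sh
        # runs as root at boot when passed as EC2 user-data
        aptitude update && aptitude -y safe-upgrade
        aptitude -y install apache2 ruby
        # fetch site config from somewhere persistent, e.g. an S3 bucket
        # wget -O /etc/apache2/sites-available/mysite http://my-bucket.s3.amazonaws.com/mysite.conf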

  • VIM: How do you get the last ex command you used?

    - by TrevorBoydSmith
    I find that sometimes I write a really long ex-mode command that does lots of stuff. They are sort of "mini-scripts" that I write in the text editor; then I start ex mode, copy them into the ex line, and execute. But then I always end up editing in ex mode, and then I find it difficult to get the changes I made in ex mode back into my text editing session. Using the keyboard, how do you copy the last ex command you used and paste it into your text editor? (Note: this is sort of the opposite of the question "how do I copy/paste in vim ex mode?", where the user asks "how do you copy from the text editor and paste INTO ex mode?". My question is the opposite, because I wish to copy from ex mode and paste into my text editor.)
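
    Two built-in routes, sketched below: vim keeps the most recent ex command in the ':' register, and the command-line window exposes the whole ex history as an ordinary editable buffer:

        ":p     " normal mode: paste the last ex command into the current buffer
        q:      " open the command-line window; every historic ex command is a text line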

  • How can I upgrade Ubuntu from 9.10 to 10.04 on a netbook with a 4GB root partition?

    - by Blorgbeard
    I have an Asus EeePC 901 running Ubuntu 9.10, and I'd like to upgrade it to 10.04. I don't want to reinstall, since I have a bunch of scripts and programs all set up. However, when I attempt to upgrade using sudo apt-get dist-upgrade, I get an error asking me to free up another ~600MB on /. My / is mounted on sda0, which is a 4GB SSD. I do not have 600MB worth of deletable stuff on /. I've emptied my trash and run apt-get autoremove and apt-get clean. I do have plenty of space in /home, mounted on sda1 (a 16GB SSD). Is there some way I can tell apt-get to use a different download/temp directory?
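
    apt's download cache location is configurable, so one approach is to point it at the roomy /home before upgrading - a sketch, with an arbitrary directory name:

        sudo mkdir -p /home/apt-archives/partial
        echo 'Dir::Cache::archives "/home/apt-archives/";' | \
            sudo tee /etc/apt/apt.conf.d/99-cache-on-home
        sudo apt-get dist-upgrade

    Note this only relocates the downloaded .debs; the unpacked upgrade itself still has to fit on /.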

  • LUKS-Encrypted Root Partition in Ubuntu 9.04

    - by Martindale
    I have a LUKS-encrypted root partition that I have installed Ubuntu 9.04 to. I have of course placed /boot on a separate ext2 partition, and my boot loader loads and functions correctly. However, I can't seem to get my initrd to load the LUKS-encrypted root using the appropriate /dev/mapper/ address. What hooks and scripts do I need to add to get this to function correctly, and what is the correct way to regenerate my initrd? I can chroot into this install and everything works fine - I just can't seem to get it to actually boot. Help!
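
    On Ubuntu the cryptsetup package ships the initramfs hooks itself, so the usual sequence from inside the chroot looks like the sketch below (device and mapper names are hypothetical):

        apt-get install cryptsetup
        # /etc/crypttab - tells the initramfs which device to unlock at boot:
        #   cryptroot  /dev/sda2  none  luks
        # /etc/fstab - root comes from the mapped device:
        #   /dev/mapper/cryptroot  /  ext3  defaults  0  1
        update-initramfs -u -k all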

  • What's the best Flash Blocker for the Chrome Browser on a Mac?

    - by Bryan Schuetz
    Looking at the extensions gallery for Chromium, there seem to be a number of flash-blocking extensions available - a couple with very similar names, even. I've been using ClickToFlash in Safari and am used to it just flat-out working everywhere. Unfortunately, after using FlashBlock by Ruzanow for a bit, I've noticed it gets a bit "hinky" at times (blocking the flash by collapsing the div so you can't click to enable it, etc.). I have a feeling there may be some other extensions/scripts out there, not listed above, that are better. Ultimately I'd like to find a flash blocker that works as well as ClickToFlash does in Safari.

  • I'd like to archive files from Ubuntu to Windows between two computers on a shared home network

    - by Wabbitseason
    I have an old laptop running Ubuntu 9.10 which I use as a LAMP environment for web development, and a comfortable, powerful desktop computer with Windows 7 installed on it. These two are connected to a home router so both can access the internet. I have been able to set up Samba so I can mount my Apache home directory; it is accessible from Windows and mapped as a network drive. What I'd like to do is access some Windows folders from Linux, so I could automatically create backups (with cron scripts) of my work to physically different locations on the Windows box. Perhaps at a later time I'd set up a local Subversion repository, but I'd love to keep backups of that on the Windows drives too. Using Ubuntu's Places/Network menu I can see my desktop, but I'm unable to log in to it despite having created the correct username and password on Windows. All I can get is the following error message: "Unable to mount location. Failed to retrieve share list from server." What could be misconfigured?
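
    For a cron-driven backup it's often easier to bypass the GUI browser entirely and mount the Windows share directly - a sketch, with the machine and share names made up:

        sudo apt-get install smbfs          # provides mount.cifs on 9.10
        sudo mkdir -p /mnt/win-backups
        sudo mount -t cifs //DESKTOP/Backups /mnt/win-backups \
            -o username=winuser,sec=ntlmv2   # Windows 7 defaults to NTLMv2-only auth

    If the mount fails too, Windows 7's NTLMv2-only authentication default (LmCompatibilityLevel) is a common culprit with older Samba clients.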

  • What does "every two minutes" mean in cron?

    - by Ambrose
    I've got two scripts in cron, each set to run every two minutes (*/2) - the thing is, they're out of step: one runs at minutes 1,3,5,7,9, etc. and the other at 0,2,4,6,8. This is not a mission-critical problem, but it means I've got two status reports, one a bit stale compared to the other. What does cron do, exactly? Run the first one in crontab document order, waiting till it's finished to run the second one? Is there any way I can make them run at the same time, or as close as possible?
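
    Cron simply starts every job whose schedule matches the current minute; it neither serializes nor staggers them. A */2 field always means the even minutes (0,2,4,...), so a job landing on odd minutes suggests its entry is written differently, e.g. 1-59/2. A sketch with placeholder script paths:

        */2 * * * *    /path/to/first-report.sh     # minutes 0,2,4,...
        */2 * * * *    /path/to/second-report.sh    # same minutes, started together
        # or, to guarantee ordering as well as timing:
        */2 * * * *    /path/to/first-report.sh && /path/to/second-report.sh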

  • Where would an S3 upload speed cap originate?

    - by CoreyH
    I do a ton of uploading to S3, and I'm experiencing capped speeds that I can't quite figure out how to address. The setup: Windows Server 2008 R2 x64, an external HD, using a Java-based upload tool called Jsh3ll and custom VBS scripts to kick the jobs off. Running one process at a time, I am always limited to about 4mbps. I have FiOS at 35/35mbps, so it isn't an outright limit. AND I can run parallel instances all the way up to 35mbps, so I know the problem isn't gateway/NIC/machine/Amazon related. Running parallel instances works as a solution to a degree, but it greatly increases the complexity of my workflow; solving this would make my life dramatically easier. When I was first doing this, I was playing around with a bunch of Windows TCP parameters and was able to briefly get unconstrained bandwidth, but it wasn't repeatable. Thoughts?
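
    A cap that binds per connection but vanishes with parallelism is the classic signature of the TCP window limit: one connection can never move more than window-size divided by round-trip time. As a rough worked example (the RTT figure is an assumption):

        throughput <= window / RTT
        64 KB default window, ~130 ms to the S3 endpoint:
        (65536 bytes x 8) / 0.13 s  ~=  4.0 Mbit/s per connection

    which lines up with the observed 4mbps, so the send-buffer/window size the Java tool uses is worth checking.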

  • Block a machine from accessing the internet

    - by Simon Rigby
    I'd like some confirmation that my thinking is right in this scenario. We have a number of wired and wireless machines which presently have direct internet access, and I also have a Linux (Ubuntu) server which is used as a file server for the network. Essentially, I would like to be able to turn internet access on and off per machine. My plan is to block these machines by MAC address at the router, then set up a proxy server on the Linux box (i.e. Squid) so that the machines I wish to restrict can access the internet via the proxy. As I can adjust access via ACLs in Squid, I would be able to switch a machine's internet access on or off without having to further adjust the router's MAC rules. And of course I could go further and create a few scripts to assist with this admin task. Does this seem sound, and have I overlooked anything? Any help greatly appreciated. Simon.
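
    The Squid side of that plan is only a few lines of ACL; a minimal sketch, with all addresses hypothetical:

        # /etc/squid/squid.conf
        acl lan      src 192.168.1.0/24
        acl blocked  src 192.168.1.42 192.168.1.43   # machines currently switched off
        http_access deny  blocked
        http_access allow lan
        http_access deny  all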

  • Is there some way Linux editors can tell the programming language without the file extension?

    - by vfclists
    I am editing some scripts on Linux that don't have the language's file extension, and it seems that the editors - namely vi, nano and gedit - are not applying syntax highlighting because the filenames don't use the language extensions. Are there some parameters to be passed, or some setting, that can enable them to recognize the language? Update: after some googling I realize that vim has that ability, at least to do some parsing or check the shebang at the top to determine the language. By default Ubuntu does not install the complete vim package, so after installing it, the shell files are recognized. I don't know about nano or gedit, but vi and its graphical counterpart will do.
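
    When detection still fails (no shebang, odd filename), vim can be told outright - a short sketch, using sh as the example filetype:

        :set filetype=sh        " force highlighting for the current buffer
        " or embed a modeline in the script itself, which vim reads on open:
        # vim: set filetype=sh :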

  • My PowerShell script won't save a file when run from Task Scheduler - do I need to specify a specific argument?

    - by EGr
    I have a script that downloads a temporary Excel file, copies parts of it to a new file, and saves it to a specific location on the network. The problem I'm having is that the new file is never created/saved. If I run the script locally (through cmd.exe, PowerShell, or PowerShell ISE), it WILL save the file locally or to the network. If I try running the script via a schedule, or on demand via Task Scheduler, the temporary file is created, but the final document is never created or saved. Is there a specific argument I need to pass, or anything I could be doing wrong? This is the command I'm currently using:

        powershell.exe -file C:\path\to\my\powershell\script\thescript.ps1

    Since it calls environment variables and other variables relative to the script's position, I also set "Start in" to C:\path\to\my\powershell\script\
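
    Two things worth ruling out, sketched with the same placeholder path: the execution policy seen by the task's account, and whether that account (especially SYSTEM, or a "run whether user is logged on or not" account) has rights on the network share being written to:

        powershell.exe -NoProfile -ExecutionPolicy Bypass -File "C:\path\to\my\powershell\script\thescript.ps1"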

  • PHP not displaying any errors

    - by chutsu
    I've got Ubuntu 10.04 installed, along with lighttpd and MySQL/PHP. Now, my problem is that my PHP scripts can be executed and have access to the MySQL database, but I'm having problems getting PHP to display any errors when running; the result of a failed PHP run is a blank page. I tried setting display errors on in php.ini in /etc/php5/cgi/php.ini. I've also tried adding these two lines in the source to force errors on, to no avail:

        ini_set('display_errors', '1');
        error_reporting(E_ALL);

    What should I do? Thanks
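
    Two gotchas fit the blank-page symptom: with FastCGI, a php.ini change only takes effect once the PHP processes are restarted, and ini_set() can never surface a parse error in the file that contains it, because the script dies before the call runs. A sketch of the ini route (same file path as above):

        ; /etc/php5/cgi/php.ini
        display_errors = On
        display_startup_errors = On
        error_reporting = E_ALL

    followed by sudo /etc/init.d/lighttpd restart so the FastCGI workers pick it up.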

  • Magento Onepage Success Conversion Tracking Design Pattern

    - by user1734954
    My intent is to track conversions through multiple channels by inserting third-party JavaScript (for example Google Analytics, Optimizely, PriceGrabber, etc.) into the footer of onepage success. I've accomplished this by adding a block to the footer reference inside the checkout success node within local.xml, and everything works appropriately. My questions are more about efficiency and extensibility. It occurred to me that it would be better to combine all of the blocks into a single block reference, and then use various methods acting on a single call to the related models to provide the data needed for insertion into the JavaScript of each conversion-tracking script. Some examples of the common data that conversion tracking may rely on (pseudo): Order ID, Order Total, Order.LineItem.Name (foreach), and so on. Currently, for each of the scripts, I make a call to the appropriate model, passing the customer's last order ID as the load value, call a get() assigning the return value to a variable, and then iterate through the data to match the values with the expectations of the given third-party service. All of the data should be pulled once when checkout is complete; each third-party service may expect different data in different formats. Here is an example of one of the conversion-tracking template files, which loads in the footer of checkout success:

        $order = Mage::getModel('sales/order')->loadByIncrementId(
            Mage::getSingleton('checkout/session')->getLastRealOrderId());
        $amount = number_format($order->getGrandTotal(), 2);
        $customer = Mage::helper('customer')->getCustomer()->getData();
        ?>
        <script type="text/javascript">
        popup_email = '<?php echo($customer['email']); ?>';
        popup_order_number = '<?php echo $this->getOrderId() ?>';
        </script>
        <!-- PriceGrabber Merchant Evaluation Code -->
        <script type="text/javascript" charset="UTF-8" src="https://www.pricegrabber.com/rating_merchrevpopjs.php?retid=<something>"></script>
        <noscript><a href="http://www.pricegrabber.com/rating_merchrev.php?retid=<something>" target=_blank>
        <img src="https://images.pricegrabber.com/images/mr_noprize.jpg" border="0" width="272" height="238" alt="Merchant Evaluation"></a></noscript>
        <!-- End PriceGrabber Code -->

    Having just a single piece of code like this is not that big of a deal, but we are doing similar things with a number of different third-party services; PriceGrabber is one of the simpler examples. A more sophisticated tracking service expects a comma-separated list of all of the product names, IDs, prices, categories, order ID, etc. I would like to make it all more manageable, so my idea is to combine all of the template files into a single file, and develop a helper class or library to deliver the data to the conversion template. Goals include extensibility, minimal model calls, and minimal method calls. The questions:

        1. Is a Mage helper the best route to take?
        2. Is there any design pattern you may recommend for the "helper" class?
        3. Why would the design pattern you've chosen be best for this instance?
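
    To make the helper idea concrete, a sketch only - the module name (Mytracking) is hypothetical, and the Mage calls are the same ones the template above already uses. The point is that the order loads once, and each third-party service pulls its own formatting method:

        class Mytracking_Helper_Data extends Mage_Core_Helper_Abstract
        {
            protected $_order;

            // load the last order exactly once, whichever service asks first
            public function getOrder()
            {
                if ($this->_order === null) {
                    $id = Mage::getSingleton('checkout/session')->getLastRealOrderId();
                    $this->_order = Mage::getModel('sales/order')->loadByIncrementId($id);
                }
                return $this->_order;
            }

            // one formatting method per third-party expectation
            public function getItemsCsv()
            {
                $rows = array();
                foreach ($this->getOrder()->getAllVisibleItems() as $item) {
                    $rows[] = $item->getName() . ',' . $item->getSku() . ',' . $item->getPrice();
                }
                return implode(';', $rows);
            }
        }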

  • CentOS + cPanel results in PHP always running as fcgi, instead of cli as expected

    - by quamis
    I'm having problems with a server configured by someone else. It uses cPanel, and it has Apache+PHP. When running php -v as a user, I get "cli" as the handler:

        # php -v | head -n 1
        PHP 5.3.27 (cli) (built: Oct 15 2013 16:06:48)

    If I make a PHP script with echo shell_exec("php -v | tail -n 1"), I get:

        PHP 5.3.27 (cgi-fcgi) (built: Oct 15 2013 16:22:16)

    Why the cli/fcgi difference? I need to fix this so that scripts run by the apache user run as "cli", not "fcgi". As a side note, I'm not sure how PHP got installed, because in the package list I see another version installed, instead of the one reported by php -v:

        # rpm -qa | grep php
        [.... other packages...]
        cpanel-php53-5.3.17-5.cp1136.x86_64

    rebuild_phpconf --current:

        /usr/local/cpanel/bin/rebuild_phpconf --current
        Available handlers: suphp dso fcgi cgi none
        DEFAULT PHP: 5
        PHP4 SAPI: none
        PHP5 SAPI: dso
        SUEXEC: enabled
        RUID2: not installed

    redhat-release:

        # cat /etc/redhat-release
        CentOS release 6.4 (Final)

    Possibly related to http://superuser.com/questions/665809/centos-6-4-running-php-for-root-as-cgi-fcgi-not-cli
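
    One quick diagnostic: the interactive shell and the web process resolve "php" through different PATHs, so they can hit different binaries. A sketch (the absolute path is a guess - check which php for the real one):

        <?php
        // which binary does the web user's PATH find first?
        echo shell_exec('which php');
        // and what does the CLI build report when called by absolute path?
        echo shell_exec('/usr/local/bin/php -v | head -n 1');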

  • Delete temporary files from a batch script in XP

    - by Keith Bentrup
    I'm looking for a good batch script that would quickly find and clean all the known-safe temporary folders/files from Windows machines (as many variants as possible) - e.g. the Windows temp folder, all users' IE temp folders, etc. I'm fond of UI tools like CCleaner (over cleanmgr.exe), but when I'm trying to clean several computers quickly and/or with minimal involvement, it would be nice to have a script. Plus, with a script I could chain several scripts together - maybe one to then fire up various antivirus and/or malware detectors. Anyone have a good one, or can point to a good resource?
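
    A starting-point sketch covering the two locations that are safe on virtually every Windows box; per-user IE caches under Documents and Settings would need an extra loop, and those paths vary by Windows version:

        @echo off
        rem current user's temp
        del /q /s /f "%TEMP%\*" 2>nul
        for /d %%D in ("%TEMP%\*") do rd /s /q "%%D" 2>nul
        rem system-wide temp
        del /q /s /f "%WINDIR%\Temp\*" 2>nul
        for /d %%D in ("%WINDIR%\Temp\*") do rd /s /q "%%D" 2>nul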

  • Managing service passwords with Puppet

    - by Jeff Ferland
    I'm setting up my Bacula configuration in Puppet. One thing I want to do is ensure that each password field is different. My current thought is to hash the hostname with a secret value; that would give each file daemon a unique password, and the same password can be written to both the director configuration and the file server. I definitely don't want to use one universal password, as that would permit anybody who might compromise one machine to get access to any machine through Bacula. Is there another way to do this, other than using a hash function to generate the passwords? Clarification: this is NOT about user accounts for services. This is about the authentication tokens (to use another term) in the client/server files. Example snippet:

        Director {                      # define myself
          Name = <%= hostname %>-dir
          QueryFile = "/etc/bacula/scripts/query.sql"
          WorkingDirectory = "/var/lib/bacula"
          PidDirectory = "/var/run/bacula"
          Maximum Concurrent Jobs = 3
          Password = "<%= somePasswordFunction %>"   # Console password
          Messages = Daemon
        }
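
    The hash approach itself fits in a couple of lines of the ERB template - a sketch following the question's own <%= hostname %> style; the secret literal and variable name are placeholders, and the real secret should live only where the puppetmaster can read it:

        <% require 'digest/sha2'
           secret = 'replace-with-a-real-secret'
           bacula_password = Digest::SHA256.hexdigest(hostname + secret) %>
        Password = "<%= bacula_password %>"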

  • Server performance metrics report and practicality

    - by Anjesh
    I need to prepare a web server (Apache/PHP) performance report containing important metrics like CPU usage, disk I/O, and memory usage on a per-user basis. A couple of domains are hosted on the same server, and they run as separate users using FastCGI. The reason: sometimes some hosted applications take lots of CPU, making the server slow for the other applications (running as separate users). I am planning to develop scripts for this, as I can't seem to find any simple utilities for the purpose. The script would take snapshots of the per-user metrics at defined periods, say every 15 minutes, record them, and report any abnormalities via email. How practical is that? It would also be interesting to know what else needs to be recorded.
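
    For the snapshot part, a sketch of a one-liner that could run from a 15-minute cron job (note that ps reports lifetime-average %CPU per process, so a sampling tool would be more precise):

        ps -eo user,%cpu,%mem --sort=-%cpu | awk '
            NR > 1 { cpu[$1] += $2; mem[$1] += $3 }
            END    { for (u in cpu) printf "%s cpu=%.1f%% mem=%.1f%%\n", u, cpu[u], mem[u] }'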
