Search Results

Search found 22829 results on 914 pages for 'nautilus script'.

  • How can I change the color of the pane separator for my Ambiance theme modification?

    - by WarriorIng64
    I am currently messing around with Ambiance, trying to give Nautilus a dark sidebar (because I think it looks much better that way, especially with the current look having the dark-colored breadcrumbs clashing horribly with the light-colored sidebar). I have zero experience and knowledge of how to create GTK+ themes, and I couldn't find any documentation online, so I just made a copy of the folder for Ambiance under /usr/share/themes, renamed it "Ambiance Dark Sidebar" and just started messing with color values. As shown below, I found the value in nautilus.css needed to be tweaked to create the dark sidebar, but there is still one part that stubbornly stays light gray. This is the pane separator, and I want to change it so it matches the rest better (marked in red). Does anyone know what I need to do to change the color of this part so it matches the rest of the sidebar better? I already know from seeing themes like Adwaita Dark that this should be possible, but even after poking around in that I didn't find anything that seemed to help. Here are the contents of the files I modified in the theme folder Ambiance Dark Sidebar, stored alongside Ambiance in /usr/share/themes: index.theme gtk-3.0/apps/nautilus.css

    Read the article

  • Copying photos from camera stalls - how to track down issue?

    - by Hamish Downer
    When I copy files from my camera (connected via USB) to the SSD in my laptop, a few files get copied and then the copy stalls. I'm not sure why; any ideas, things to investigate, or related bug reports to look at would be appreciated. I have read this answer - the camera (a Canon 40D, in case that matters) mounts fine using gvfs. I can see the photos in Nautilus, or in the terminal (in /run/user/username/gvfs/... ), and I can copy a few photos, but not many. Using the terminal or Nautilus, the process hangs until the camera goes to sleep. Digikam fails to copy any at all, as does Rapid Photo Downloader. Shotwell did manage it in the end, but that is very much a workaround for me. I have disabled thumbnail generation by Nautilus. Load average stays at about 1 while this is happening, while CPU usage is half idle, half wait (and a little user/sys for other programs). None of the programs at the top of the CPU list in top are related to copying photos. There is not much in the logs - from /var/log/syslog:
      Dec 2 16:20:52 mishtop dbus[945]: [system] Activating service name='org.freedesktop.UDisks' (using servicehelper)
      Dec 2 16:20:52 mishtop dbus[945]: [system] Successfully activated service 'org.freedesktop.UDisks'
      Dec 2 16:21:24 mishtop kernel: [ 2297.180130] usb 2-2: new high-speed USB device number 4 using ehci_hcd
      Dec 2 16:21:24 mishtop kernel: [ 2297.314272] usb 2-2: New USB device found, idVendor=04a9, idProduct=3146
      Dec 2 16:21:24 mishtop kernel: [ 2297.314278] usb 2-2: New USB device strings: Mfr=1, Product=2, SerialNumber=0
      Dec 2 16:21:24 mishtop kernel: [ 2297.314283] usb 2-2: Product: Canon Digital Camera
      Dec 2 16:21:24 mishtop kernel: [ 2297.314287] usb 2-2: Manufacturer: Canon Inc.
      Dec 2 16:21:24 mishtop mtp-probe: checking bus 2, device 4: "/sys/devices/pci0000:00/0000:00:1d.7/usb2/2-2"
      Dec 2 16:21:24 mishtop mtp-probe: bus: 2, device: 4 was not an MTP device
    This problem has only started recently and I've had all the hardware for ages. I have also recently upgraded to 12.10, though I'm not sure if the problem started when I upgraded or after the upgrade. I also note this similar question, but it is currently unanswered and I'm providing more detail here.
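    A hedged workaround sketch, not from the question itself: copy over the camera's PTP link directly with gphoto2 instead of going through the gvfs mount (the package name and the target directory are assumptions):
      sudo apt-get install gphoto2          # assumed package name
      gvfs-mount -s gphoto2                 # release the gvfs mount so gphoto2 can claim the camera
      mkdir -p ~/Pictures/40d-import && cd ~/Pictures/40d-import
      gphoto2 --auto-detect                 # confirm the 40D is detected over PTP
      gphoto2 --get-all-files               # download every file into the current directory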

    Read the article

  • Lost all non-linux hard drives

    - by Rick
    I've somehow lost access to all my non-Linux drives and I'm not sure how. I have two other hard drives set up (a Windows boot drive and an external USB drive I was using under Windows XP). A few days ago I could access both under Linux with no problem. Now I see them listed under media, but when I click on them, nothing is there (the same thing happens when I go there in a terminal window). So the computer (under Linux) sees them, but I can't see anything in them (in Nautilus they show "(Empty)"). Any ideas what happened or how to get access to them again? My guesses as to what might have caused it:
    - I was trying to set up Adobe AIR (unsuccessfully) and had to make and remove a couple of symbolic links. I never got that to work, but maybe the links I made or removed did something?
    - I was also trying to set up Nautilus so that I could run it with sudo permissions. I didn't figure out how to do that successfully either, but maybe something I did while trying messed things up? (I think I installed and uninstalled the nautilus-actions configuration tool.)
    Note: I'm working on a relatively new install of 12.04.
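    For illustration, a rough diagnostic sketch from a terminal; the device name /dev/sda1 is a placeholder, yours will differ:
      sudo fdisk -l                              # do the partitions still show up at all?
      mount | grep media                         # what is actually mounted right now?
      dmesg | tail -n 20                         # any filesystem errors logged on access?
      sudo mkdir -p /mnt/win
      sudo mount -t ntfs-3g /dev/sda1 /mnt/win   # try mounting the Windows partition by hand
      ls /mnt/win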

    Read the article

  • Problem with running a script at startup as root?

    - by Usman Ajmal
    Hi. The main question: is there a way I can run one of my scripts 'completely' when Ubuntu's desktop appears, no matter whether root, an administrator, a desktop user or an unprivileged user is logged in? What does the script do? The script mounts a partition, looks for a file in that partition, and finally, on the basis of that file, a decision is made about copying one partition to another. That copying is done via:
      dd if=/dev/sda2 of=/dev/sda5
    When does the script run fine? The script runs smoothly when I run it from the terminal with sudo ./my_copying_script This command asks me for the password of the currently logged-in user. I enter the password and the script starts working. When does the script NOT run fine? I want to run the script at startup. I set it as a startup program using the Startup Applications utility of Ubuntu. The script ran at startup but exited at the dd command, returning the following error:
      dd: opening '/dev/sda2': Permission denied
    On edk's suggestion I set the owner of my_copying_script to root and set the SUID bit. Now the permissions of my_copying_script are (-rwsr-sr-x). edk's point was that once I set the SUID bit, the startup program would run with the permissions of its owner. I did that, but the same /dev/sda2 permission-denied error came up. I then prefixed the dd with sudo, as below:
      sudo dd if=/dev/sda2 of=/dev/sda5
    but this returned the following error:
      sudo: no tty present and no askpass program specified
    In other words the mounting failed. If I run the script using sudo ./myProgram I don't face this problem and the drive gets mounted successfully.
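    One hedged way around both errors is to let the whole script run as root without a password prompt, instead of prefixing dd with sudo; the path and user name below are placeholders, not taken from the question:
      # In /etc/sudoers (edit only with: sudo visudo), allow the login user to run
      # this one script as root with no password:
      #   someuser ALL=(root) NOPASSWD: /usr/local/bin/my_copying_script
      # Then make the Startup Applications entry call it through sudo:
      sudo /usr/local/bin/my_copying_script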

    Read the article

  • webbrowser disable script debugging in Visual Basic 6

    - by me4245
    Hi, I want to stop script errors from popping up in a VB6 application (I have VB6 installed on this machine). Currently, if I navigate to a particular page, it pops up saying "Internet Explorer Script Error: An error has occurred in the script on this page" ... "Do you want to continue running scripts on this page?" Setting webbrowser1.Silent to True does not work. Instead, rather than displaying an error message, it starts up the actual script debugger and then exits the program. On a machine without the (Visual Studio) debugger, it still pops up a message asking to use the debugger, i.e. on Vista (when Silent is set to True). Manually changing the 'disable script debugging (other)' setting (and the regular one) doesn't seem to work in MSIE either (also testing version 6.0 for XP users). How do I disable script errors? Thanks in advance!

    Read the article

  • php script dies when it calls a bash script, maybe a problem of server configuration

    - by user347501
    Hi! I have a problem with a PHP script that calls a Bash script. The PHP script receives an uploaded XML file, then calls a Bash script that cuts the file into portions (for example, if a 30,000-line XML file is uploaded, the Bash script cuts it into portions of 10,000 lines, so there will be 3 files of 10,000 lines each). The file is uploaded and the Bash script cuts the lines, but when the Bash script returns to the PHP script, the PHP script dies, and I don't know why. I tested the script on another server and it works fine. I don't believe it is a memory problem; maybe it is a processor problem, I don't know what to do. (I'm using the shell_exec function in PHP to call the Bash script.) The error only happens if the XML file has more than about 8,000 lines; if the file has fewer than 8,000, everything is OK (this is relative, as it depends on the amount of data, strings and letters that each line contains). What can you suggest? (Sorry for my bad English, I have to practice a lot.) I leave the code here. PHP script (at the end, after the closing ?>, there is HTML and JavaScript code, but it doesn't appear here; basically the HTML is only there to upload the file):
      " . date('c') . ": $str";
          $file = fopen("uploadxmltest.debug.txt","a");
          fwrite($file,date('c') . ": $str\n");
          fclose($file);
      }
      try{
          if(is_uploaded_file($_FILES['tfile']['tmp_name'])){
              debug("step 1: the file was uploaded");
              $norg=date('y-m-d')."_".md5(microtime());
              $nfle="testfiles/$norg.xml";
              $ndir="testfiles/$norg";
              $ndir2="testfiles/$norg";
              if(move_uploaded_file($_FILES['tfile']['tmp_name'],"$nfle")){
                  debug("step 2: the file was moved to the directory");
                  debug("memory_get_usage(): " . memory_get_usage());
                  debug("memory_get_usage(true): " . memory_get_usage(true));
                  debug("memory_get_peak_usage(): " . memory_get_peak_usage());
                  debug("memory_get_peak_usage(true): " . memory_get_peak_usage(true));
                  $shll=shell_exec("./crm_cutfile_v2.sh \"$nfle\" \"$ndir\" \"$norg\" ");
                  debug("result: $shll");
                  debug("memory_get_usage(): " . memory_get_usage());
                  debug("memory_get_usage(true): " . memory_get_usage(true));
                  debug("memory_get_peak_usage(): " . memory_get_peak_usage());
                  debug("memory_get_peak_usage(true): " . memory_get_peak_usage(true));
                  debug("step 3: the file was cutted. END");
              } else{
                  debug("ERROR: I didnt move the file");
                  exit();
              }
          } else{
              debug("ERROR: I didnt upload the file");
              //exit();
          }
      } catch(Exception $e){
          debug("Exception: " . $e->getMessage());
          exit();
      }
      ?>
    JavaScript from the upload form:
      function uploadFile(){
          alert("start");
          if(document.test.tfile.value==""){
              alert("First you have to upload a file");
          } else{
              document.test.submit();
          }
      }
    Bash script with AWK:
      #!/bin/bash
      #For single messages (one message per contact)
      function cutfile(){
          lines=$( cat "$1" | awk 'END {print NR}' )
          fline="$4";
          if [ -d "$2" ]; then
              exsts=1
          else
              mkdir "$2"
          fi
          cp "$1" "$2/datasource.xml"
          cd "$2"
          i=1
          contfile=1
          while [ $i -le $lines ]
          do
              currentline=$( cat "datasource.xml" | awk -v fl=$i 'NR==fl {print $0}' )
              #creates first file
              if [ $i -eq 1 ]; then
                  echo "$fline" > "$3_1.txt"
              else
                  #creates the rest of files when there are more than 10,000 contacts
                  rsd=$(( ( $i - 2 ) % 10000 ))
                  if [ $rsd -eq 0 ]; then
                      echo "" >> "$3_$contfile.txt"
                      contfile=$(( $contfile + 1 ))
                      echo "$fline" > "$3_$contfile.txt"
                  fi
              fi
              echo "$currentline" >> "$3_$contfile.txt"
              i=$(( $i + 1 ))
          done
          echo "" >> "$3_$contfile.txt"
          return 1
      }
      #For multiple messages (one message for all contacts)
      function cutfile_multi(){
          return 1
      }
      cutfile "$1" "$2" "$3" "$4"
      echo 1
    Thanks!
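    As a hedged aside (not part of the original post): the per-line awk pass over datasource.xml makes the cut quadratic in the file size, which may be what overloads this server. GNU split can do the same chunking in a single pass; the argument names below mirror the script's, everything else is a placeholder:
      #!/bin/bash
      # $1 = uploaded XML, $2 = output directory, $3 = output prefix
      mkdir -p "$2"
      split -l 10000 -d "$1" "$2/$3_"   # writes $3_00, $3_01, ... of 10,000 lines each
      # any per-chunk header/footer lines would still need to be appended afterwards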

    Read the article

  • Photoshop CS5 not recognising activeDocument

    - by Max Kielland
    I wrote quite a big script for Photoshop CS5.1 on my 64-bit Vista machine. Now when I run the very same script on my new 64-bit Windows 7 machine, the Adobe ExtendScript Toolkit complains about activeDocument (no such element) in this simple script:
      #target photoshop
      var pDoc = app.activeDocument;
      alert("Done!");
    I have tried both with and without #target, and choosing the target in the ExtendScript Toolkit. Is there something I have missed, or do I need to install something more? I only installed the 64-bit version of Photoshop. Is it that the 32-bit Photoshop has the script extensions? I don't see why I need to install both the 32-bit and 64-bit versions if I'm only going to use the 64-bit one. EDIT: I installed the 32-bit version as well. Tried the same script against 32 and 64 bit, still no difference. SOLVED: The mystery is solved. It is embarrassingly simple if you read the error message more carefully. Of course I can't get an activeDocument if there are no documents open in Photoshop, duh! I interpreted it as the statement activeDocument not being recognised, but of course if I have no document there is no such element (as a Photoshop document) to give me. I'm used to C++ and would expect the result to be a NULL value or similar if there is a problem getting the document... excuses, excuses ;) Well, if someone else runs into the same problem, here is the answer at my expense :D

    Read the article

  • How to run adb command through a script

    - by Outride
    As the title says, I'm trying to find out how to run adb commands through a pre-written script (the way .bat and .vbs files work) so I can make a semi-automatic program that pulls some files from my Android phone whenever I click on launch.bat, which runs a command to go through the phone and find certain files. I know the script line:
      adb pull /data/data/kik.android/databases/ %drive%\All\Database
    This copies all the files in this specific database folder into the flash drive's folder "Database" under the "All" folder. But how could I make it so that adb looks through the phone and finds other files, and how could I pull specific files rather than the entire folder?
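    A hedged illustration of pulling individual files instead of the whole folder; adb's own syntax is the same whether the surrounding script is a .bat or a shell script, and somefile.db is a placeholder name, not a file known to exist on the phone:
      adb shell ls /data/data/kik.android/databases/
      adb pull /data/data/kik.android/databases/somefile.db %drive%\All\Database
    adb pull takes one remote file (or folder) plus a local destination, so a script can simply repeat the pull line once per file it wants.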

    Read the article

  • rm failing inside cron script

    - by Nicholas
    I have a cron job calling a bash script which runs fine, except for one line inside it that is supposed to remove all files in a directory. The result of this line is always 'no such file or directory', even though I have verified (many times) that there are files in that directory. The line in question is simply:
      rm /dir1/dir2/dir3/*
    The script works fine when run manually in the terminal, so it must be something about how the cron job is run. I've tried giving 'dir3' and all the files inside it every permission possible, so it shouldn't be a permission problem (the directory and files are also owned by the user). I've tried specifying 'SHELL=/bin/bash' inside the crontab. There is no sticky bit set and there is no alias on the rm command. Interestingly, changing the 'rm' command to 'ls' gives the same negative result (unless you remove the trailing '*', and then it works). What am I missing here?
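    A hedged debugging sketch for the crontab itself (the paths are the ones from the question; the schedule and log file are placeholders): logging what the job actually sees usually reveals whether it is an environment, path or mount issue.
      SHELL=/bin/bash
      PATH=/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin
      # capture the job's view of the directory before trying to delete anything
      0 3 * * * ls -la /dir1/dir2/dir3/ >> /tmp/cronclean.log 2>&1 && rm -f /dir1/dir2/dir3/* >> /tmp/cronclean.log 2>&1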

    Read the article

  • wait for the second VB script until it has ended, from the primary VB script

    - by yael
    Hi, I have a VB script that runs a second VB script. The second VB script asks some questions via input boxes. My problem is that MyShell.Run does not wait until SecondVBscript.vbs has ended, and the other VB statements also run immediately. I need to wait for the MyShell.Run process to end and only then perform the other VB statements. How can I do that?
      Set MyShell = Wscript.CreateObject("WScript.Shell")
      MyShell.Run " C:\Program Files\SecondVBscript.vbs"
      Set MyShell = Nothing
      Other VB syntax

    Read the article

  • sudo displays typed password in bash script

    - by Andy
    Hullo, I have a bash script that uses sudo a few times. There are a couple of strange points about it though. It asks me for my password a few seconds after I've already entered it for a previous command, and the second time I enter my password, it's echoed to the display. Here are the relevant bits of the script:
      sudo service apache2 stop
      drush sql-dump --root="$SITE_DIR" --structure-tables-key=svn --ordered-dump | grep -iv 'dump completed on' | sudo tee "$DB_DIR/${SITE_NAME}.sql" > /dev/null
      sudo svn diff "$DB_DIR" | less
      sudo svn commit -m "$MESSAGE" "$DB_DIR"
      sudo service apache2 start
    The first password is to stop Apache, and it works as expected. As mentioned, the sudo tee doesn't 'remember' that I have elevated privileges, asks for the password again, and echoes it to the screen. Given that tee is all about echoing to the screen, I've played around a little with simple scripts which have | sudo tee, and they all work as expected. Ideas?! TIA Andy
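    A hedged sketch of one common fix: authenticate sudo once at the top of the script, so the sudo buried in the middle of the pipeline never has to prompt (which is where the prompt tends to get swallowed and terminal echo left in the wrong state):
      #!/bin/bash
      sudo -v    # prompt for the password once, up front, and cache the credentials
      sudo service apache2 stop
      drush sql-dump --root="$SITE_DIR" --structure-tables-key=svn --ordered-dump \
          | grep -iv 'dump completed on' \
          | sudo tee "$DB_DIR/${SITE_NAME}.sql" > /dev/null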

    Read the article

  • How to remove blank lines in .txt file

    - by Brant
    I want to change the text file format as shown below, but don't know how to do it: (2) 5. The function of the condenser is to: a) vapourise the liquid refrigerant b) change high pressure refrigerant vapour to liquid c) pressurise low pressure refrigerant vapour d) vent off vapourised refrigerant e) lower the liquid refrigerant pressure (2) 6. One tonne of refrigeration is: a) 13958 kJ per day b) 100 kJ per minute c) 233 kJ per minute d) 13958 J per hour e) 335 J per second (2) 5. The function of the condenser is to: a) vapourise the liquid refrigerant b) change high pressure refrigerant vapour to liquid c) pressurise low pressure refrigerant vapour d) vent off vapourised refrigerant e) lower the liquid refrigerant pressure (2) 6. One tonne of refrigeration is: a) 13958 kJ per day b) 100 kJ per minute c) 233 kJ per minute d) 13958 J per hour e) 335 J per second
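    If the goal is literally to strip the blank lines, a hedged one-liner sketch (the file names are placeholders):
      sed '/^[[:space:]]*$/d' questions.txt > questions-noblank.txt
      # equivalently:
      grep -v '^[[:space:]]*$' questions.txt > questions-noblank.txt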

    Read the article

  • Shell script with ImageMagick: hangs forever?

    - by AP257
    I've generated a shell script that uses ImageMagick to convert and crop around 18000 images. Here's a sample entry (so there are 18000 of these):
      if [ ! -f ./cropped/16333-1.png ]
      then
          convert -crop 724x118+876+1989 ./lin/34.png ./cropped/16333-1.png
          echo cropping 16333-1
      fi
      if [ ! -f ./cropped/16333-1_thumb.png ]
      then
          convert -define jpeg:size=400x100 ./cropped/16333-1.png -thumbnail '400x100>' -background transparent -gravity center -extent 400x100 ./cropped/16333-1_thumb.png
          echo thumbing 16333-1
      fi
    The script only runs for about 2000 images before hanging forever. Am I missing something, or leaking memory somewhere? Thanks for your help!
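    A hedged thing to try, not a confirmed fix: cap ImageMagick's resources per convert call so a single awkward image cannot stall the whole run (the limit values are arbitrary examples):
      convert -limit memory 256MiB -limit map 512MiB -limit thread 2 \
          -crop 724x118+876+1989 ./lin/34.png ./cropped/16333-1.png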

    Read the article

  • cannot run CMD script from Vista windows explorer

    - by jamesvista
    I am running Vista Home Premium. I tried to write a script to do some simple automation... it does not work! Even the simplest script, like:
      @echo ON
      dir .
    does not get executed; only an empty CMD shell pops open when it is started from Explorer. From a cmd window there is no problem. This is really weird and I have never seen it before (and I have written many CMD scripts before). The ftype entries for cmdfile and batfile are unchanged from "%1" %*. Virus scan done - no problems. Is there a policy setting that might have changed? Any ideas?

    Read the article

  • Shell Script, iterating over a folder

    - by Martin
    I have very basic shell scripting knowledge. I have photos in an "original" folder under many different folders, like this:
      folder
        + folder1
          + original
        + folder2
          + original
        + folder3
          + original
        + folder4
          + original
    Using mogrify I'm trying to create thumbs under a "thumb" folder, following a structure like this:
      folder
        + folder1
          + original
          + thumb
        + folder2
          + original
          + thumb
        + folder3
          + original
          + thumb
        + folder4
          + original
          + thumb
    I'm a little lost on how to write the shell script to iterate through it. I'm OK with giving mogrify its settings, but I don't completely understand how to tell the script to iterate over each folder and run the mogrify command.
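    A minimal sketch of the iteration, assuming JPEG sources and a 200x200 thumbnail size (both assumptions; substitute whatever mogrify settings you already use):
      #!/bin/bash
      for orig in folder/*/original; do
          thumb="$(dirname "$orig")/thumb"
          mkdir -p "$thumb"
          # -path writes the converted copies into thumb/ instead of overwriting the originals
          mogrify -path "$thumb" -thumbnail 200x200 "$orig"/*.jpg
      done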

    Read the article

  • Server speed: sharing one script.php or using many copies of the same script.php

    - by Marco Demaio
    Let's assume I have thousands of domains on the same Apache server. Each domain is in a folder under the server's public_html document folder, so it can be accessed by calling "www.somedomain.com" or by calling "www.serverdomain.com/somedomain_folder". In each domain there is a website that needs a certain script.php (identical for each domain). From a coding point of view, it's obvious that it's better to use a single script.php: when I update it with new features/bug fixes etc., I only need to update one file on the server and it will work for all domains. But from a server point of view? If I use a single script, all domains will access it at the same time; will the server run slower compared to the situation where each domain calls its own copy of the script?

    Read the article

  • How to open a file or folder from Terminal

    - by Victor Hugo Souza
    I'd like to know if there is a way to open a file or a folder from the terminal. When I write a URL in the terminal, it lets me open that link in my default browser. I'd like to do the same with my files and folders. I know there is a way via the CLI using gnome-open or xdg-open, but I'd like a solution that uses the mouse, by clicking on the path or the URL. E.g. when I run "pwd", the printed path would let me click on it and open it in Nautilus. It's the inverse of what "nautilus-open-terminal" does.
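    A hedged sketch: depending on the terminal emulator's link detection, printing the location as a file:// URI may give exactly that clickable behaviour (gnome-terminal linkifies URIs it prints, though whether file:// is recognised varies by version); the non-clickable equivalents are shown for completeness:
      echo "file://$PWD"       # Ctrl+click / right-click "Open Link" where supported
      xdg-open .               # open the current folder in the default file manager
      xdg-open ./report.pdf    # open a file with its default application (placeholder name)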

    Read the article

  • Raleigh theme on LightDM, Ubuntu desktop, Nautilus window, menus & menu-bars

    - by Tassos Seligkas
    After performing an upgrade from Natty to Oneiric, I have a problem similar to the one reported here at every system boot: Desktop forgets theme? Everything, from the LightDM greeter to the Ubuntu desktop, uses the ugly Raleigh theme, apart from Firefox, Thunderbird and a few other applications after logging in. Unfortunately none of the solutions suggested in that topic worked for me. The only way I can get an acceptable appearance is to switch to GNOME at login and use the Adwaita theme; the LightDM greeter still uses the Raleigh theme, though. I tried some "brute-force" methods by reinstalling (sudo apt-get install --reinstall) ubuntu-desktop, unity, unity-common, unity-greeter, gnome-session and gtk2-engines. I also tried moving .config, .gconf, .gconfd, .gnome and .gnome2 to a backup dir to reset the account's desktop preferences. None of the above solved the issue. On the contrary, logging in to the Ubuntu session no longer shows Unity or window decorations at all. My fallback remains the GNOME login and the Adwaita theme. This is the workstation I host Ubuntu on, so, though possible, it is time-consuming to perform a backup and format-reinstall Ubuntu 11.10. Could you please let me know if there is an alternative way of repairing my Ubuntu desktop? (I believe it all started when, during the 11.04-to-11.10 upgrade, the installation of downloaded packages for Oneiric broke because nautilus-dropbox failed to access the Dropbox servers - I am behind a proxy, but with proper proxy settings I had no problems using apt-get & Synaptic. I removed Dropbox and resumed the partial installation on the second boot.)
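    A hedged recovery sketch, on the assumption that the interrupted upgrade left package configuration half-finished and the session's theme keys unset; the gsettings keys are the standard GNOME ones, not taken from the question:
      sudo dpkg --configure -a       # finish any interrupted package configuration
      sudo apt-get -f install        # pull in anything still missing
      gsettings set org.gnome.desktop.interface gtk-theme "Ambiance"
      gsettings set org.gnome.desktop.interface icon-theme "ubuntu-mono-dark"
      gsettings set org.gnome.desktop.wm.preferences theme "Ambiance"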

    Read the article

  • How can I make SWF files be opened with the standalone player?

    - by shanethehat
    I have installed the standalone Flash debug player to /usr/lib/flashplayerdebugger and I can now use it to test within Flash Builder (Eclipse), but I can't make an SWF open with it from Nautilus. If I right click and select Open With Other Application it is not in the list of programs, and I can't see how to add it. How can I make it the default application for SWF files opened in Nautilus? Update - my *.desktop file:
      [Desktop Entry]
      Name=Flash Player Debuger
      Type=Application
      Exec=/usr/lib/flashplayerdebugger
      Categories=GNOME;Player;AudioVideo;
      MimeType=application/x-shockwave-flash;
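    A hedged sketch of registering that entry as the default handler for SWF; the file name is a placeholder, and the Exec line probably also needs a %f field code so Nautilus can pass the file path to the player:
      # save the entry (with Exec=/usr/lib/flashplayerdebugger %f) as:
      #   ~/.local/share/applications/flashplayerdebugger.desktop
      xdg-mime default flashplayerdebugger.desktop application/x-shockwave-flash
      xdg-mime query default application/x-shockwave-flash   # verify the association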

    Read the article

  • Display folder sizes in file manager

    - by wim
    In the Nautilus (or Nemo) file manager, the "Size" column shows the file size for files and the number of items contained in a folder for subdirectories. The number of items is not that important to me; it would be more useful if I could make this column show the total size contained under the directory. I had an extension on Windows called FolderSize which shows what I mean; I think it involved a service which ran in the background monitoring filesystem modifications to make sure the column was kept up to date. I am interested to know if there is any similar extension for Nautilus; I would also be open to switching to another file manager to get this functionality. I am aware of the Disk Usage Analyser in Ubuntu, but what I'm looking for is a solution with file manager integration.
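    Not the file-manager integration asked for, but as a stopgap the same totals are easy to get from a terminal (a hedged sketch):
      du -sh -- */ | sort -h    # per-subdirectory totals of the current folder, smallest to largest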

    Read the article

  • How to mount nexus 7 file system to view files from command line

    - by knotech
    Just got a Nexus 7, switched MTP on and plugged it in. It pops right up in nautilus, is titled Nexus 7, and lets me browse all of the files. However, on the command line, it's simply listed as /media/usbdrive and when I try to list the files, nothing comes up. Do I have to manually mount it in order to navigate the files from the command line? And why is it that nautilus recognizes it, and has access immediately, but I can't get at it from the command line?
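    Two hedged ways to reach it from a shell: the gvfs MTP mount normally lives under the per-user runtime directory, or the device can be FUSE-mounted explicitly (the jmtpfs package name and mount point are assumptions):
      ls "$XDG_RUNTIME_DIR/gvfs"     # the gvfs mount point, typically something like mtp:host=...
      # or mount it yourself:
      sudo apt-get install jmtpfs
      mkdir -p ~/nexus7 && jmtpfs ~/nexus7
      ls ~/nexus7
      fusermount -u ~/nexus7         # unmount when done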

    Read the article

  • Executable execution path. Does it depend on the place the executable is called from?

    - by Valkea
    As I'm still a new Linux user, I keep discovering behaviours that I can't tell are "normal" or not. I searched the Internet, but as I can't really find an answer I guess it's time to ask here. A few weeks ago I installed a small game called "Machinarium" and I played it... but a few days later, when I wanted to continue my game, I was unable to make the game start correctly, and as I didn't have the time to investigate I gave up. But yesterday, as I was working on a program of mine, I hit exactly the same behaviour. So I searched a bit and discovered that when using Nautilus with the "List view", I was able to run the program (i.e. the program finds its sound, image and other resources) when I was literally "inside" the executable's folder, but unable to when I was in a parent folder and had expanded it down to the executable's folder to run it. To illustrate the behaviour, there were two screenshots here: it doesn't work if the executable is double-clicked from the first, and it does work if the executable is double-clicked from the second. This is indeed the same "place", but the Nautilus view is slightly different as the current folder is not the same, and it seems to make a difference to the program. Furthermore, when I create a menu item via System Settings/Main Menu pointing to the executable, it behaves just as if the executable can't find the resources (that's why I was unable to play Machinarium the second time, as I created a menu short-cut after my first game). So I made my program generate a text file at its root when running, and I launched it from different "parent" folders to see where the text file was generated. Each time, the file was generated in the top folder of the current Nautilus view. I expected to see it appear in the same folder as the executable (well, not once I had guessed what was happening, but before that I would have expected it to appear in the exe folder). Can anyone explain why it works like this (I guess it's normal)? And how am I supposed to solve this when creating programs (should I detect the executable path in my C++ code, or should I organize the resource files another way than on Windows)?
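    For the "how should I solve this in my own programs" part, one hedged pattern is a tiny launcher that switches to the executable's own directory before starting it, so relative resource paths resolve no matter where it is launched from; the binary name is a placeholder, and a .desktop entry can achieve the same with a Path= key:
      #!/bin/bash
      # run-game.sh - point the menu entry or double-click target at this wrapper
      cd "$(dirname "$(readlink -f "$0")")" || exit 1
      exec ./machinarium "$@"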

    Read the article
