Search Results

Search found 71513 results on 2861 pages for 'file extension'.

Page 260/2861 | < Previous Page | 256 257 258 259 260 261 262 263 264 265 266 267  | Next Page >

  • I lost the menu inside of Explorer (file browser) under Windows XP

    - by jfmessier
    The question says it all. Inside the file explorer under Windows XP, I lost all the menu items from the top, as well as the toolbars. Right-clicking only gives me the file/directory context menu, nothing to restore the toolbar or the menu. Help. I don't use Windows much (I much prefer Linux), so I don't know all the latest tricks and hacks. Thanks :-)

    Read the article

  • Open dialog box won't show files in libraries, but explorer will

    - by Alex
    I have the weirdest problem when trying to open or save files. When I try to get to "My Documents" through the "Libraries" side link, it won't show any of my files. It will show them if I go in from the C:\ drive into the user files, though. I thought it was because I didn't have the right location defined for the "Libraries" shortcut, but when I use Explorer to open my "Libraries" it shows all the files. Any ideas?

    Read the article

  • I'd like to archive files from Ubuntu to Windows between two computers on a shared home network

    - by Wabbitseason
    I have an old laptop running Ubuntu 9.10 which I use as a LAMP environment for web development, and I have a comfortable, powerful desktop computer with Windows 7 installed on it. These two are connected to a home router so both can access the internet. I have been able to set up Samba so I can mount my Apache home directory so it is accessible from Windows and is mapped as a network drive. What I'd like to do is access some Windows folders from Linux so I could automatically create backups (with cron scripts) of my work to physically different locations on the Windows box. Perhaps at a later time I'd set up a local Subversion repository, but I'd love to keep backups of that on the Windows drives too. Using Ubuntu's Places/Network menu I can see my desktop, but I'm unable to log in to it despite having created the correct username and password on Windows. All I can get is the following error message: "Unable to mount location. Failed to retrieve share list from server." What could be misconfigured?
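
    For reference, a minimal sketch of mounting a Windows 7 share from Ubuntu 9.10 with mount.cifs so a cron job can copy into it; the share name, address, and credentials below are placeholders, not taken from the question:

        # smbfs provides mount.cifs on Ubuntu 9.10 (later releases ship it as cifs-utils)
        sudo apt-get install smbfs
        sudo mkdir -p /mnt/winbackup
        sudo mount -t cifs //192.168.1.10/Backups /mnt/winbackup -o username=wabbit,password=secret,iocharset=utf8

        # A nightly cron entry could then push the Apache home directory across, e.g.:
        # 0 2 * * *  rsync -a --delete /home/wabbit/public_html/ /mnt/winbackup/public_html/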

    Read the article

  • Windows XP cannot access admin share

    - by barlop
    I have three systems, A, B, and Compx, all on XP, but computers A and B have an issue with Compx. Compx has network shares I can access: I can do \\compx and see some of them. But I cannot access the admin share c$. \\compx\c$ gives a login prompt, and I can't get any user/pass to work. I looked at permissions but don't see an issue. Nevertheless, I will describe what I see in the permissions. In the security tab of C: I have Administrators, creator owner, everyone, bob, system, users (six entries). "creator owner" has nothing ticked, and I can't seem to change that; if I tick them so they all get ticked and click apply, it spends about 2.5 minutes completing its operation and then they all untick again. That isn't the root of the problem, though, since I get the same on the share I can access. In Advanced, I see those same six entries, all "full control", all applied to "this folder, subfolders and files", except creator owner, which is "subfolders and files only". The properties for the share I can see look the same, except that in Security > Advanced, double-clicking any of them shows the boxes all ticked but greyed out. That's not the problem either, since I can access that share. So I don't know what the problem is.
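
    One thing worth checking (an assumption, not something stated in the question): on XP Pro, Simple File Sharing maps all network logons to the Guest account, which blocks administrative shares such as C$ regardless of the NTFS permissions. A sketch of inspecting and clearing the registry value behind that setting:

        rem Run on Compx. forceguest=1 means remote logons are forced to Guest,
        rem which cannot open admin shares like \\compx\c$.
        reg query "HKLM\SYSTEM\CurrentControlSet\Control\Lsa" /v forceguest

        rem Set it to 0 (equivalent to unticking "Use simple file sharing" in
        rem Folder Options > View), then try \\compx\c$ with an administrator account.
        reg add "HKLM\SYSTEM\CurrentControlSet\Control\Lsa" /v forceguest /t REG_DWORD /d 0 /f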

    Read the article

  • Simple Distributed Disconnected way to sync a directory

    - by Rory
    I want to start regularly backing up my home directory on my Ubuntu laptop, machine X. Suppose I have access to two different remote (Linux) servers that I can back up to, machines A & B. Machine X will be the master, and should be synced to A and B. I could just regularly run rsync from X to A and then from X to B. That's all I need. However, I'm curious if there's a more bandwidth-efficient, and hence faster, way to do it. Assuming X is going to be on residential-style broadband lines, and since I don't want to soak up the bandwidth, I would limit the transfer from X. A and B will be on all the time; X, however, will not be, so I'd also like to reduce the amount of time that X is transferring, potentially allowing A and B to spend more time transferring. Also, X won't be connected all the time. What's the best way to do this? rsync from X to A, then from A to B? Timing that right could be troublesome. I don't want to keep old files around, so if I were to rsync, the --del option would be used. Could that mean something might get transferred from A to B, then deleted from B, then transferred from A to B again? That's suboptimal. I know there are fancy distributed filesystems like Gluster, but I think that's overkill in this case, and might not fit with the disconnected nature.
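
    A minimal sketch of the plain approach described above (rsync from X to A, then from X to B), with the bandwidth cap and --del behaviour mentioned; host names and paths are placeholders:

        #!/bin/sh
        # Push the home directory from X to each server in turn. --bwlimit is in
        # KBytes/s so the residential upload isn't saturated; --del removes files
        # on the destination that no longer exist on X.
        SRC="$HOME/"
        for DEST in user@serverA:backups/home/ user@serverB:backups/home/; do
            rsync -az --del --bwlimit=100 "$SRC" "$DEST"
        done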

    Read the article

  • Download of ~100MB file from Joomla + Docman component is incomplete (sometimes)

    - by Knix
    Hi, I hope this is the right place for this. I am currently running Joomla v1 with a file management component called Docman on a Bluehost server. Some users (particularly those with slower download speeds) are experiencing partial file downloads with some of my larger files (~100MB). I would like to keep the Joomla + Docman installation. Is there anything I can do to resolve this issue? I would greatly appreciate any recommendations you may have. Thanks, Knix

    Read the article

  • Burn 30GB zip file to DVD

    - by Joel Coehoorn
    I have a 30GB zip file containing an archive of digital materials available in the school library that I want to burn to DVD. Of course, 30GB is far too large for a single DVD and the content is already zipped. I'm open to ideas, but leaning towards suggestions that will help me automatically spread the file over multiple DVDs, including a simple program to stitch it back together again later.
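
    One way this is commonly handled, as a sketch (assuming a Unix-style environment is available; the file name is a placeholder): cut the zip into DVD-sized pieces, burn each piece to its own disc, and concatenate the pieces later to restore the original archive.

        # 4400 MiB pieces fit on a single-layer DVD (roughly 4.38 GiB usable)
        split -b 4400m library-archive.zip library-archive.zip.part-

        # Burn each library-archive.zip.part-* file to its own DVD, then later:
        cat library-archive.zip.part-* > library-archive.zip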

    Read the article

  • Terminal command to change permissions to my 'Sites' folder and apply change to enclosed items?

    - by Ryan
    Using Snow Leopard, I'm having issues with permissions in my Sites folder. While I can navigate to localhost/~username and read any files or folders there, the same permissions have not been applied to enclosed items, and I get a 403 error trying to access them in the browser. If I select one of these enclosed folders and get info using Finder, I see the user 'Everyone' is set to 'No Access' but I can't change that (this behavior seems buggy, actually). And if I select my 'Sites' folder, the tool to 'Apply to enclosed items' is grayed out... Is there a Terminal command I can use to grant 'Read Only' access to my Sites folder, and all it contains, for the user 'Everyone'?
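
    A minimal sketch of one Terminal command that does this, assuming the standard ~/Sites location: directories need the execute (search) bit so the web server can traverse them, which the capital X handles.

        # Recursively give "everyone else" read access; X adds execute only on
        # directories (and on files that are already executable).
        chmod -R o+rX ~/Sites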

    Read the article

  • Generating and collaborating on network map diagrams?

    - by Ian C.
    I have to turn out some network topology maps for very large networks. I'd like the format for the maps to be something other people can also edit and contribute to regardless of what software I'm using on my Mac to build them. I don't mind spending money on my end for software, but I can't require that my clients spend any money. I also can't promise my clients are also using OS X -- they could be running Linux or Windows. Is there a best software application on OS X for producing maps that I can share with other, non-OS X, users? Is there a best format for sharing topology maps that I should use when exporting the maps to disk?

    Read the article

  • CMD: How do I delete all the contents of all directories (in the current directory) without deleting the directories themselves?

    - by merlin2011
    For example: I'm in the directory F:\Data. Inside this directory, I have four directories: 22179, 22915, 23459, and 23460. These directories have various content, including directories and files. I'm trying to run something like rmdir /s *\* to delete all the contents of these numbered directories while leaving the (now empty) directories themselves. Is there a one-liner that can do this, or do I have to loop through the sub-directories?
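
    rmdir won't expand a pattern like *\*, but a hedged sketch of the loop alluded to above, using only built-in cmd commands, could look like this (run from F:\Data; at an interactive prompt use %d and %s, inside a .bat file double them to %%d and %%s; it assumes no spaces in the directory names):

        rem For each top-level directory: delete its files (including hidden ones),
        rem then remove its subdirectories, leaving the now-empty directory in place.
        for /d %d in (*) do @(del /a /q "%d\*" & for /d %s in (%d\*) do @rmdir /s /q "%s")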

    Read the article

  • Maximise network transfer speed of various applications

    - by Alex
    When using nc, scp, or wget to transfer files between two machines on a dedicated 2Mbps link, I get speeds between 0.5 and 1 Mbps. However, when I use iperf -c 10.0.1.4 -t 20 -P 12 (for example) I can maximise the speed of the link (getting a stable 2Mbps). Is there a way to make single-stream transfers (such as those done by scp) utilise all or most of the link? Some kind of TCP settings, or iptables...?
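
    A single TCP stream is roughly limited to its window size divided by the round-trip time, so when many parallel iperf streams fill a link that one scp cannot, raising the socket buffer limits is a common first thing to try. A hedged sketch on Linux (the values are illustrative, not tuned for this particular link):

        # Allow larger TCP send/receive buffers so one stream can keep more data in flight
        sysctl -w net.core.rmem_max=4194304
        sysctl -w net.core.wmem_max=4194304
        sysctl -w net.ipv4.tcp_rmem="4096 87380 4194304"
        sysctl -w net.ipv4.tcp_wmem="4096 65536 4194304"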

    Read the article

  • find files whose name is smaller or greater than a given parameter

    - by Tzury Bar Yochay
    Say that in a given directory I have: tzury@x200:~/Desktop/sandbox$ ls -l total 20 drwxr-xr-x 2 tzury tzury 4096 2011-03-09 10:19 N00.P000 drwxr-xr-x 2 tzury tzury 4096 2011-03-09 10:19 N00.P001 drwxr-xr-x 2 tzury tzury 4096 2011-03-09 10:19 N00.P002 drwxr-xr-x 2 tzury tzury 4096 2011-03-09 10:19 N00.P003 drwxr-xr-x 2 tzury tzury 4096 2011-03-09 10:19 N00.P004 drwxr-xr-x 2 tzury tzury 4096 2011-03-09 10:19 N01.P000 drwxr-xr-x 2 tzury tzury 4096 2011-03-09 10:19 N01.P001 drwxr-xr-x 2 tzury tzury 4096 2011-03-09 10:19 N01.P002 I am looking for a bash way to grab the list of files whose names are either greater or smaller than a given parameter. For instance: $ my_finder lt N00.P003 shall return N00.P000, N00.P001 and N00.P002, and $ my_finder gt N00.P003 shall return N00.P004, N01.P000, N01.P001 and N01.P002. I was thinking of iterating over for name in $(ls) and testing $name != $2, but I believe there are more elegant ways of doing so.
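
    A hedged sketch of what such a my_finder script could look like, relying on bash's lexicographic string comparison inside [[ ]]; the script name and usage follow the example above:

        #!/bin/bash
        # Usage: my_finder lt|gt NAME
        # Prints entries in the current directory whose names sort before (lt)
        # or after (gt) NAME in plain string order.
        op=$1; ref=$2
        for name in *; do
            case $op in
                lt) [[ $name < $ref ]] && echo "$name" ;;
                gt) [[ $name > $ref ]] && echo "$name" ;;
            esac
        done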

    Read the article

  • Converting PowerPoint to PDF: solutions?

    - by OWiz
    I asked a version of this question earlier, but I'm in need of other solutions, so this is a more pointed question. I need a server-based solution for converting PPT files to PDF files. This solution can either sit on the current web server as a console command-triggered service, be integrated into the C# code of the web app itself, or be its own server. It also can't be based on LibreOffice or OpenOffice, as those two have problems converting SmartArt. I'm currently using LibreOffice. I've tried PowerPoint console commands combined with a PDF driver, but I can't get that to work from C#. I've tried a .vbs script, but that briefly opens the PowerPoint window.

    Read the article

  • Automatically reboot Windows 8 if no internet activity

    - by GrapeCamel
    I have a media server located in a very inconvenient part of my house. Occasionally I will have to reset my router or it will reset itself. The issue is the PC loses connectivity for some reason, and I am forced to walk outside, around the house, into the basement, over a bunch of toys and weights and boxes, to push a button to reboot it. I would love to have it check itself every 5-10 minutes and auto reboot if it is unable to ping a given address/IP. Any ideas how to accomplish this?
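
    A hedged sketch of one way to do this with only built-in tools; the script path, target address, and interval are placeholders:

        @echo off
        rem watchdog.bat -- reboot if a known-good address stops answering pings
        ping -n 2 8.8.8.8 >nul
        if errorlevel 1 shutdown /r /t 0 /f

        rem Run it every 5 minutes as SYSTEM via Task Scheduler, e.g.:
        rem schtasks /create /tn "NetWatchdog" /tr "C:\scripts\watchdog.bat" /sc minute /mo 5 /ru SYSTEM

    One caveat: ping can report success on some "destination unreachable" replies, so a stricter check might search the output for "TTL=" instead of relying on the errorlevel alone.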

    Read the article

  • linux: upload / download difference on network shares

    - by Batsu
    I have a Red Hat Enterprise Linux 6 machine (with SELinux) which shows a significant difference in speed between download and upload (the latter significantly slower) of files shared over the LAN. The bottleneck seems to be the output of the Linux machine, since I get a rate of around 1Mb/s when:
      - WinXP machines download files shared (using Samba) by the RHEL machine
      - uploading files from the RHEL machine to a WinXP machine's shared folder
    while:
      - uploading from the XP machines to the Linux shares
      - downloading the XPs' shares on the RHEL machine
      - any share between Windows machines
    all run smoothly (around 50Mb/s). Since the upload from RHEL to the WinXP share is slowed too, I would exclude an issue in the configuration of Samba. What could possibly determine this limit in the upload speed? Update: iptables doesn't show any output rule, and disabling it doesn't make any noticeable difference, so I would rule it out too.

    Read the article

  • Copying a large directory tree locally? cp or rsync?

    - by Rory
    I have to copy a large directory tree, about 1.8 TB. It's all local. Out of habit I'd use rsync, however I wonder if there's much point, and if I should rather use cp. I'm worried about permissions and uid/gid, since they have to be preserved in the copy (I know rsync does this). As well as things like symlinks. The destination is empty, so I don't have to worry about conditionally updating some files. It's all local disk access, so I don't have to worry about ssh or the network. The reason I'd be tempted away from rsync is that rsync might do more than I need. rsync checksums files. I don't need that, and am concerned that it might take longer than cp. So what do you reckon, rsync or cp?
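
    For what it's worth, rsync only checksums files when asked to with -c; its default quick check compares size and modification time, and local copies are sent whole-file. A minimal sketch of either tool preserving ownership, permissions, timestamps, and symlinks (paths are placeholders):

        # Either of these copies /src/tree to /dst/tree, keeping uid/gid, modes and symlinks
        cp -a /src/tree /dst/
        rsync -aH /src/tree /dst/    # -H additionally preserves hard links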

    Read the article

  • rsync windows to linux permission denied

    - by user64908
    Using the command rsync -avzP --delete --omit-dir-times ../../ [email protected]:/var/www/mysite/ I'm getting: rsync: mkstemp "/var/www/mysite/.." failed: Permission denied (13). If ext is in the www-data group, should I still set all the files to be owned by user www-data? I am trying to publish the files with rsync and then set the permissions using sudo chown -R www-data doc and sudo chgrp -R www-data doc, but I can't even rsync because of the permission denied. SSH works fine, and so does rsync, except when it tries to write over or update some of the files in /var/www.
    Client:
      - Windows 7
      - Cygwin 1.7.16 (GNU bash, version 4.1.10(4)-release (i686-pc-cygwin))
      - rsync version 3.0.9, protocol version 30
    Server:
      - Ubuntu 12.04
      - Apache2
      - Root accounts [ubuntu,ext]
      - Groups [www-data]
      - sudo vigr shows www-data:x:33:ubuntu,ext
    I have already configured this: http://stackoverflow.com/questions/2124169/cwrsync-ignores-nontsec-on-windows-7 and this article has also managed to confuse me: http://unix.stackexchange.com/questions/41687/how-should-i-rsync-files-in-var-www-if-i-want-them-to-be-owned-by-www-data. What is the right procedure?
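
    A sketch of one common arrangement (an assumption about the setup, not the only answer): make www-data the group owner of the tree, let the group write to it, and have new files inherit the group via the setgid bit, so the ext user (who is in www-data) can push with plain rsync. The server name below is a placeholder.

        # Run once on the server:
        sudo chgrp -R www-data /var/www/mysite
        sudo chmod -R g+w /var/www/mysite
        sudo find /var/www/mysite -type d -exec chmod g+s {} \;

        # After that, an ordinary rsync from the Cygwin side should be able to write:
        rsync -avzP --delete --omit-dir-times ../../ ext@server:/var/www/mysite/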

    Read the article
