Search Results

Search found 46908 results on 1877 pages for 'managing files and folder'.


  • SQL Server Backup problem when browsing to the directory

    - by Richard West
    I want to allow a group (e.g. 'BackupManagers') to perform only backup and restore operations on certain databases. When creating the BackupManagers user account I checked db_backupoperator. When the user logs in to create a backup, they get an error message similar to the following when they select Tasks - Backup, click Add in the destination block, then click the "..." button to browse:

        TITLE: Locate Database Files - MYSERVER\SQL2005
        E:\MSSQL\Backup
        Cannot access the specified path or file on the server. Verify that you have the necessary security privileges and that the path or file exists. If you know that the service account can access a specific file, type in the full path for the file in the File Name control in the Locate dialog box.

    I have confirmed that the user has permissions to the folder. I have even created a share to this folder and had them access it through Explorer; they are able to create and delete files within the folder. I have found that if they type in the path to the file instead of using the "..." button to browse the directory tree, they can create a backup file fine. Why is the browse button not working as expected? Thanks!
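
    A hedged workaround, not from the original post: members of db_backupoperator can run BACKUP DATABASE themselves, so the backup can be scripted without ever opening the browse dialog. MyDB below is a hypothetical database name; the server and path come from the question.

        rem Sketch only: back up directly instead of browsing in the Locate dialog
        sqlcmd -S MYSERVER\SQL2005 -E -Q "BACKUP DATABASE MyDB TO DISK = N'E:\MSSQL\Backup\MyDB.bak'"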

    Read the article

  • Filesystem access through web interface

    - by Jorge Suárez de Lis
    I have an SSH+Samba server so people can access its files from anywhere on the network. I thought it would also be interesting to provide access through a web interface, so they can get to the files even when they don't have access to the VPN or a Samba/SSH client, something like the Ubuntu One or Dropbox web interface. The HTTP server could be on the same machine as the SSH+Samba server, so it would just need to provide access to local files and some way to log in with their username/password. Does anyone know of software like this?

    Read the article

  • Executing Secondary Applications

    - by JooBlow
    I have an application I am attempting to make "portable". The application contains a lot of secondary utility functions that I would like to execute on external files (from outside the app). I tried adding them to the build process, but I didn't get any executables for them (just the main one and a few others). Is there a way to get these to execute? They are basically command-line utility functions that process some text files; they use large files in the distribution and are also used by the main application. Thanks

    Read the article

  • Mac friendly file sharing from VirtualBox

    - by kitsched
    I have set up Ruby on Rails on Ubuntu in a VirtualBox instance on my PC, enabled Samba, and I'm connecting to it over the home network from my Mac. All is fine, except that I have some issues deleting files from inside applications: e.g. in Sublime Text 2, when I right-click a file in the sidebar and select Delete, nothing happens (same in my Git client). To be able to delete files I have to navigate to the folder in Finder (which leaves those nasty .DS_Store files scattered all around) or issue the delete command from the terminal (inconvenient). If you're asking why I'm using VirtualBox for Rails instead of doing the development directly on the Mac, it's because of the ease of portability. So my question is: are there any network sharing options I could use to make the Linux instance play nicer with my Mac?

    Read the article

  • file-name encoding problems

    - by tenhouse
    I googled this topic but couldn't find what I was looking for. The following happened to me: I had my files stored on an NTFS USB hard disk; because of space problems I moved them to an ext3 system. Somehow the filename encoding got screwed up (the content is still OK, as far as I saw). My files now look like the following:

        Kküken <--- should have an "ü"
        Jäger <--- should be an "ä"
        Zwölf <--- should be an "ö"
        fünfte <--- should be an "ü"

    These are just examples, but they already give me my first question: why does "ü" have two different representations? (Maybe I screwed up before, and now I have a mix of several different encoding layers?) I tried the following command:

        convmv -r -f UTF-8 -t ISO-8859-1 *

    This command worked for some files (for example Zwölf) but not for all: ISO-8859-1 doesn't cover all the characters needed for "fünfte". So I guess it must be another encoding - but which? How can I find this out? And is there any way I can still fix all of this?
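
    One way to answer the "which encoding?" question is to look at the raw bytes of each name: a UTF-8 "ü" is 0xC3 0xBC, while a double-encoded one shows up as 0xC3 0x83 0xC2 0xBC. A minimal sketch, assuming a typical Linux userland and a current convmv:

        # Dump the raw bytes of each filename in the current directory
        for f in *; do printf '%s\n' "$f" | hexdump -C | head -n 1; done

        # If the names turn out to be double-encoded UTF-8, convmv has a dedicated
        # repair mode; it only renames once --notest is added
        convmv -r -f utf8 -t utf8 --fixdouble .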

    Read the article

  • I made an .htaccess template; is there anything else that should be added or changed?

    - by purpler
    # DEFAULTS
    ServerSignature Off
    AddDefaultCharset UTF-8
    DefaultLanguage en-US
    SetEnv TZ Europe/Belgrade
    SetEnv SERVER_ADMIN [email protected]

    # Rewrites
    RewriteEngine On
    RewriteBase /

    # Redirect to WWW
    RewriteCond %{HTTP_HOST} ^serpentineseo.com
    RewriteRule (.*) http://www.serpentineseo.com/$1 [R=301,L]

    # Cache media files
    <FilesMatch "\.(gif|jpg|jpeg|png|ico|swf|js)$">
        Header set Cache-Control "max-age=2592000, public"
    </FilesMatch>
    <FilesMatch "\.(js|css|pdf|swf)$">
        Header set Cache-Control "max-age=604800"
    </FilesMatch>
    <FilesMatch "\.(html|htm|txt)$">
        Header set Cache-Control "max-age=600"
    </FilesMatch>

    # DON'T CACHE
    <FilesMatch "\.(pl|php|cgi|spl|scgi|fcgi)$">
        Header unset Cache-Control
    </FilesMatch>

    # Deny access to .htaccess
    <Files .htaccess>
        Order allow,deny
        Deny from all
    </Files>

    Read the article

  • Can't figure out why hard drive is full [closed]

    - by Belgin Fish
    Possible Duplicate: How do I find out what is using up all the space on my / partition? No free disk space.

    I have 2 hard drives in my server: a main one that is 10GB and a separate one that is 2TB. I'm storing all the files on the second one, and the df -h output looks like this:

        Filesystem   Size  Used  Avail  Use%  Mounted on
        /dev/sda2    9.2G  8.8G     0   100%  /
        tmpfs        1.5G     0  1.5G     0%  /lib/init/rw
        udev         1.5G  148K  1.5G     1%  /dev
        tmpfs        1.5G     0  1.5G     0%  /dev/shm
        /dev/sda4    1.8T  747G  981G    44%  /home
        /dev/sda4    1.8T  747G  981G    44%  /usr/lib/cgi-bin

    I just can't figure out why the first one is full when all the files are being stored in /usr/lib/cgi-bin. I'm running Debian, and I can't seem to find any files that would take up 8.8GB that aren't on the second hard drive. Thanks!
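
    Not part of the question, but a hedged way to chase down the 8.8GB: du with -x stays on one filesystem, so it summarizes / without descending into the /dev/sda4 mounts.

        # Per-directory usage of the root filesystem only (skips /home etc.)
        du -xh --max-depth=1 / 2>/dev/null | sort -h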

    Read the article

  • Nagios configuration management

    - by HannesFostie
    I am going to implement Nagios (most likely, anyway; it could turn out to be another tool) and I was wondering if anyone would like to share their best practices for creating, managing and maintaining the config files with scalability and manageability in mind, as I find they might quickly become a real mess. Any tips, examples or even full configurations would be most welcome, and I'd happily look them over. Tools would be welcome as well. I tried NConf so far, but the generated config files don't seem to do what was promised (not including the parent information, for one, and they are just a PITA to get working - they generate a ton of errors when checked with the script supplied by Nagios). Thanks
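
    As one illustration of the usual scaling advice (templates plus one small file per host under a cfg_dir), here is a minimal sketch in Nagios's own object syntax; the file paths and host names are made up:

        # /etc/nagios3/conf.d/templates.cfg -- shared defaults live in one place
        define host {
            name                  linux-box       ; template name
            use                   generic-host
            check_command         check-host-alive
            max_check_attempts    5
            register              0               ; template only, not a real host
        }

        # /etc/nagios3/conf.d/web01.cfg -- each host file stays tiny
        define host {
            use                   linux-box
            host_name             web01
            alias                 Primary web server
            address               10.0.0.11
        }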

    Read the article

  • How to set JS source directory in apache2?

    - by highBandWidth
    I am trying to run a very basic webserver for development/debugging. The static HTML seems to be delivered correctly, but it seems that the JavaScript libraries are not being delivered to the browser. The page HTML says something like:

        <html>
        <head>
        <script type='text/javascript' src="/lib/json.js"></script>
        ...

    Now, I have set up a mapping for /lib/ in my httpd.conf as:

        ScriptAlias /lib/ "/SomeFolder/lib/"

    When I do this, it can't fetch the files, because this is what I see in my Apache error log:

        [error] [client ::1] client denied by server configuration: /SomeFolder/lib/json.js, referer: http://localhost/SomeSite

    It seems that Apache is not allowing access to the folder, so I add this to httpd.conf:

        <Directory "/SomeFolder/lib/">
            Allow from all
        </Directory>

    After this, browsing the page still does not run the JS; instead I see the following error in my Apache error log:

        [error] [client ::1] (13)Permission denied: exec of '/SomeFolder/lib/json.js' failed, referer: http://localhost/SomeSite

    So now it seems that Apache is trying to run the JS files on the server like a CGI script or something, but I have not made that folder a cgi-bin folder. The only lines where SomeFolder is mentioned by name are these lines in httpd.conf:

        ScriptAlias /lib/ "/SomeFolder/lib/"
        <Directory "/SomeFolder/lib/">
            Allow from all
        </Directory>
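
    The second error is the clue: ScriptAlias tells Apache to execute everything under the mapped path as a CGI program. A hedged fix, keeping the question's paths and Apache 2.2-style access control, is to use a plain Alias instead:

        # Alias serves files as-is; ScriptAlias would exec them as CGI
        Alias /lib/ "/SomeFolder/lib/"
        <Directory "/SomeFolder/lib/">
            Order allow,deny
            Allow from all
        </Directory>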

    Read the article

  • Migration of egroupware from one Ubuntu server to another

    - by Chris Schmidt
    I am new to server administration and have been attempting to migrate from one Ubuntu server to another for 4 days now. I am having a problem with the migration of the eGroupware settings. Specifically, I need to find where the knowledge base saves its files on the local machine. I have the database set correctly and it is using the previous settings. However, articles in the knowledge base are not showing the images associated with them. I have tried everything to find the directory on the old server that stores the data files uploaded to the knowledge base, but I cannot. If there is another way to import these files, or if someone knows where they are saved, I would really appreciate the help. -Chris
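
    A hedged way to locate the store, not from the original post: eGroupware keeps its upload directory in its configuration, so searching the old server for one image known to have been uploaded, or for the files_dir setting, usually finds it. The paths and file name below are guesses, not confirmed for this install.

        # Search the old server for one image you know was uploaded
        find / -xdev -type f -name 'known-article-image.png' 2>/dev/null

        # Or look for the configured files_dir in common eGroupware locations
        grep -r "files_dir" /etc/egroupware /var/lib/egroupware 2>/dev/null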

    Read the article

  • Best Practices for a Network File Share?

    - by Chris
    So we have a file share that was started 10 years or so ago, and it started off with the best intentions. But now it has gotten bloated: there are files in there that nobody knows who put there, it's hard to find information, etc. You probably know the problem. So what I'm wondering is what people do in this situation. Does anyone know of a decent program that can go through a file share and find files that nobody has touched? Duplicate files? Any other suggestions for cleaning this mess up?
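
    Not a product recommendation, but if the share can be reached from a Linux-style shell, a rough first pass is scriptable. A sketch assuming the share is mounted at /mnt/share:

        # Files not modified in roughly five years, oldest first
        find /mnt/share -type f -mtime +1825 -printf '%TY-%Tm-%Td %s %p\n' | sort

        # Duplicate candidates: checksums that appear more than once
        find /mnt/share -type f -print0 | xargs -0 md5sum | sort | uniq -w32 --all-repeated=separate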

    Read the article

  • 403 Forbidden when trying to download file that was uploaded using SSH

    - by Simon Hartcher
    I have FTP access to an Apache server on Linux, which I use to upload files so that they can be downloaded from the web. I was recently granted SSH access for extra permissions and figured it would be quicker to download files directly to the server, instead of downloading them to my machine and then FTPing them to the server. When I downloaded a file over SSH directly to the server and placed it in the public_html directory, it was not visible from the web. The permissions (as seen from SSH and the FTP client) were the same as all the other files that are visible, but it did not appear in the directory listing, and if I typed the filename into my browser I would get a 403 error. Obviously, when I FTP a file to the server, something else happens that makes it web-visible that I am not currently privy to. What am I missing that is causing the file to be invisible from the web?
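
    The usual suspect here is an ownership or mode difference between what the FTP daemon creates and what the SSH session's umask produced; a quick hedged check (the file name is a placeholder):

        # Compare the invisible file against a working one
        ls -l ~/public_html

        # Make it readable by the web server user, as FTP uploads typically are
        chmod 644 ~/public_html/downloaded-file.zip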

    Read the article

  • download management

    - by Jonathan
    I download many files, usually 2 or 3 a day, often 10 or so. Some of them are duplicates because I just can't be bothered to find the original in my downloads folder. I have previously tried DAP and used it to create a new subfolder for each day's downloads, yet I have found this insufficient, as sometimes I wish to find files by name or file type, or I have parts of a download spread over more than one day. Another problem I have found is with zips/rars/etc.: after downloading and extracting them, I then have both the archive and the folder. I like how on a Mac the zip is automatically extracted after it has been downloaded and the zip is removed. What I'd like is to sort the downloads by date, but dynamically, so they all just sit in the big downloads folder, yet I can press a button and see all the files from a particular site, from a particular day, or of a certain file type. Is there any software that will do this? I use Chrome as a browser but also have Firefox and like that. Jonathan

    Read the article

  • Windows XP error message: "Windows cannot find 'explorer.exe'"

    - by Meysam
    In Windows XP I can open "My Computer" and see all the hard drives. I can also see the explorer.exe process running among the other processes in Task Manager. But after opening "My Computer", when I double-click one of the drives to open it, I get the following error message:

        Windows cannot find 'explorer.exe'. Make sure you typed the name correctly, and then try again. To search for a file, click the Start button, and then click Search.

    Although I was able to detect and remove several suspicious files using Malwarebytes and Microsoft Security Essentials, the problem still remains. The interesting point is that if I right-click a folder and select Open or Explore from the menu, I can open the folder, but if I double-click the folder, it does not open and I get the above error message. How can I fix this problem? Any advice would be appreciated! Update: I formatted the C: drive (NTFS, a full format) and installed a fresh Windows XP on it. I no longer get this error when I double-click the C drive icon, but the same error still appears when I double-click the other drives. Maybe I should format them too!
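
    One commonly reported cause of exactly this double-click symptom (a hedged guess, not confirmed for this machine) is a leftover autorun.inf in each drive's root that still points at a removed malware executable. From a command prompt, per drive:

        rem Reveal and delete a leftover autorun.inf on drive D: (repeat per drive)
        attrib -r -s -h D:\autorun.inf
        del D:\autorun.inf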

    Read the article

  • Address bar in Finder?

    - by wag2639
    I'm used to knowing where all my files are (and I'm anal about it -- I don't need Mr. Jobs thinking he knows best about where my files should go). Is there a way to get an address bar to show up in Finder in OS X (10.5+), like in Explorer on Windows or Nautilus in GNOME? Edit: I also want to be able to copy the address bar's contents. Perhaps the workflow is different on a Mac, but I'm used to thoroughly sorting my files under many layers of folders, and when I need to upload or download something, or access a file on the command line, etc., I can copy and paste the path directly into the file dialog. To clarify, my goal is to have an experience like on Windows: press Ctrl + D (Cmd + L) and Ctrl + C.

    Read the article

  • Overcrowded Windows XP Folders

    - by BlairHippo
    I know that, technically, an individual Windows XP directory can hold an immense number of files (over 4.29 billion, according to a quick Google search). However, is there a practical ceiling where too many files in one directory start having an impact on reads of those files? If so, what factors would exacerbate or mitigate the issue? I ask because my employer has several hundred XP machines in the field at client sites, and the performance of some of the older ones is getting "sludgy". The machines download and display client-defined images, and my supervisor and I suspect that our slacktastic approach to cache management could be to blame. (Some of the directories have tens of thousands of images in them.) I'm trying to gather evidence to support or refute the theory before spending time on a coding fix.
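
    One concrete factor worth testing (an assumption to verify, not a diagnosis): NTFS generates an 8.3 short name for every new file, and that generation slows down noticeably in directories holding tens of thousands of similarly named files. It can be turned off on XP:

        rem Stop NTFS from generating 8.3 short names; test first, since some
        rem legacy installers still rely on them
        fsutil behavior set disable8dot3 1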

    Read the article

  • I started getting a weird message "Encrypting file system - Back up your file encryption key"

    - by Ove
    I started getting a strange message when I start my computer. An icon appears in the system tray, and a popup tells me "Encrypting file system - Back up your file encryption key". I know what EFS is, but I don't use it. To my knowledge, I don't have any encrypted files on my partition. I have searched using Total Commander on all the partitions for files that have the "encrypted" attribute, but I found nothing. So I don't have any encrypted files. Does anyone know what I did to get this message?
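
    Windows can list every EFS-encrypted file itself, which is a more direct check than searching for the attribute by hand; from a command prompt:

        rem /u touches every local file, /n prevents any changes; the net effect
        rem is a report of all encrypted files on the local drives
        cipher /u /n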

    Read the article

  • Cannot access drive in Windows 7 after scandisk lockup, but can in safe mode....

    - by Matt Thompson
    I ran scandisk on my external USB drive due to the inability to delete a few files. Windows asked me if I wanted to unmount the drive before the scan, warning me that it would be unusable until the scan was finished, and I said yes. During the scan my machine locked up, and I was forced to reboot it. When it came back up, I was unable to access the drive, getting the error "L: is not accessible, access is denied". Computer Management sees the drive, and it shows the proper amount of disk space used. I booted into safe mode and can access the drive with no problems, and I noticed that in Explorer all the folders have locks on their icons. I booted back into Windows, but still could not access the drive, getting the same error as above. However, if I right-click the drive, select Properties, and go to Customize, then in the folder pictures area select Choose File, a window opens up showing the root of the drive, with all the folders accessible, though again each folder icon has a lock on it. I can even copy files from the drive to another. So the files are not gone, and Windows can obviously access the drive no matter what it thinks, so there has to be a problem with the flag Windows put on the drive when it ran the original scan that failed. I was able to run a scan both in safe mode with no problems, and in Windows. In Windows, I received the "cannot access" error the first time I ran scandisk on it, but if I try again, it works fine. Any ideas on how to clear the flag that Windows set, so I can access the drive normally again?
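
    A hedged repair path, assuming the interrupted scan left the ACLs (rather than the volume's dirty bit) in a bad state; run from an elevated prompt and expect it to take a while on a large drive:

        rem See whether the volume is still flagged dirty from the aborted scan
        fsutil dirty query L:

        rem Retake ownership and reset permissions recursively (Windows 7)
        takeown /f L:\ /r /d y
        icacls L:\ /reset /t /c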

    Read the article

  • "I/O Error Occurred" in vSphere Client working with ESXi

    - by Chris
    I have a datastore set up in ESXi where I put all my ISOs. Somehow, something broke (I don't know what) and now I can't upload files to that (or any other) datastore. For large, ISO-sized files, the "Uploading..." dialog pops up, hangs for a while, and then "I/O Error Occurred" is displayed. For smaller files (in the 10MB neighborhood), the "Uploading..." dialog comes up, a progress bar starts moving, and it estimates the time remaining. Then it hangs at 1 second remaining for a while, and the same "I/O Error Occurred" comes up. Has anyone seen a problem like this?
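
    As a workaround while the datastore browser misbehaves (a sketch, not a fix for the underlying error): with SSH enabled on the ESXi host, large ISOs can be copied straight into the datastore's filesystem. The host and datastore names below are placeholders.

        # Copy an ISO directly into the datastore over SCP
        scp install-media.iso root@esxi-host:/vmfs/volumes/datastore1/ISOs/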

    Read the article

  • How to change drag & drop behaviour in Windows 7's explorer?

    - by Pekka
    I have a new touch screen and am playing around with its functionality. The most productive use for me is organizing files (literally) by hand. It's fun working through a list of files, dragging and dropping them to the right locations with your index finger, and it feels better on the wrist than mouse-clicking, too. The only problem is that when I drag and drop files across drives in Windows 7, the default behaviour is to copy the file instead of moving it. I know I can influence this using right-click, but that is of course not an option in my situation. How can I change the default drag-and-drop behaviour in Windows 7's Explorer?

    Read the article

  • Using GUI ftp on Win7 and Vista without additional software

    - by Stephen Jones
    Goal: provide a "no-software" method for less technical users to access a password-protected FTP location from Win7 and Vista (the existing approach for WinXP works). "No software" means without installing additional software (e.g. FileZilla, WinSCP), since the solution is supplied to external non-technical users.

    WinXP (works): using Windows Explorer, WinXP supports non-technical FTP access by pasting ftp://username:[email protected] into the address bar. The remote FTP site's files and directory structure become available and can be copied to/from easily (in the style of a local file copy/paste) by a less technical user.

    Win7/Vista (doesn't work): pasting the same URL into Windows Explorer on Win7 or Vista causes an error:

        An error occurred opening that folder on the FTP server. Make sure you have permission to access that folder. Details: The connection with the server was reset.

    Notes: a) The same username/password/server typed from the (DOS) command line achieves access to the server, but this is a more technical solution than desired; I am looking for a WinXP-equivalent solution. b) Under Control Panel / Internet Options / Advanced tab, the boxes for "Enable FTP folder view" and "Use Passive FTP" are ticked (enabled). c) Adding an inbound firewall rule for local port 20 (TCP) was attempted, with no difference in results (i.e. failure).

    Read the article

  • Enable FTP on OS X 10.8 Mountain Lion Server

    - by Oleg Trakhman
    There is a LAN comprising several Mac machines (iMac, Mac Pro, MacBook, etc.), an AirPort Express router and a Mac Mini server running OS X Server 10.8 (Mountain Lion Server). I need to share a folder on the Mac Mini server by FTP. What I have tried so far: I made a special partition for FTP access, called "Reports", so the shared folder is /Volumes/Reports. I gave access to every user and group in the system, and also enabled guest access. I checked the POSIX ACL, which is "rwxrwxrwx"; I checked the sharing settings in Preferences.app and Server.app; I checked that the users have access to the FTP service; and I enabled FTP in Server.app. I tried to access the shared folder (by FTP) via Cyberduck, via Finder, and via shell: ftp server.local. And here is what I got:

        $ ftp [email protected]
        Trying 10.0.2.2...
        Connected to server.local.
        220 10.0.2.2 FTP server (tnftpd 20100324+GSSAPI) ready.
        331 User ftpuser accepted, provide password.
        Password:
        530 User ftpuser may not use FTP.

    and

        $ ftp [email protected]
        Trying 10.0.2.2...
        Connected to server.local.
        220 10.0.2.2 FTP server (tnftpd 20100324+GSSAPI) ready.
        331 User admin accepted, provide password.
        Password:
        530 User admin denied by SACL.
        ftp: Login failed
        ftp>

    (admin is an administrator account; ftpuser is a special user account made to access FTP.) What am I doing wrong? Getting really tired of this...
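
    "Denied by SACL" means the account is missing from the FTP service's access group, which Server.app does not always surface. A hedged fix from the server's Terminal, using the documented service-ACL group name:

        # Add ftpuser to the FTP service ACL, then verify the membership
        sudo dseditgroup -o edit -a ftpuser -t user com.apple.access_ftp
        dseditgroup -o checkmember -m ftpuser com.apple.access_ftp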

    Read the article

  • How to use BT or emule across 2 or more hard drives?

    - by the searcher
    One difficulty with BT or eMule is that when the hard drive is full, we constantly need to move older files to a new hard drive so that we can download newer files. We can change BT's or eMule's settings so that the download folder points to the new hard drive, but then what if eMule hasn't finished downloading some files that are hard to find, and it is 92% done? In that case, we would like to keep the old setting so that when the last 8% arrives, it goes into the correct file (and the same for BT, if we haven't finished some file or if we want to seed something later). So is there a good way to point BT or eMule at 2 hard drives, or somehow let the new hard drive "merge" into the existing hard drive/folder?
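
    On Windows, one way to get the "merge" effect without touching the client's settings is to graft a folder on the new drive into the existing download tree as an NTFS junction: the client keeps writing to the same paths while the bytes land on the new disk. The folder names below are illustrative.

        rem Create the target on the new drive, then mount it inside the old tree
        rem (Vista/7 syntax; the link path must not already exist)
        mklink /J "C:\Downloads\archive" "E:\Downloads\archive"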

    Read the article

  • Clean URLs on Hiawatha

    - by Botto
    I am using the Hiawatha web server and running Drupal on a FastCGI PHP server. The Drupal site uses ImageCache, which requires either private files or clean URLs. The issue I am having with clean URLs is that requests for files are being rewritten to index.php as well. My current config is:

        UrlToolkit {
            ToolkitID = drupal
            RequestURI exists Return
            Match (/files/*) Rewrite $1
            Match ^/(.*) Rewrite /index.php?q=$1
        }

    The above does not work. Drupal's Apache setup is:

        <Directory /var/www/example.com>
            RewriteEngine on
            RewriteBase /
            RewriteCond %{REQUEST_FILENAME} !-f
            RewriteCond %{REQUEST_FILENAME} !-d
            RewriteRule ^(.*)$ index.php?q=$1 [L,QSA]
        </Directory>
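
    A hedged observation: the group in (/files/*) captures only the /files prefix, so Rewrite $1 cannot reproduce the full path, while Drupal's own .htaccess simply rewrites everything that is not an existing file or directory. A sketch that mirrors those Apache rules more closely, assuming Hiawatha's documented UrlToolkit semantics:

        UrlToolkit {
            ToolkitID = drupal
            # Serve anything that exists on disk as-is (mirrors !-f / !-d)
            RequestURI exists Return
            # Everything else, including not-yet-generated ImageCache paths,
            # goes to Drupal so it can create the derivative
            Match ^/(.*) Rewrite /index.php?q=$1
        }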

    Read the article

  • 3ware RAID 10 (4-drive): suggested stripe size?

    - by dasko
    Looked around on the site but found nothing really concrete on my question. I will have about 120GB of data total; the files are made up of 5MB files (Excel, Word) and about 25 .pst files that are about 1.2GB each. Yes, they use the .pst files over the network; even though that is not recommended, this legacy setup has run without issue and we will continue to support it for another year or so. I need to know what you think about a stripe size of 256KB for the RAID 10, based on the above requirements. I did try to benchmark with these settings and it seems alright, without any real issue; I'm just trying to rule out anything I might have missed. Thanks.

    Read the article
