Search Results

Search found 46908 results on 1877 pages for 'managing files and folder'.


  • Merging Two KML Files to Display Them with Different Marker Icons on Google Maps

    - by Maxim Z.
    Let's say that I have two spreadsheets with addresses. I uploaded these spreadsheets into Google Fusion Tables, geocoded the addresses, and exported the results as KML files. Now I want to merge these two KML files while maintaining the location data and using it to map the points with Google Maps. I found an easy way to merge the KML files: import both of them into a "My Maps" map in Google Maps. However, my problem is this: when I do that, all of the locations in my data get the same marker icon on the map. From past experience, I know that these markers can be defined inside the KML files. Is it possible to combine these two KML files while giving one file's points one marker icon and the other file's points another? In case my question is confusing: what I mean is giving the first set of points blue markers, for example, and the other set red markers, so that they can be overlaid.
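    For reference, marker icons in KML are set by Style elements that each Placemark references via styleUrl, so the two files can keep distinct icons after merging. A minimal sketch (the icon URLs below are Google's stock blue/red dots, used as an illustration rather than taken from the original files):

        <Style id="blueMarker">
          <IconStyle>
            <Icon><href>http://maps.google.com/mapfiles/ms/icons/blue-dot.png</href></Icon>
          </IconStyle>
        </Style>
        <Style id="redMarker">
          <IconStyle>
            <Icon><href>http://maps.google.com/mapfiles/ms/icons/red-dot.png</href></Icon>
          </IconStyle>
        </Style>

        <!-- points merged from the first file -->
        <Placemark>
          <styleUrl>#blueMarker</styleUrl>
          <Point><coordinates>-122.084,37.422,0</coordinates></Point>
        </Placemark>
        <!-- points merged from the second file reference #redMarker instead -->

    Tagging every Placemark from one source file with one styleUrl and every Placemark from the other with the second is enough for Google Maps to render two distinct marker sets.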

    Read the article

  • How to rotate Tomcat 6 logs on Windows every night

    - by Danilo Brambilla
    Hi all, our Tomcat 6 runs on a Windows Server 2003 server and writes its logs to the Program Files\Apache Software Foundation\Tomcat 6.0\logs folder. Only catalina.YYYY-MM-DD.log rotates every night. The admin, host-manager, jakarta, localhost and manager logs, plus stderr and stdout, do not rotate and are dated from the last server restart. These files are mostly empty and always locked. How can I set Tomcat to rotate all of these logs every night (if possible, without a server/service restart)? Thank you in advance for your help.
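    For context, the dated files that already rotate are written by Tomcat's JULI FileHandlers, configured in conf/logging.properties; a typical Tomcat 6 stanza looks roughly like this (handler names and levels vary per install, so treat this as a sketch):

        handlers = 1catalina.org.apache.juli.FileHandler, \
                   2localhost.org.apache.juli.FileHandler, \
                   java.util.logging.ConsoleHandler

        1catalina.org.apache.juli.FileHandler.level = FINE
        1catalina.org.apache.juli.FileHandler.directory = ${catalina.base}/logs
        1catalina.org.apache.juli.FileHandler.prefix = catalina.

    stdout and stderr, by contrast, are created by the procrun service wrapper rather than by JULI, which is why they stay locked and only roll when the service restarts; rotating those two nightly generally isn't possible without bouncing the service or routing the output through an external tool.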

    Read the article

  • Folder/File permission transfer between alike file structure

    - by Tyler Benson
    So my company recently upgraded to a new SAN, but the person who copied all the data over must have done a drag-and-drop or basic copy to move everything; Xcopy is not something he cared to use. Now I am left with the task of duplicating all the permissions. The structure has changed a bit (more files/folders have been added) but has mostly stayed the same. I'm looking for suggestions to help automate this process. Can I use Xcopy to transfer ONLY permissions from one tree to another? Would it just ignore any folders/permissions that don't line up? Thanks a ton in advance, Tyler
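    For what it's worth, the stock Windows tools can re-apply ACLs without re-copying data when the two trees share the same relative layout - a hedged sketch (icacls ships with Vista/Server 2008 and with Server 2003 SP2; the paths below are placeholders):

        rem export the ACLs of the old tree into a text file
        icacls D:\OldShare /save acls.txt /t /c

        rem re-apply them under the new root; entries whose relative path no
        rem longer exists are skipped (/c keeps going past errors)
        icacls E:\NewShare /restore acls.txt /c

    robocopy also has a /SECFIX switch that refreshes security on files it would otherwise skip, which can work if the copy is re-run from the original source.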

    Read the article

  • copying folder and file permissions from one user to another after switching domains [closed]

    - by emptyspaces
    Please excuse the title, this was the best way I could think to describe this scenario without an entire paragraph. I am using C#. Currently I have a file server running Windows Server 2003 set up on a domain, we will call this oldDomain, and I have about 500 user accounts with various permissions on this server. Because of restrictions out of my control we are abandoning this domain and using another one that is more dominant within the organization, we will call this newDomain. All of the users that have accounts on oldDomain also have accounts on newDomain, but the usernames are completely different and there is no link between the two. What I am hoping to do is generate a list of all user accounts and their appropriate SIDs from AD on oldDomain; I already have this part done using dsquery and dsget. Then I will have someone go through and match all of the accounts from oldDomain to the correct username on newDomain, ultimately leaving me with a list of SIDs from oldDomain and the appropriate username from newDomain. Now I am hoping to copy the file and folder permissions from the old user on oldDomain to the new user on newDomain once I join the server to newDomain. Can anyone tell me the best way to copy permissions from the old SID to the matching user on newDomain? There are a bunch of articles out there about copying permissions from user A to user B, but I wanted to check what the recommended practice is here, since there are a ton of directories.
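    One command-line route worth knowing about (as an alternative to doing it all in C#) is subinacl from the Windows resource kit, which rewrites ACL entries from one account to another in place. A hedged sketch, assuming the finished mapping lands in a two-column CSV of oldaccount,newaccount - the file name and share root are placeholders:

        rem single pair, applied recursively under the share root
        subinacl /subdirectories "D:\Data\*.*" /replace=OLDDOMAIN\jsmith=NEWDOMAIN\john.smith

        rem loop over the whole mapping file (batch-file syntax; use %a at a prompt)
        for /f "tokens=1,2 delims=," %%a in (usermap.csv) do (
            subinacl /subdirectories "D:\Data\*.*" /replace=%%a=%%b
        )

    Running one /replace pass per user over a large tree is slow, so it may be worth benchmarking against a programmatic approach (enumerating each DACL once and swapping all matched SIDs in a single write) before committing to either.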

    Read the article

  • RHEL5: Can't create sparse file bigger than 256GB in tmpfs

    - by John Kugelman
    /var/log/lastlog gets written to when you log in. The size of this file is based on the largest UID in the system: the larger the maximum UID, the larger this file is. Thankfully it's a sparse file, so the size on disk is much smaller than the size ls reports (ls -s reports the size on disk).

    On our system we're authenticating against an Active Directory server, and the UIDs users are assigned end up being really, really large. Like, say, UID 900,000,000 for the first AD user, 900,000,001 for the second, etc. That's strange but should be okay. It results in /var/log/lastlog being huge, though - once an AD user logs in, lastlog shows up as 280GB. Its real size is still small, thankfully.

    This works fine when /var/log/lastlog is stored on the hard drive on an ext3 filesystem. It breaks, however, if lastlog is stored in a tmpfs filesystem. Then it appears that the max file size for any file on the tmpfs is 256GB, so the sessreg program errors out trying to write to lastlog. Where is this 256GB limit coming from, and how can I increase it? As a simple test for creating large sparse files I've been doing:

        dd if=/dev/zero of=sparse-file bs=1 count=1 seek=300GB

    I've tried Googling for "tmpfs max file size", "256GB filesystem limit", "linux max file size", things like that. I haven't been able to find much. The only mention of 256GB I can find is that ext3 filesystems with 2KB blocks are limited to 256GB files. But our hard drives are formatted with 4K blocks, so that doesn't seem to be it - not to mention this is happening in a tmpfs mounted ON TOP of the hard drive, so the ext3 partition shouldn't be a factor. This is all happening on a 64-bit Red Hat Enterprise Linux 5.4 system. Interestingly, on my personal development machine, which is a 32-bit Fedora Core 6 box, I can create 300GB+ files in tmpfs filesystems no problem. On the RHEL5.4 systems it is no go.
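    A quick cross-check that takes dd out of the picture (hedged: truncate arrived in coreutils 7.0, so it may be missing on a stock RHEL5 box):

        # try to create a 300GB sparse file directly on the tmpfs mount
        truncate -s 300G /dev/shm/sparse-test

    If this fails at the same threshold, the ceiling is in the tmpfs/kernel build rather than anything about how the file is being written.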

    Read the article

  • How to set the VirtualDocumentRoot based on the files within

    - by Chuck Vose
    I'm trying to set up Apache to use the VirtualDocumentRoot directive, but my sites aren't all exactly the same. Most of the sites have a drupal folder which should be the root, but there are a few really old drupal sites, a few rails sites, some django sites, etc. that want the document root to be / or some other folder. Is there a way to set up VirtualDocumentRoot based on a conditional, or is there a way to use RewriteRule/RewriteCond to detect that / is the incorrect folder when there is a drupal folder or a public folder? Here's what I have so far:

        <VirtualHost *:80>
            # Wildcard ServerAlias, this is the default vhost if no specific vhost matches first.
            ServerAlias *.unicorn.devserver.com

            # Automatic ServerName, based on the HTTP_HOST header.
            UseCanonicalName Off

            # Automatic DocumentRoot. This uses the 4th level domain name as the document root,
            # for example http://bar.foo.baz.com/ would respond with /Users/vosechu/Sites/bar/drupal.
            VirtualDocumentRoot /Users/vosechu/Sites/%-4/drupal
        </VirtualHost>

    Thanks in advance! -Chuck
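    One approach that may work here (a hedged sketch, not verified against this exact layout): point VirtualDocumentRoot at the site folder itself, then let mod_rewrite prepend /drupal only when such a folder actually exists:

        <VirtualHost *:80>
            ServerAlias *.unicorn.devserver.com
            UseCanonicalName Off
            VirtualDocumentRoot /Users/vosechu/Sites/%-4

            RewriteEngine On
            # if the resolved docroot has a drupal/ subfolder, serve from it
            RewriteCond %{REQUEST_URI} !^/drupal/
            RewriteCond %{DOCUMENT_ROOT}/drupal -d
            RewriteRule ^/(.*)$ /drupal/$1 [L]
        </VirtualHost>

    The same RewriteCond/RewriteRule pair can be repeated for a public/ folder. One caveat: whether %{DOCUMENT_ROOT} reflects the mod_vhost_alias interpolation at rewrite time depends on the Apache version, so this wants testing before rollout.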

    Read the article

  • Serving only certain files from a directory to users on IIS7

    - by HarbingTarbl
    I have a need to show the most up-to-date version of a certain file in a directory to users who access a folder on my site (let's call this folder logs). I can't just move the file into the folder, as another process relies on being able to find and edit this file while it is running. At first I thought I could just create a folder on my site, give it the correct permissions, and then create a symbolic link to the file. However, it seems IIS7 does not follow symlinks. Another solution would be to create a PHP script that pulls the correct file and displays it, but that felt like over-engineering the solution. I know that on Apache this would be simple, but I can't figure out how to do it with IIS7. To give an idea of the folder structure I'm working with, the directory looks like this:

        Root
        --File I need to serve
        --File containing plain-text passwords
        --Other folders/files

    I can't move any of these files. If I just serve the entire directory using virtual directories in IIS, I'll also be sharing files and folders containing configuration and other sensitive information.
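    Two options that fit IIS7 (hedged sketches; the file and segment names below are placeholders): a hard link puts a second directory entry for the same file inside the served folder, as long as both paths are on the same NTFS volume:

        mklink /H C:\inetpub\wwwroot\logs\current.log D:\app\current.log

    (One caveat: editors that save by replacing the file will silently break the link.) Alternatively, map a virtual directory at the real folder and use request filtering to hide the sensitive entries:

        <configuration>
          <system.webServer>
            <security>
              <requestFiltering>
                <hiddenSegments>
                  <add segment="passwords.txt" />
                  <add segment="config" />
                </hiddenSegments>
              </requestFiltering>
            </security>
          </system.webServer>
        </configuration>

    Deny-by-default (allowUnlisted="false" on fileExtensions) is the safer shape if new sensitive files may appear in that folder later.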

    Read the article

  • Open source command line tools for indexing a large number of text files

    - by ergosys
    I'm looking for any open source command line tool or tools which will allow me to index and search a large number of plain text files. Approximate search would be a plus. The tool only needs to print the files that match, although some match context would be useful. A GUI tool isn't useful for my application, nor is anything that searches files one by one (grep for example). I'm basically targeting unix platforms (osx, linux, bsd). EDIT: I'm not interested in any sort of tool that is system-wide, or needs to run in the background. Basically, I want to build an index for a directory tree full of text files and then later be able to search against it. Preferably the index is one or a few files that I can specify the location of. Any ideas?
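    Two tools that appear to meet these constraints (commands sketched from their documented usage, so double-check the flags on your platform): glimpse, whose index lives wherever -H points and which supports approximate matching, and the codesearch suite, whose index is a single file at ~/.csearchindex by default:

        # glimpse: index a tree, then search allowing one error per match
        glimpseindex -H ~/idx ~/documents
        glimpse -H ~/idx -1 'serach'

        # codesearch: cindex builds the index, csearch queries it
        cindex ~/documents
        csearch -l 'pattern'     # -l prints matching file names only

    Both build their index once up front and run entirely from the command line, with nothing resident in the background.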

    Read the article

  • serving static assets via http is really slow compared to sshfs (apache2/nginx)

    - by s1lv3r
    After migrating to a new VPS, I had some users complaining about slowly loading images on their sites. After creating some test files with dd, I realized that I can download all files via sshfs at full speed, while downloads via the web are painfully slow. The larger the file is and the longer the transfer takes, the slower the transfer speed gets. I thought I had a problem with Apache and just spent the whole evening replacing Apache2 with nginx for static file serving - with no effect at all. No I/O wait states in top, tons of RAM free, no high CPU utilization, and hdparm shows decent I/O performance at all times. I just have no idea anymore what's happening on this server. This is a link to a demo file: http://master.dealux.de/file.tgz Anybody have an idea what I can check?
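    A diagnostic sketch (not a fix) that splits the web stack from the network path: measure the same download locally and remotely and compare:

        # on the server itself; if this is fast, Apache/nginx and the disk are fine
        curl -o /dev/null -s -w 'local: %{speed_download} bytes/s\n' http://localhost/file.tgz

        # from a remote host, for comparison
        curl -o /dev/null -s -w 'remote: %{speed_download} bytes/s\n' http://master.dealux.de/file.tgz

    Speed that degrades as the transfer runs, while sshfs on another port stays fast, tends to point at something specific to the HTTP path (shaping or proxying of port 80 at the VPS host, TCP window settings) rather than the file-serving stack that was just swapped out.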

    Read the article

  • How to check for duplicate files?

    - by miorel
    I have an external hard drive on which I have backed up files several times. Some files were modified between backups, others were not. Some may have been renamed. Now I'm running out of space, and I'd like to clean up duplicate files. My idea was to md5sum every file on the drive, then look for duplicates, and diff the relevant files (just in case, haha). Is this the best way to do this? What are some other methods of checking for duplicate files?
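    That plan is essentially what existing tools automate; fdupes, for instance, filters by size first, then by MD5, then does a byte-for-byte compare, which avoids hashing most unique files. Sketches of both routes (the mount point is a placeholder):

        # fdupes: list duplicate sets recursively (-d adds interactive deletion)
        fdupes -r /mnt/backup

        # hand-rolled: hash everything, then print lines whose first 32 chars repeat
        find /mnt/backup -type f -exec md5sum {} + | sort | uniq -w32 --all-repeated=separate

    The size-first filter matters on a large drive: a file with a unique size cannot have a duplicate, so it never needs to be read in full.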

    Read the article

  • Recursively move files in sub-dirs to new sub-dirs of same name

    - by Gabriel
    I have a batch of files all ending with the same string, i.e. *_ext.dat, located in several sub-dirs along with several other files, in a given main dir. This is the structure:

        /main_dir/subdir1/file11_ext.dat
        /main_dir/subdir1/file12_ext.dat
        /main_dir/subdir1/file13_ext.dat
        /main_dir/subdir1/file14_other.dat
        /main_dir/subdir1/file15_other.dat
        /main_dir/subdir2/file21_ext.dat
        /main_dir/subdir2/file22_ext.dat
        /main_dir/subdir2/file23_ext.dat
        /main_dir/subdir2/file24_other.dat
        /main_dir/subdir2/file25_other.dat
        /main_dir/subdir3/file31_ext.dat
        /main_dir/subdir3/file32_ext.dat
        /main_dir/subdir3/file33_ext.dat
        /main_dir/subdir3/file34_other.dat
        /main_dir/subdir3/file35_other.dat

    I need to recursively move only the files ending in *_ext.dat into a new main dir, new_dir, respecting the sub-dir structure, so the files end up in an equivalent dir structure like this:

        /new_dir/subdir1/file11_ext.dat
        /new_dir/subdir1/file12_ext.dat
        /new_dir/subdir1/file13_ext.dat
        /new_dir/subdir2/file21_ext.dat
        /new_dir/subdir2/file22_ext.dat
        /new_dir/subdir2/file23_ext.dat
        /new_dir/subdir3/file31_ext.dat
        /new_dir/subdir3/file32_ext.dat
        /new_dir/subdir3/file33_ext.dat

    Because of this, the command should also create those sub-dirs with their corresponding names. I know that with a line like this one:

        find . -name "*_ext.dat" -print0 | xargs -0 rm -rf

    I can delete all those files, but I don't know how to modify it to do what I need (or if it is even possible).
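    rsync can do the whole thing in one pass, including creating the sub-dirs (a sketch; keep --dry-run until the listed files look right):

        rsync -a -m --include='*/' --include='*_ext.dat' --exclude='*' \
              --remove-source-files --dry-run main_dir/ new_dir/

        # -m / --prune-empty-dirs skips sub-dirs that would end up empty;
        # --remove-source-files deletes each source file after it is copied,
        # turning the copy into a move; drop --dry-run to actually run it

    The include/exclude order matters: directories are let through first so rsync can recurse, matching files are included, and everything else is excluded.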

    Read the article

  • Linux - Create ftp account with read/write access to only 1 folder

    - by Gublooo
    Hey guys... I have never worked on Linux and don't plan on working on it either - the only command I probably know is "ls" :) I am hosting my website on Eapps and use their cPanel to set everything up, so I've never worked with Linux directly. Now I have this one-time case where I need to give a contractor access to fix the CSS issues on my website. He basically needs FTP (read/write) access to certain folders. At a high level, this is my code structure:

        /home/webadmin/example.com/html/
            images/  css/  js/  login.php  facebook.php
        /home/webadmin/example.com/application/
            library/  views/  models/  controllers/  config/  bootstrap.php
        /home/webadmin/example.com/cgi-bin/

    I want the new user to have access to only these folders:

        /home/webadmin/example.com/html/js
        /home/webadmin/example.com/html/css
        /home/webadmin/example.com/application/views

    He should not be able to view even the contents of the other folders, including files like bootstrap.php or login.php. If any sysadmins can help me set this account up, I will really appreciate it. Thanks
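    One standard pattern for this (a hedged sketch assuming vsftpd and a new user named cssdev; both are placeholders, and on managed hosting Eapps support may need to run these): jail the FTP user into an empty home directory and bind-mount only the allowed folders into it:

        useradd -m -d /home/cssdev -s /sbin/nologin cssdev
        passwd cssdev

        mkdir -p /home/cssdev/css /home/cssdev/js /home/cssdev/views
        mount --bind /home/webadmin/example.com/html/css /home/cssdev/css
        mount --bind /home/webadmin/example.com/html/js /home/cssdev/js
        mount --bind /home/webadmin/example.com/application/views /home/cssdev/views

        # in /etc/vsftpd/vsftpd.conf, keep local users jailed to their home:
        # chroot_local_user=YES

    The contractor then sees only css/, js/ and views/ after logging in; everything else on the server is outside the chroot. The bind mounts belong in /etc/fstab if they need to survive a reboot.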

    Read the article

  • How do I set up unison to sync a folder one way

    - by Rob
    I have a 1TB NAS that has a 1TB USB external hard drive attached. I have prepared the file system on the USB disk and mounted it. I want to 100% sync my data from my NAS to the USB disk - but I want it to be incremental and only have the NAS as the 'master' - e.g. if a file changes on the USB external hard drive, I want it to ignore this change, as it's not the live version (not that I think the files will change on the USB disk, but I'm paranoid the live copy could get overwritten). Also, if a file gets deleted on the live side, I want to retain the deleted file on the USB disk. Can unison sync one way and achieve the above for me? If so, would simply unison source/ target/ work? Thanks, Rob
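    Worth noting (hedged, with the mount points below assumed): unison is bidirectional by design, and the behaviour described - master always wins, deletions never propagate - is what plain rsync without --delete already does:

        # NAS is master: copies new/changed files, never deletes on the USB side,
        # and overwrites any USB-side edits with the NAS version
        rsync -av /mnt/nas/ /mnt/usb-backup/

        # unison can approximate one-way with -force, but note that a deletion on
        # the NAS would then also be deleted from the backup:
        # unison /mnt/nas /mnt/usb-backup -force /mnt/nas -batch

    For the stated goals (keep deleted files, ignore backup-side changes), the rsync form is the closer match.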

    Read the article

  • Backing up Excel Files to a different Directory

    - by Joe Taylor
    In Excel 2007, in the Save As box there is an option to 'Create a Backup', which simply backs up the file whenever it is saved. Unfortunately it backs up the file to the same directory as the original. Is there a simple way to change this directory to another drive/folder? I have messed about with macros to do this, coming up with:

        Private Sub Workbook_BeforeClose(Cancel As Boolean)
        'Private Sub Workbook_BeforeSave(ByVal SaveAsUI As Boolean, Cancel As Boolean)
        'Saves the current file to a backup folder and the default folder
        'Note that any backup is overwritten
            Application.DisplayAlerts = False
            ActiveWorkbook.SaveCopyAs Filename:="T:\TEC_SERV\Backup file folder - DO NOT DELETE\" & _
                ActiveWorkbook.Name
            ActiveWorkbook.Save
            Application.DisplayAlerts = True
        End Sub

    This creates a backup of the file OK the first time; however, if this is tried again I get:

        Run-time error '1004':
        Microsoft Office Excel cannot access the file
        'T:\TEC_SERV\Backup file folder - DO NOT DELETE\Test Macro Sheet.xlsm'.
        There are several possible reasons:
        - The file name or path does not exist
        - The file is being used by another program
        - The workbook you are trying to save has the same name as a...

    I know the path is correct, and I also know that the file is not open anywhere else. The workbook has the same name as the one I'm trying to save over, but it should just overwrite. I have posted the question about the coding on Stack Overflow but wondered if there is an easier way to do this. Any help would be much appreciated. Joe
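    One likely culprit, hedged since error 1004 has several causes: SaveCopyAs does not reliably overwrite an existing file that Windows considers in use or read-only, and DisplayAlerts doesn't suppress that failure. Deleting the previous backup first sidesteps it - a sketch:

        Private Sub Workbook_BeforeClose(Cancel As Boolean)
            Dim backupPath As String
            backupPath = "T:\TEC_SERV\Backup file folder - DO NOT DELETE\" & ThisWorkbook.Name

            Application.DisplayAlerts = False
            On Error Resume Next    ' Kill raises an error if no previous backup exists
            Kill backupPath
            On Error GoTo 0

            ThisWorkbook.SaveCopyAs Filename:=backupPath
            ThisWorkbook.Save
            Application.DisplayAlerts = True
        End Sub

    Using ThisWorkbook instead of ActiveWorkbook also avoids backing up whichever other workbook happens to have focus when the close event fires.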

    Read the article

  • Unable to remove Read-Only attribute from folder in Windows XP

    - by elcuco
    I have this directory from which I cannot remove the read-only attribute. The computer is running XP SP2 (or SP3, not sure) and the directory sits on an NTFS file system. Looking on the web I found this: http://support.microsoft.com/kb/256614 which says that if the directory is "customized" it is treated as a system folder and thus read-only. I don't think this is the scenario in my case, but anyway it's not helping; their recommendation is more or less:

        attrib -r -s d:\data /s /d

    and this is not working for me. Any other ideas? More info: the directory is served by an HTTP server (WAMP) and the directory is an SVN checkout. What happens is that the web server cannot write files into the directory (imagecache from Drupal, if you are really interested). Edit 2: The original post claimed that the directory sits on a VFAT FS; however, I booted Fedora 11 from a live CD and the partition is marked as NTFS. Edit 3: I left the company where this situation happened, so I cannot fully close this question. But things got even worse: I tested the "attrib -r" answer I posted and it did not work for me, yet the developer said it worked for her. A nice WTF moment. Probably a reboot helped... Sorry for losing the details. If anyone has the same problem and one of the answers helps - please comment.
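    A side note that may explain the dead end (hedged): on NTFS the folder read-only attribute doesn't actually block writes - write failures from a service are almost always NTFS permissions instead. On XP the built-in ACL tool is cacls, so granting the web server's account change rights may be the real fix (the account name below is a placeholder; WAMP typically runs as the logged-in user or a service account):

        rem grant "change" (C) rights recursively, editing the existing ACL (/E /T)
        cacls d:\data /E /T /G COMPUTERNAME\wampuser:C

    If the tree is an SVN checkout, it's also worth confirming the hidden .svn folders inherit the same grant.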

    Read the article

  • How to restore backed-up email files in qmail

    - by Maysam
    I have a problem restoring some old backed-up mail files on a mail server that uses qmail. The problem is, when I copy a new email file into the /cur directory, the number of emails shown next to the inbox increases, but when I click on the inbox, I don't see the newly copied email; I can only see the old emails. I also deleted the maildirsize and courierimapuiddb files and they were automatically created again, but it didn't help and I still cannot see the email in my inbox. Is there something I am missing? How can I restore the backed-up email files? Please note that when I copy the email files into the /.sent-mail/cur directory, they are all displayed in my sent box, but that doesn't happen for inbox files in /cur.
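    For reference (a hedged sketch): Maildir filenames in cur/ must end in an info suffix such as :2,S, and Courier-IMAP additionally tracks messages by name in its UID database, so files copied straight into cur/ can be counted but never listed. A safer restore path is to drop the messages into new/ (without the :2, suffix) and let the IMAP server promote them itself:

        # strip any :2,... suffix and deliver the backups as "new" mail
        for f in /backup/Maildir/cur/*; do
            cp "$f" ~/Maildir/new/"$(basename "$f" | sed 's/:2,.*$//')"
        done

    On the next mailbox open, the server should move them into cur/, assign fresh UIDs, and they should appear in the inbox.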

    Read the article

  • File access with hostname or ip only - no domain?

    - by Jonathon
    It seems likely that this is an obvious question, but I'm having trouble tracking down any useful information. Normally, when accessing files in a particular directory on a server, I'm able to create a virtual host, assign a domain, root directory location, etc. - however, I'm in a situation where I have server space and need to access files with only a hostname. Is this possible? For example, let's say the hostname is 123hostname.com, and the file I want access to is in /home/sub-directory/filename.php. How do I get at it via a browser? I've tried: http://123hostname.com/home/sub-directory/filename.php ...and some other variations on that theme (that I can't post because new users are restricted to one link in messages). But I'm generally stuck. Any help - even if it's just to let me know that this isn't possible without some additional configuration - would be great. Thank you!
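    The missing piece is that the path in a URL is resolved against the web server's document root, not the filesystem root, so /home/... is unreachable by design. If the server is Apache and its config is editable (an assumption), an Alias can expose that one directory:

        # makes http://123hostname.com/files/filename.php serve /home/sub-directory/filename.php
        Alias /files /home/sub-directory
        <Directory /home/sub-directory>
            Require all granted     # Apache 2.4 syntax; 2.2 uses "Allow from all"
        </Directory>

    Without config access, the practical options come down to moving (or symlinking) the files under the existing document root.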

    Read the article

  • Windows Server 2008 R2 - VPN Folder Sharing Permissions

    - by daveywc
    I have setup VPN access to my Windows Server 2008 R2 server using RRAS. Clients can connect, run applications, view shares etc. My problem is that one of the applications that they use relies on some network shares. The application is not able to access the shares unless the user first goes into Windows Explorer and accesses the share, providing their user name and password (the same one that they use to connect via the VPN). Previously on a different Windows Server 2008 (not R2) this was not necessary i.e. the application and user could access the share without providing another user name and password. I have tried giving the Everyone group full control over the shared folder - both on the Security tab and in the Permissions area under Advanced Sharing on the Sharing tab. This still did not resolve the issue. (I don't really want to give Everyone access anyway - I was hoping that granting access to a group that the VPN users had membership of would be enough). I have also turned off password protected sharing in the Advanced Sharing Settings area of the Network and Sharing Center (under both Home or Work and Public). So my question is what is preventing my VPN users from having access to these folders without having to re-supply the same login and password that they use to access the VPN? And what is the best practice in this type of scenario?
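    One workaround that has helped in similar setups (hedged; the server and share names are placeholders): authenticate the share once at connection time, before the application needs it, so no interactive prompt appears:

        rem map the share with explicit domain credentials, remembered across sessions
        net use \\fileserver\appshare * /user:DOMAIN\username /persistent:yes

        rem or store the credential for the server without mapping a drive
        cmdkey /add:fileserver /user:DOMAIN\username /pass

    The underlying cause is usually that the VPN logon doesn't produce a domain token the file server recognizes, so SMB falls back to prompting; pre-seeding credentials (or scripting net use at VPN connect) masks that.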

    Read the article

  • The Network folder specified is currently mapped using a different user name and password

    - by Frank Thornton
    I have a NAS device with 3 shares. On one computer I have access to all 3 of the shares. On another computer I keep getting this error when I try to add a second one:

        The network folder specified is currently mapped using a different user name and password [...]

    That is the message I keep getting. What causes that? EDIT: Every share has its own username and password. EDIT: NET USE on the machine running 3 shares from the same NAS device:

        New connections will be remembered.

        Status        Local   Remote                     Network
        -------------------------------------------------------------------------------
        OK            T:      \\192.168.2.5\SHARE1       Microsoft Windows Network
        OK            X:      \\Nas-1dsho-abc\SHARE2     Microsoft Windows Network
        Disconnected  Y:      \\192.168.2.9\backups      Microsoft Windows Network
        OK            Z:      \\Nas-1dsho-abc\cbackups   Microsoft Windows Network

        The command completed successfully.

    NET USE on the other:

        New connections will be remembered.

        Status       Local   Remote                   Network
        -------------------------------------------------------------------------------
        OK           Y:      \\192.168.2.5\SHARE1     Microsoft Windows Network
        Unavailable  Z:      \\192.168.2.5\SHARE2     Microsoft Windows Network

        The command completed successfully.
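    The working machine's output shows the usual workaround in action (hedged): Windows permits only one set of credentials per server name per session, but the same NAS reached by IP and by hostname counts as two different servers. On the failing machine, Z: points at \\192.168.2.5 again with different credentials, which is exactly what triggers the error. A sketch of the fix:

        rem clear the conflicting mapping first
        net use Z: /delete

        rem remap the second share via the NAS hostname instead of its IP
        net use Z: \\Nas-1dsho-abc\SHARE2 * /user:shareuser2

    With each credential set tied to a distinct server name (IP, hostname, or a hosts-file alias for a third share), all the mappings can coexist.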

    Read the article

  • How to share files between cPanel accounts?

    - by Darren
    I am setting up a multi-site/multi-store Magento installation, and I want each site to have its own cPanel account so I can set up the SSL and dedicated IP properly. I have tried creating a Linux group called 'magento' and changing the files I need to share to that group (I even added the users to that group); however, when I try to access those files from scripts on the other accounts, they act as if the files don't exist. I first made a soft symbolic link, which didn't work, and then tried including the files by their real location, but that didn't work either. Am I missing a step in controlling which users can access which files? I added the users to the magento group and, like I said, changed the group of the files I need to share, but it's still not working. Thanks, Darren
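    A sketch of the usually missing steps (the shared path below is a placeholder): group membership isn't enough; every directory on the path to the files needs group execute permission, or they behave exactly as if they don't exist, and the setgid bit keeps newly created files group-owned:

        chgrp -R magento /home/store1/public_html/shared
        chmod -R g+rwX /home/store1/public_html/shared
        find /home/store1/public_html/shared -type d -exec chmod g+s {} +

        # cPanel home directories are often mode 711 or 750; the other account's
        # user must be able to traverse each parent directory (the x bit) too
        chgrp magento /home/store1
        chmod 750 /home/store1

    Also note that a freshly added group membership only takes effect for new logins/processes, so PHP-FPM or Apache may need a restart before the scripts see it.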

    Read the article

  • VMWare shared folder out of sync

    - by JochenJung
    After booting the guest system, the shared folders are in sync and all works well. But as soon as I make a change to one of the files on my host system (Windows 7), the file on the guest system (Ubuntu) loses its last characters and still has the old content. So the actual change is not synced, yet the version on the guest gets truncated. If I delete the file on the host and create a new one with new content, everything is in sync again. It all started happening when I updated to VMware Player 6.0.1, and it happens for my Ubuntu guests only (Red Hat works fine). How can I tell VMware Tools to force a sync on the shared folders?

        Host: Windows 7
        Guest: Ubuntu 12.04.3 LTS
        VMware: 6.0.1 build-1379776
        VMware Tools: VMwareTools-9.6.1-1378637.tar.gz
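    Two things worth trying, hedged since this looks like a tools/hgfs version mismatch rather than a setting: remount the share with a short attribute-cache lifetime so the guest re-checks host-side metadata (the ttl option applies only if your tools ship the vmhgfs kernel client), and rebuild the tools so the hgfs module matches the new Player version:

        # inside the Ubuntu guest
        sudo umount /mnt/hgfs
        sudo mount -t vmhgfs .host:/ /mnt/hgfs -o ttl=1

        # re-run the tools installer so vmhgfs is rebuilt against this kernel
        sudo vmware-config-tools.pl

    If the truncation persists, upgrading to a later 6.0.x tools build (or downgrading Player) is probably the path, since Red Hat guests working while Ubuntu fails points at the per-guest kernel module.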

    Read the article

  • View hidden contents of usb device

    - by Srikanth Suresh
    I have a USB drive with the following contents from ls -lah:

        total 8.0K
        drwx------ 1 srikanth srikanth 4.0K May 27 22:54 .
        drwxr-xr-x 4 root     root     4.0K May 28 19:37 ..
        -rw------- 2 srikanth srikanth    0 May 27 22:52 Files.az3w

    On viewing the properties of the folder I see 90.4MB used and 16.1GB free. There is data on this pen drive which I am currently unable to view, and it is sensitive. After searching about hiding contents on a USB drive, I think there is a hidden partition here that I can't access. How should I proceed to view the contents without damaging the files already present?
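    Some non-destructive checks (a sketch; /dev/sdb is an assumed device name, so confirm it with dmesg or lsblk before running anything):

        sudo lsblk -f              # list every disk, partition and filesystem type
        sudo fdisk -l /dev/sdb     # show the partition table, including unmounted entries
        sudo file -s /dev/sdb1     # read the filesystem signature directly

        # testdisk scans for lost/hidden partitions and is read-only until told otherwise
        sudo testdisk /dev/sdb

    The zero-byte Files.az3w alongside 90MB of reported usage also suggests the data may sit inside a container created by whatever vault utility wrote that file, in which case the same tool would be needed to open it.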

    Read the article

  • Windows 7, files reappear after deletion.

    - by HeavyWave
    I'm trying to delete some files from a folder. I've taken ownership of the files and the folder. When I delete these files, Windows doesn't report any errors and deletes them. BUT, after I press F5, the files reappear. There are no messages whatsoever; they are just undeletable. I know logging off will help, but how do I fix it without going through the pain of closing everything down? P.S. The files disappear from the folder after approx. 5 minutes. Update: Turns out my version of Windows did not properly upgrade from the test version, so it had some weird disk drive issues.

    Read the article
