Search Results

Search found 17260 results on 691 pages for 'folder tree'.

Page 62/691 | < Previous Page | 58 59 60 61 62 63 64 65 66 67 68 69  | Next Page >

  • Windows 7 libraries and folder redirection nightmare

    - by Lobuno
    Hello! In our Active Directory we deploy a policy to our clients where the personal directory (My Documents) is redirected to a file server of ours: \\server\share\username\Documents. On older systems everything worked fine. In Windows 7, some users are experiencing the following symptoms:

        - The Documents library is EMPTY.
        - Where the Documents library should be shown in Explorer, an empty white icon is displayed, with no caption.
        - Right-clicking the Documents library to edit the folders that are part of the library brings the dialog up; however, that dialog is unusable. No folder is present there, and clicking Add folder does nothing.
        - Deleting the library and letting it auto-create doesn't solve the problem.
        - The shared directory can be accessed via UNC paths, and it can be mounted as a shared drive as well; the library is still broken.

    The shared drives are on a W2008 indexed server. Using the Windows Library tool utility doesn't solve the problem either. What can the cause of this problem be, and how can it be solved?
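
    One avenue worth checking (our suggestion, not something the poster has confirmed): on Windows 7, a corrupt Offline Files (CSC) cache is a known cause of a redirected Documents folder showing up empty. The cache can be flagged for a rebuild at the next reboot:

        rem reset the Offline Files cache; it is rebuilt at the next startup
        reg add "HKLM\SYSTEM\CurrentControlSet\Services\CSC\Parameters" /v FormatDatabase /t REG_DWORD /d 1 /f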

    Read the article

  • Unix Permissions issue with users belonging to the same group accessing a folder

    - by TK Kocheran
    I have a folder I'd really like to allow another user on this machine to access. I'm using mt-daapd to serve music to the network, so I'd like to enable the mt-daapd user to access my Music directory, /home/rfkrocktk/Music. The master user is rfkrocktk, obviously. I've tried to set all of the permissions properly on the directory, but the mt-daapd user can't access the files. I created a group called media-users and added both rfkrocktk and mt-daapd to it, in order to give mt-daapd permission to simply read all of the files in that directory and its subdirectories. If I run id on each of my users, here's what's displayed:

        $ id rfkrocktk
        uid=1000(rfkrocktk) gid=1000(rfkrocktk) groups=1000(rfkrocktk),4(adm),20(dialout),24(cdrom),29(audio),46(plugdev),104(lpadmin),115(admin),120(sambashare),124(vboxusers),1001(jupiter),2002(media-users)
        $ id mt-daapd
        uid=123(mt-daapd) gid=65534(nogroup) groups=65534(nogroup),2002(media-users)

    It definitely seems that both users are part of the media-users group, so what could be going wrong? If I run ls -l on the actual Music directory to see its permissions, here's the output:

        drwxr-Sr-- 201 rfkrocktk media-users 12288 2011-01-13 12:26 Music

    If I run ls -l inside the Music directory to see its children, here's the output:

        drwxr-Sr--   3 rfkrocktk media-users 4096 2010-12-20 15:31 2DBoy
        drwxr-Sr--   3 rfkrocktk media-users 4096 2010-05-25 12:50 ABBA
        drwxr-Sr--   3 rfkrocktk media-users 4096 2009-12-28 15:19 Access Denied
        drwxr-Sr--  10 rfkrocktk media-users 4096 2009-12-28 15:19 AC-DC
        drwxr-Sr--   3 rfkrocktk media-users 4096 2009-12-28 15:19 Aerosmith
        drwxr-Sr--   3 rfkrocktk media-users 4096 2010-06-04 10:45 A Flock of Seagulls
        drwxr-Sr--   4 rfkrocktk media-users 4096 2010-05-28 18:13 Alestorm
        drwxr-Sr--   3 rfkrocktk media-users 4096 2010-06-22 23:29 Amon Amarth
        drwxr-Sr--   5 rfkrocktk media-users 4096 2009-12-28 15:19 Anberlin
        ...

    From this, it would seem that I should be able to access the folders from mt-daapd, but I can't. Running sudo -i -u mt-daapd ls -l /home/rfkrocktk/Music displays nothing, indicating that for whatever reason mt-daapd doesn't have access to read the folder. What am I doing wrong?
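
    A detail worth highlighting in the listings above: the group permission bits are r-S, i.e. read and setgid but no execute bit, and a user needs execute (search) permission on a directory to enter it or stat its contents. A minimal sketch of a likely fix, assuming the missing group execute bit is indeed the culprit:

        # add execute/search permission for the group on directories only
        # (a capital X applies x to directories and already-executable files)
        chmod -R g+X /home/rfkrocktk/Music
        # then re-test from the mt-daapd account
        sudo -i -u mt-daapd ls -l /home/rfkrocktk/Music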

    Read the article

  • Deleting files in the Windows installer folder

    - by qw3n
    How do you clean up the windows/installer folder on an XP machine? I looked on different forums, but the tool many mention is no longer officially supported and, from what I understand, is not specifically meant for this task; I was also confused about which tool to use and how to use it. The reason I ask is that I have an older computer with an ~86 GB drive, of which ~80 GB is being used by windows/installer. I'm assuming that at least some of these files are glitches and shouldn't be in there. Note that the person who uses the computer mentioned interrupting an install at some point, and I don't know if that has anything to do with it. Also, there are not that many programs installed on this computer (~25). I know that similar questions have been asked several times already, but the accepted answer to "Is it safe to delete from C:\Windows\Installer?" (along with most of the duplicates) mainly addresses whether it is safe to delete. I'm asking how to find and delete the files that shouldn't be there, especially since we're not talking 5-10 GB but something that practically fills the entire hard drive. For those who are wondering, I ran CCleaner, but it doesn't seem to check this folder.

    Read the article

  • Very slow browsing shared folder XP client/host

    - by Ickster
    I have a pretty straightforward setup where I'm storing media files on an XP pro machine, and sharing the folder to be accessed by other XP pro machines around the house. (Typically, there's only one client accessing the share at a time, although there may be several with the share mounted.) It's been working just fine for years, but I've recently started having some problems. A couple of days ago, the host PC had power disconnected while it was running. It was restarted and everything seemed fine initially, but since then browsing the shared folder from client machines has been extremely slow and actually reading data is all but impossible. The problem exists in every access method I've tried: Windows Explorer, VLC dialogs, command line, etc. My first thought was that the disk was experiencing problems, but there are no problems viewing the files locally on the host machine. My second thought was that there was a network problem on the host machine, so I removed and reinstalled drivers for the NIC with no change. My third thought was that there might've been a problem elsewhere on the network, so I swapped out hardware to no avail. I'm regrouping and trying to come up with a methodical approach to figuring out what might be wrong. I would of course be thrilled if you can suggest specific problems (Microsoft KB articles, etc.) that I might check, but I'm not expecting a silver bullet. If you can help me outline an approach to identify the problem (including recommended tools, e.g., disk checkers, network analyzers, etc.) I'd greatly appreciate it.
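
    A sketch of a first round of checks, given the power loss (these are standard Windows XP utilities; "client-pc" is a placeholder name):

        rem on the host: rule out filesystem damage from the power cut
        chkdsk C: /f
        rem on the host: look for error counts on the NIC
        netstat -e
        rem from a client: watch for latency spikes or packet loss while browsing the share
        ping -t client-pc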

    Read the article

  • Break all hardlinks within a folder

    - by Georges Dupéron
    I have a folder which contains a certain number of files that have hard links (in the same folder or somewhere else), and I want to de-hardlink these files so they become independent, and changes to their contents won't affect any other file (their link count becomes 1). Below, I give a solution which basically copies each hard link to another location, then moves it back in place. However, this method seems rather crude and error-prone, so I'd like to know if there is some command which will de-hardlink a file for me.

    Crude answer: find files which have hard links (Edit: to also find sockets, etc. that have hard links, use find -not -type d -links +1):

        find -type f -links +1

    A crude method to de-hardlink a file (copy it to another location, and move it back). Edit: as Celada said, it's best to do a cp -p below, to avoid losing timestamps and permissions. Edit: create a temporary directory and copy to a file under it, instead of overwriting a temp file; this minimizes the risk of overwriting some data, though the mv command is still risky (thanks @Tobu).

        # This is unhardlink.sh
        set -e
        for i in "$@"; do
          temp="$(mktemp -d ./hardlnk-XXXXXXXX)"
          [ -e "$temp" ] && cp -ip "$i" "$temp/tempcopy" && mv "$temp/tempcopy" "$i" && rmdir "$temp"
        done

    So, to un-hardlink all hard links (Edit: changed -type f to -not -type d, see above):

        find -not -type d -links +1 -print0 | xargs -0 unhardlink.sh

    Read the article

  • Shared files folder in Amazon Elastic Beanstalk environment

    - by por
    I'm working on a Drupal application which is planned to be hosted in an Amazon Elastic Beanstalk environment. Basically, Elastic Beanstalk enables the application to scale automatically by starting additional web server instances based on predefined rules. The shared database is running on an Amazon RDS instance, which all instances can access properly. The problem is the shared files folder (sites/default/files). We're using git as SCM, and with it we're able to deploy new versions by executing $ git aws.push. In the background, Elastic Beanstalk automatically deletes ($ rm -rf) the current codebase from all servers running in the environment and deploys the new version. The plan was to use S3 (s3fs) for shared files in the staging environment, and NFS in the production environment. We've managed to set up the environment to the point where the shared files folder is properly mounted after a reboot. But... the problem is that, in this setup, the deployment of new versions on running instances fails because $ rm -rf can't remove the mounted directory; as a result, the entire environment goes down and we need to restart the environment, which isn't really an elegant solution.

    Question #1: what would be the proper way to manage shared files in this kind of deployment? Are you running such an environment? How did you solve the problem?

    By looking at the Elastic Beanstalk Hostmanager code (Ruby), there seems to be a way to hook our functionality (unmount if mounted in pre-deploy, and mount in post-deploy) into Hostmanager (/opt/hostmanager/srv/lib/elasticbeanstalk/hostmanager/applications/phpapplication.rb), but the scripts defined in the file (i.e. /tmp/php_post_deploy_app.sh) don't seem to be working. That might be because our Ruby skills are non-existent.

    Question #2: did you manage to hook your functionality into Hostmanager in a portable way (i.e. without changing the core Hostmanager files)?
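
    A sketch of what such hooks could look like, assuming Hostmanager really does run the script names mentioned above around a deployment (the paths and app root are placeholders, not verified):

        #!/bin/sh
        # /tmp/php_pre_deploy_app.sh - detach the share so rm -rf can succeed
        APP_FILES=/var/www/html/sites/default/files   # adjust to the real docroot
        mountpoint -q "$APP_FILES" && umount "$APP_FILES"

        #!/bin/sh
        # /tmp/php_post_deploy_app.sh - reattach the shared files folder
        APP_FILES=/var/www/html/sites/default/files
        mount "$APP_FILES"    # assumes a matching /etc/fstab entry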

    Read the article

  • Compress a folder of PDF files into separate zip archives

    - by Panrubius
    I wanted to take a folder full of PDF files and create a number of separate zip files. After following the advice on this question, everything worked almost perfectly. Here's what happened: when I issued this command in Terminal:

        zip -s 5m -r ~/Desktop/invoices ~/Desktop/Invoices/

    everything worked really well, in that I got 11 ZIP files of approximately 5 MB each, placed in the folder specified. However, the files it outputted were named as follows:

        invoices.z01
        invoices.z02
        invoices.z03
        invoices.z04
        invoices.z05
        invoices.z06
        invoices.z07
        invoices.z08
        invoices.z09
        invoices.z10
        invoices.zip

    So as you can see, only invoices.zip has been named correctly. I could go through and rename them one by one, but seriously, if we start doing that then what in the name of Evolution are computers for?! Now, I am also aware that I'm relatively new to the Terminal, so I could be making a very silly mistake somewhere. If that's the case, please be patient :-) Any help would be greatly appreciated. One last note: I'm quadriplegic, so I would like to avoid GUI applications as much as possible; I use voice recognition software, you see, so working in the Terminal is much, much easier.
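
    For what it's worth, .z01/.z02/... is the normal naming for a split archive: the pieces are segments of one archive, not independent zips, and renaming them will break extraction. If genuinely separate, self-contained archives are wanted instead, a sketch along these lines (the batch size is a guess, and filenames containing spaces would need more care):

        cd ~/Desktop/Invoices
        ls *.pdf | split -l 50 - batch_          # 50 PDFs per archive; adjust to taste
        for list in batch_*; do
          zip ~/Desktop/"invoices-${list#batch_}" -@ < "$list"   # zip -@ reads names from stdin
        done
        rm batch_*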

    Read the article

  • Copy a website and preserve the file & folder structure

    - by DrStalker
    I have an old web site running on an ancient version of Oracle Portal that we need to convert to a flat HTML structure. Due to damage to the server, we are not able to access the administrative interface, and even if we could, there is no export functionality that can work with modern software versions. It would be enough to crawl the website and have all the pages and images saved to a folder, but the file structure needs to be preserved; that is, if a page is located at http://www.oldserver.com/foo/bar/baz/mypage.html then it needs to be saved to /foo/bar/baz/mypage.html, so that the various JavaScript bits will continue to function. None of the web crawlers I've found have been able to do this; they all want to rename the pages (page01.html, page02.html, etc.) and break the folder structure. Is there any crawler out there that will recreate the site structure as it appears to a user accessing the site? It doesn't need to redo any of the content of the pages; once rehosted, the pages will all have the same names they did originally, so links will continue to work.
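
    One crawler that does keep the original layout: wget's mirror mode writes each page to a path matching its URL, so /foo/bar/baz/mypage.html lands at foo/bar/baz/mypage.html on disk. A sketch (these are standard GNU wget flags; check availability on your platform):

        # -m mirror (recursive, with timestamps), -p fetch page requisites
        # (images/CSS/JS), -np don't climb to parent directories
        wget -m -p -np http://www.oldserver.com/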

    Read the article

  • On AWS EC2, Unable to run sudo command after modifying permissions to /usr folder

    - by Kayote
    All, we have searched quite a bit, and a few of Eliah Kagan's posts are great about getting access back to sudo. However, our server is on AWS EC2 and I am a complete newbie to this. We are trying to set up cron jobs for backing up our server data. What we did: using PuTTY, we created a script file, usr/share/site-db-backup/backupToS3.php; however, Ubuntu was not saving the changes we made, as it reported we did not have permission as user 'Ubuntu'. The error details are:

        Upload of file backupToS3.php was successful, but error occurred while
        setting the permission &/ or timestamp. If the problem persists, turn on
        'ignore permission errors' option.
        Permission denied.
        Error code: 3  Request code 9

    So we ran the command sudo chmod -R a+rwx /usr to grant permission to the folder /usr. However, now whenever any sudo command is run, we get the error:

        /usr/lib/sudo/sudoers.so must be only be writable by owner
        fatal error, unable to load plugins

    We are complete newbies to Ubuntu and EC2, so we do need step-by-step guidance on how to get sudo back and successfully write to the crontab script sitting in the 'usr/' folder.
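
    A sketch of the usual EC2 recovery path when sudo is broken and there is no root password (device names and mount points are examples; they vary by instance):

        # 1. stop the broken instance, detach its root EBS volume, and attach it
        #    to a healthy helper instance (e.g. as /dev/xvdf)
        # 2. on the helper instance:
        sudo mkdir -p /mnt/rescue
        sudo mount /dev/xvdf1 /mnt/rescue
        # undo the blanket a+rwx: remove group/other write under /usr
        sudo chmod -R go-w /mnt/rescue/usr
        sudo umount /mnt/rescue
        # 3. reattach the volume to the original instance as its root device and boot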

    Read the article

  • Can't access folder on server - permission denied

    - by Michal Korzeniowski
    I am running a VPS with Ubuntu 11.04. After a clean MODX install, I've tried to access http://www.encepence.pl/manager and got a permission denied error from my server. The thing is that I can easily access any other folder under that domain, and I can modify this folder's (manager) content via FTP. I've tried modifying the virtual host with:

        <Directory /var/www/blackflow/data/www/encepence.pl/manager/>
            Options Indexes FollowSymLinks ExecCGI
            AllowOverride All
            Order allow,deny
            Allow from all
        </Directory>

    but it didn't work. The rest of the configuration:

        <Directory /var/www/blackflow/data/www/encepence.pl>
            Options -ExecCGI -Includes
            php_admin_value open_basedir "/var/www/blackflow/data:."
            php_admin_flag engine on
        </Directory>

        <VirtualHost 192.166.219.34:80>
            ServerName encepence.pl
            CustomLog /var/www/httpd-logs/encepence.pl.access.log combined
            DocumentRoot /var/www/blackflow/data/www/encepence.pl
            ErrorLog /var/www/httpd-logs/encepence.pl.error.log
            ServerAdmin [email protected]
            ServerAlias www.encepence.pl
            SuexecUserGroup blackflow blackflow
            AddType application/x-httpd-php .php .php3 .php4 .php5 .phtml
            AddType application/x-httpd-php-source .phps
            php_admin_value open_basedir "/var/www/blackflow/data:."
            php_admin_value sendmail_path "/usr/sbin/sendmail -t -i -f [email protected]"
            php_admin_value upload_tmp_dir "/var/www/blackflow/data/mod-tmp"
            php_admin_value session.save_path "/var/www/blackflow/data/mod-tmp"
            VirtualDocumentRoot /var/www/blackflow/data/www/%0
        </VirtualHost>

    Any ideas on what might have gone wrong?
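
    A few checks that would narrow this down (the log path is taken from the configuration above):

        # what does Apache itself say the reason is?
        tail -n 20 /var/www/httpd-logs/encepence.pl.error.log
        # can the suexec user (blackflow) actually traverse the directory?
        ls -ld /var/www/blackflow/data/www/encepence.pl/manager
        # which vhost and Directory blocks are really in effect?
        apache2ctl -S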

    Read the article

  • Sharing a folder with Nautilus and NTFS external drive gets errors

    - by TheLQ
    I am trying to share a folder in Lubuntu over a network; the folder is on an external NTFS drive. Due to the system that I have (rotating backup disks), this is probably only the second time that the drive has been mounted. It's mounted manually with a simple (for example):

        mount /dev/sdb1 /media/BACKUP

    On an internal NTFS disk I have successfully set up a network share and can access it. However, I can't access the share on the external disk from any other Windows computer. When setting up the share, Nautilus said that it needed to change the folder's permissions to allow other users to write; however, afterwards it's still blank. Changing it to Read and Write just changes back to blank. Chowning the entire /media folder recursively didn't work. Running PCManFM as root and changing the permissions didn't work. Adding "public=yes" to smb.conf and restarting didn't work. I'm out of ideas on what to do. What's weird is that it worked just fine on an internal NTFS disk, so why not the external one? Any solution needs to be manageable from a GUI (preferably Nautilus), as the person managing the machine isn't as tech-savvy. Thanks
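
    One detail that may explain why the chown/chmod attempts changed nothing: with the stock ntfs-3g driver, ownership and mode are synthesized from mount options rather than stored per file. A sketch, assuming ntfs-3g is the driver in use (the uid/gid values are examples):

        # give everyone read/write on the mounted NTFS volume at mount time;
        # chmod/chown on individual files will still have no lasting effect
        sudo mount -t ntfs-3g -o uid=1000,gid=1000,umask=000 /dev/sdb1 /media/BACKUP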

    Read the article

  • Unable to copy file to folder, permission denied without explanation

    - by ValekHalfHeart
    Recently, Norton Internet Security deleted ml.exe (an assembler I use to program in masm32) off of my computer, thinking that one of the programs I had written with it was a virus (it most certainly was not). Fortunately, I had a copy of ml.exe backed up on an external hard drive, and tried to copy it over to my computer. The old ml.exe was located in C:\masm32\bin, so I tried to copy the new one to that location. After disabling Norton (which had locked the folder, preventing me from accessing it), I am still unable to copy the new file to C:\masm32\bin. When I tried, Windows announced that I would need administrator permission to copy the file. Since I'm an admin, I figured this wouldn't be a problem, although it was unexpected, as I have never had to provide administrator permission to access this folder before. However, instead of prompting me to enter my password, Windows simply refuses to copy the file. I repeat: I was not asked to provide a password. It simply says that I do not have permission. Does anyone know what's happening and how to fix it? Is Norton still causing problems, or is it something else?
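
    In case Norton's remediation changed the folder's ownership or ACL, a sketch of taking back control from an elevated command prompt (the account name is a placeholder):

        rem take ownership of the folder and everything under it
        takeown /f C:\masm32\bin /r /d y
        rem grant your account full control, recursively
        icacls C:\masm32\bin /grant YourUserName:F /t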

    Read the article

  • Using Samba to share a folder from a Linux guest with a Windows host in VirtualBox

    - by AmV
    I would like to share a folder from a Linux guest with a Windows host (with read and write access, if possible) in VirtualBox. I read in these two links, here and here, that it's possible to do this using Samba, but I am a little bit lost and need more information on how to proceed. So far, I have managed to set up two network adapters (one NAT and one host-only) and install Samba on the Linux guest, but now I have the following questions:

        1. What do I need to put in smb.conf to share a folder from the Linux guest? (The tutorial provided in one of the links above only explains how to share home directories.)
        2. Are there any Samba commands that need to be executed on the guest to enable sharing?
        3. How do I make sure that these folders are only available to the host OS and not on the Internet?
        4. Once the Linux guest is set up, how do I access each of the individual shared folders from the Windows host? I read that I need to map a drive on Windows to do this, but do I use Samba logins or Linux logins? Do I use localhost, or do I need to set up an IP for this?

    Thanks!
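
    A sketch of the pieces involved; the share name, path, user, and addresses are assumptions (VirtualBox host-only networks default to the 192.168.56.0/24 subnet):

        # in /etc/samba/smb.conf on the guest:
        [myshare]
            path = /home/user/shared
            valid users = user
            read only = no
            # question 3: accept connections from the host-only network only
            hosts allow = 192.168.56.

        # on the guest: Samba keeps its own password database, so give the
        # Linux user a Samba password, then reload the daemon
        sudo smbpasswd -a user
        sudo service smbd restart

        rem on the Windows host: map the share using the guest's host-only IP
        net use Z: \\192.168.56.101\myshare /user:user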

    Read the article

  • Mac dev folder missing, SSH not working

    - by SamGoody
    A few days ago, SSH stopped working. When I try logging in, I get the following message:

        PTY allocation request failed on channel 0
        stdin: is not a tty
        fatal: unrecognized command ''
        Connection to 74.52.61.194 closed.

    Web searches have shown me that there might be something wrong with /dev/std. But my computer lacks a /dev directory. There is an alias to /dev (hidden, but I've revealed hidden files to do this search), but when I try to open it I am told that it cannot find the folder it is aliasing. Now, many a web search tells me that without a /dev folder the computer doesn't work, but mine does seem to work, except for SSH. Also, are there any tools that can save my SSH preferences, so that I don't have to type out the username@address, password, and path each time, all of which are long and complex? I'm not looking for a Filezilla-type client; there are many of those. I'm looking for a command-line tool, like PuTTY, that lets me use bash on the remote machine. I'm on a MacBook Pro, latest version of Tiger.
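
    For the second question, OpenSSH itself can store per-host settings; a sketch of a ~/.ssh/config entry (the alias and user name are placeholders):

        Host myserver
            HostName 74.52.61.194
            User myusername
            # with a key pair (ssh-keygen, then append the public key to the
            # server's ~/.ssh/authorized_keys) the password prompt goes away too

    After that, running ssh myserver connects with those settings.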

    Read the article

  • Windows 7 sharing folder from command line, selecting users and triggering the "Apply" of changes

    - by clintp
    I have a drive that doesn't get mounted until after I log in (a TrueCrypt thumbdrive device, and no, I'm not making it a "System Favorite" to get around this). I'd like to construct a batch file to share it once I've gotten it mounted, because the sharing info doesn't seem to stick through a reboot. From the GUI, I'd go into the folder's Properties > Sharing, then in Advanced Sharing pick the name to share it as, and then under the "Share..." button pick the users and the permissions I want to grant them. After "Apply" there's a pause -- I'm not sure what's happening here, but the dialog says "Sharing Items..." -- and then everything is okay. From the command line, I've done:

        net share MyFolder=F:\MyFolder
        cacls F:\MyFolder /G FirstUser:F
        cacls F:\MyFolder /G OtherUser:F

    And this almost works. I can see the share on the network then, but nobody has permissions to do anything. If I go into the GUI and change anything (and I can see my command-line changes in there already) and press "Apply", I get the "Sharing Items.... This may take a few minutes" dialog... and then voila! It works. I get the "Your folder is shared" dialog with the command-line changes I made, along with the GUI change that I made to trigger the "Sharing Items..." dialog. Everything's peachy. Is a service being restarted? Which one? What's triggering the sharing to take effect? And -- more importantly -- how do I do it from the command line?
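
    Two gaps in the batch file above worth ruling out: net share without a /GRANT clause creates the share with only read access for Everyone at the share level, and cacls /G replaces the NTFS ACL rather than adding to it. A sketch that sets both layers explicitly (on Windows 7, icacls supersedes cacls):

        net share MyFolder=F:\MyFolder /GRANT:FirstUser,FULL /GRANT:OtherUser,FULL
        rem (OI)(CI) makes the grant inherit to files and subfolders
        icacls F:\MyFolder /grant FirstUser:(OI)(CI)F OtherUser:(OI)(CI)F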

    Read the article

  • virtual web folder served by PHP script

    - by Martin
    I am trying to configure my Apache server to be able to display (virtual) pages like:

        mywebpage.com/something1
        mywebpage.com/something2
        mywebpage.com/folder/something3

    I would like these "somethingX" and "folder" folders to be only virtual, not physical directories. For a start, it would be great to send all requests for mywebpage to one PHP script, which would somehow receive the original path information (there is some SERVER array, as far as I know) and call the necessary PHP functions (so far I use addresses like mywebpage.com/index.php?page=blabla&otherparameters=values...). Is that possible? I am struggling with different combinations; currently I have the following file in /etc/apache2/conf.d/something.conf (not working, of course). What is the correct way to proceed? Thanks.

        <Location /myweb>
            SetHandler my-handler
            Action my-handler /srv/www/htdocs/myweb/product.php virtual
        </Location>

    My pages are in /srv/www/htdocs/myweb. I tried with Location, with Directory, with Action and SetHandler, with AddHandler... ;-) Some configurations were ignored; some caused "object not found" with nothing relevant in the error log.
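
    The usual tool for this is mod_rewrite rather than Action/SetHandler: route every request that doesn't match a real file to one front controller, and read the original path there. A sketch, assuming mod_rewrite is enabled and /myweb maps to /srv/www/htdocs/myweb:

        <Directory /srv/www/htdocs/myweb>
            RewriteEngine On
            # only rewrite requests that don't match an existing file
            RewriteCond %{REQUEST_FILENAME} !-f
            RewriteRule ^(.*)$ index.php?page=$1 [QSA,L]
        </Directory>

    Inside index.php, $_SERVER['REQUEST_URI'] (or the page parameter above) then carries the virtual path.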

    Read the article

  • LightTPD and PHP not working if outside of LightTPD folder

    - by Marco83
    I need to set up a simple web server with PHP on Windows XP that a number of different people will use for local testing. I'm using LightTPD 1.4.30-4-IPv6-Win32-SSL and PHP 5.2. So far I've created this folder structure:

        tools/
          LightTPD/
            htdocs/
          PHP/

    I set up PHP as CGI and the document root as server_root + "/htdocs". It works fine (well, it's slow, but I don't want to bother with FastCGI for now :) ). My problem is when I try to put htdocs outside of the LightTPD folder, like this:

        htdocs/
        tools/
          LightTPD/
          PHP/

    I update the document root to server_root + "/../../htdocs", and while static HTML pages work fine, PHP pages stop working (they return "No input file specified"). I literally just change the document root; I didn't change anything in php.ini or anywhere else. Please also note that I left doc_root, user_dir and cgi.force_redirect at their default values in php.ini, and it works when htdocs is inside LightTPD, but not when I move it outside. Any idea of why it's breaking? Here's my lightTPD.conf:

        server.modules = (
            "mod_access",
            "mod_accesslog",
            "mod_alias",
            "mod_cgi",
            "mod_status",
        )
        include "variables.conf"
        include "mimetype.conf"

        # THIS WORKS
        server.document-root = server_root + "/htdocs"
        # THIS DOESN'T
        #server.document-root = server_root + "/../../htdocs"

        server.upload-dirs = ( temp_dir )
        index-file.names = ( "index.php", "index.pl", "index.cgi", "index.cml",
                             "index.html", "index.htm", "default.htm" )
        server.event-handler = "libev"
        url.access-deny = ( "~", ".inc" )
        $HTTP["url"] =~ "\.pdf$" {
            server.range-requests = "disable"
        }
        static-file.exclude-extensions = ( ".php", ".pl", ".cgi" )
        server.errorlog = server_root + "/logs/error.log"

        ######### Options that are good to be but not necessary to be changed #######
        dir-listing.activate = "enable"

        #### CGI module
        cgi.assign = ( ".php" => server_root + "/../PHP/php-cgi.exe" )
        status.status-url = "/server-status"
        status.config-url = "/server-config"
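
    A guess worth testing: "No input file specified" is php-cgi's complaint when the SCRIPT_FILENAME it receives doesn't resolve to a readable file, and a document root containing ".." is a plausible trigger on Windows. A sketch that sidesteps the relative path (the absolute path is an example):

        # in lightTPD.conf: point at the new location directly
        server.document-root = "C:/htdocs"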

    Read the article

  • Common folder in Linux

    - by rks171
    I have two users on my Ubuntu machine. I want to share some media files between these users, so I created a directory in /home called 'media'. I made the group 'media' and added my user 'rks171' to it. So:

        sudo groupadd media
        sudo mkdir -p /home/media
        sudo chown -R root.media /home/media
        sudo chmod g+s /home/media

    as was described in this post. Then I added my user to the group:

        sudo usermod -a -G media rks171

    Then I also added write permission on this folder for the group:

        sudo chmod -R g+w media

    So now, doing 'ls -lh' gives:

        drwxrwsr-x 2 root media 4.0K Oct 6 09:46 media

    I tried to copy pictures to this new directory from my user directory:

        mv /home/rks171/Pictures/* /home/media/

    and I get 'permission denied'. I can't understand what's wrong. If I simply type 'id', it doesn't show that my user, rks171, is part of the 'media' group; but if I type 'id rks171', then it does show that my user is part of the 'media' group. Anybody have any ideas why I can't get any files into this common folder?
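
    The id discrepancy described above is the usual giveaway: group membership is read once, at login, so an existing session doesn't pick up a newly added group. A minimal sketch:

        # start a shell whose primary group is 'media' (or just log out and back in)
        newgrp media
        id    # the current session should now list the 'media' group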

    Read the article

  • VBScript - Checking each subfolder for files and copying files

    - by Kenny Bones
    I'm trying to get this script to work. It's basically supposed to mirror two sets of folders and make sure they are exactly the same. If a folder is missing, the folder and its contents should be copied. Then the script should compare the DateModified attribute and only copy the files if the source file is newer than the destination file. So far I've been able to check whether all subfolders exist, and create them if they don't; then I've been able to scan the top source folder for its files and copy them if they don't exist or if the source's DateModified attribute is newer. What remains is scanning each subfolder for its files and copying them under the same conditions. Here's the code:

        Dim strSourceFolder, strDestFolder
        strSourceFolder = "c:\users\vegsan\desktop\Source\"
        strDestFolder = "c:\users\vegsan\desktop\Dest\"

        Set fso = CreateObject("Scripting.FileSystemObject")
        Set objTopFolder = fso.GetFolder(strSourceFolder)
        Set colTopFiles = objTopFolder.Files

        ' Check to see if subfolders actually exist. Create them if they don't.
        Set objColFolders = objTopFolder.SubFolders
        For Each subFolder in objColFolders
            CheckFolder subFolder, strSourceFolder, strDestFolder
        Next

        ' Check all files in the first top folder
        For Each objFile in colTopFiles
            CheckFiles objFile, strSourceFolder, strDestFolder
        Next

        Sub CheckFolder (strSubFolder, strSourceFolder, strDestFolder)
            Set fso = CreateObject("Scripting.FileSystemObject")
            Dim folderName, aSplit
            aSplit = Split (strSubFolder, "\")
            If UBound (aSplit) > 1 Then
                folderName = aSplit(UBound(aSplit))
                folderName = strDestFolder & folderName
            End if
            If Not fso.FolderExists(folderName) Then
                fso.CreateFolder(folderName)
            End if
        End Sub

        Sub CheckFiles (file, SourceFolder, DestFolder)
            Set fso = CreateObject("Scripting.FileSystemObject")
            Dim DateModified
            DateModified = file.DateLastModified
            ReplaceIfNewer file, DateModified, SourceFolder, DestFolder
        End Sub

        Sub ReplaceIfNewer (sourceFile, DateModified, SourceFolder, DestFolder)
            Const OVERWRITE_EXISTING = True
            Dim fso, objFolder, colFiles, sourceFileName, destFileName
            Dim DestDateModified, objDestFile
            Set fso = CreateObject("Scripting.FileSystemObject")
            sourceFileName = fso.GetFileName(sourceFile)
            destFileName = DestFolder & sourceFileName
            If Not fso.FileExists(destFileName) Then
                fso.CopyFile sourceFile, destFileName
            End if
            If fso.FileExists(destFileName) Then
                Set objDestFile = fso.GetFile(destFileName)
                DestDateModified = objDestFile.DateLastModified
                If DateModified <> DestDateModified Then
                    fso.CopyFile sourceFile, destFileName
                End if
            End if
        End Sub
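
    A sketch of the missing piece: one recursive routine can replace both top-level loops, mapping each source subfolder onto the destination by prefix replacement. It reuses the CheckFiles sub above; the case-insensitive Replace is an assumption that both path prefixes refer to the same roots:

        Sub MirrorTree(srcPath, strSourceFolder, strDestFolder)
            Dim objFolder, objSub, objFile, destPath
            Set objFolder = fso.GetFolder(srcPath)
            ' map ...\Source\Sub\ onto ...\Dest\Sub\ (case-insensitive compare)
            destPath = Replace(objFolder.Path & "\", strSourceFolder, strDestFolder, 1, -1, vbTextCompare)
            If Not fso.FolderExists(destPath) Then fso.CreateFolder(destPath)
            For Each objFile In objFolder.Files
                CheckFiles objFile, strSourceFolder, destPath
            Next
            ' recurse top-down, so each parent exists before its children
            For Each objSub In objFolder.SubFolders
                MirrorTree objSub.Path, strSourceFolder, strDestFolder
            Next
        End Sub

        ' called once, in place of the two For Each loops:
        ' MirrorTree strSourceFolder, strSourceFolder, strDestFolder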

    Read the article

  • Installing sfJqueryTreeDoctrineManagerPlugin to Symfony 1.4

    - by Christine Q.
    I have faced serious difficulties while installing sfJqueryTreeDoctrineManagerPlugin into Symfony 1.4 with the Doctrine ORM. Installing directly from the server did not work the way it had for previous plugins I have installed:

        C:\path> symfony plugin:install sfJqueryTreeDoctrineManagerPlugin
        plugin  installing plugin "sfJqueryTreeDoctrineManagerPlugin"
        No release available for plugin "sfJqueryTreeDoctrineManagerPlugin"

    This is why I needed to install the plugin by downloading the tgz archive and installing it manually:

        C:\path> symfony plugin:install "C:\path\to\downloads\sfJqueryTreeDoctrineManagerPlugin-1.2.4.tgz"
        plugin  installing plugin "C:\path\to\downloads\sfJqueryTreeDoctrineManagerPlugin-1.2.4.tgz"
        sfSymfonyPluginManager  Installing web data for plugin

    I guess everything should be fine this far? After that I edited \apps\admin\config\settings.yml as instructed in the plugin's readme file:

        all:
          .settings:
            enabled_modules: [default, sfJqueryTreeDoctrineManager]

    I also checked that the plugin was enabled in \config\ProjectConfiguration.class.php:

        $this->enablePlugins(array(
            // other plugins,
            'sfJqueryTreeDoctrineManagerPlugin'
        ));

    I published assets and cleared the cache:

        C:\path> symfony plugin:publish-assets
        >> plugin  Configuring plugin - sfJqueryTreeDoctrineManagerPlugin
        C:\path> symfony cc

    Finally, I added the required helper to the newly created apps\admin\modules\category\templates\indexSuccess.php:

        <?php use_helper("sfJqueryTreeDoctrine");
        echo get_nested_set_manager("Category", "name");

    When loading the page, I unfortunately get the following error:

        500 | Internal Server Error | InvalidArgumentException
        Unable to load "sfJqueryTreeDoctrineHelper.php" helper in:
        SF_ROOT_DIR\apps\admin\modules/businessunitgroup/lib/helper,
        SF_ROOT_DIR\apps\admin\lib/helper, SF_ROOT_DIR\lib/helper,
        SF_SYMFONY_LIB_DIR/helper.

    The file sfJqueryTreeDoctrineHelper.php does exist, but not in any of the folders mentioned above; it can only be found in the folder \plugins\sfJqueryTreeDoctrineManagerPlugin\lib\helper. I guess Symfony doesn't look in that folder when loading helpers? I have tried moving the helper file to one of the folders mentioned above. As expected, that changes the error; now I get:

        500 | Internal Server Error | sfConfigurationException
        The component does not exist: "sfJqueryTreeDoctrineManager", "manager".

    Unfortunately, I can't figure out how to make Symfony pick up the "missing" component from the correct folder. I would be very grateful for any advice to help me forward. By the way, I am aware that there are other nested-set/tree plugins available for Symfony (like sfDoctrineTreePlugin and caPropelTreePlugin), but unluckily those are either incompatible with or too limited for my needs.

    Read the article

  • Setting directory security to allow user and deny all

    - by Rita
    I have a WinForms app in which I need to access a secured directory. I'm using impersonation and create a WindowsIdentity to access the folder. My problem is writing unit tests to test the directory security; I'd like to write code that creates a directory secured to only ONE user, which isn't the current user running the UT (or else the test would be worthless). I know how to add permissions for a certain user, but how can I deny the rest, including admins? (In case the user running the UT is an admin.) (Would this be a wise thing to do?)

        DirectoryInfo directoryInfo = new DirectoryInfo(path);
        DirectorySecurity directorySecurity = directoryInfo.GetAccessControl();
        directorySecurity.AddAccessRule(new FileSystemAccessRule("Domain\SecuredUser",
            FileSystemRights.FullControl,
            InheritanceFlags.ContainerInherit | InheritanceFlags.ObjectInherit,
            PropagationFlags.InheritOnly,
            AccessControlType.Allow));
        directorySecurity.RemoveAccessRule(new FileSystemAccessRule("??",
            FileSystemRights.FullControl,
            InheritanceFlags.ContainerInherit | InheritanceFlags.ObjectInherit,
            PropagationFlags.InheritOnly,
            AccessControlType.Deny));
        directoryInfo.SetAccessControl(directorySecurity);

    This isn't working. I don't know whom I am supposed to deny: Domain\Admins, Domain\Administrators, me... No one is being denied, and when I check the folder's security, the SecuredUser has access to the folder but the permissions are not checked, even though I specified FullControl. Basically I want to code this:

        <authorization>
          <allow users ="Domain\User" />
          <deny users="*" />
        </authorization>

    I was thinking about impersonating the UT run with a weak user that has no permissions, but this would result in: impersonate - run UT - impersonate - access folder, and I'm not sure if this is the right design. Help would be greatly appreciated, thank you.
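
    A sketch of one approach (an assumption, not a tested answer): instead of enumerating groups to deny, cut off rule inheritance so that a single explicit allow is all that remains. Note also that PropagationFlags.InheritOnly in the snippet above makes the rule apply only to children, not to the folder itself, which may be why the permissions appear unchecked. Administrators can still take ownership, so a true lockout of admins isn't achievable this way:

        DirectoryInfo directoryInfo = new DirectoryInfo(path);
        DirectorySecurity security = directoryInfo.GetAccessControl();
        // drop inherited ACEs instead of adding deny rules:
        // true = protect from inheritance, false = don't copy inherited rules over
        security.SetAccessRuleProtection(true, false);
        security.AddAccessRule(new FileSystemAccessRule(@"Domain\SecuredUser",
            FileSystemRights.FullControl,
            InheritanceFlags.ContainerInherit | InheritanceFlags.ObjectInherit,
            PropagationFlags.None,   // applies to this folder and its children
            AccessControlType.Allow));
        directoryInfo.SetAccessControl(security);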

    Read the article

  • SharePoint: Filtering a List that has Folders

    - by Gary McGill
    I have a SharePoint document library with a folder structure used for organizing the documents (and also for controlling access, via permissions on the folders). The documents in the library are updated every month, and we store every month's version of a document in the same folder; there's a "Month" column, used for filtering, that contains values like Jan 09, Feb 09, etc. It looks like this:

        Title                   Month
        -----                   -----
        SubFolder 1
        SubFolder 2
        [] Interesting Facts    Jan 09
        [] Interesting Facts    Feb 09
        [] Interesting Facts    Mar 09
        [] Fascinating Numbers  Jan 09
        [] Fascinating Numbers  Feb 09
        ...

    Now, because users will generally be most interested in the 'current' month, I'd like them to be able to apply a filter and select (say) Mar 09. However, if they do this using the built-in filtering, it also filters out the folders, and they can no longer navigate the folder hierarchy. This is no good: I want them to be able to move between folders with the filter intact, so that they don't need to keep switching it off and on again. I figured I might be able to use a custom view (selecting where type=folder or month=[month]), and to an extent that does work. However, I can only get it to work for a fixed month, whereas I need the user to be able to select the month, perhaps via a drop-down control on the page (and I don't want to create 60 views for 5 years' worth of months, nor do I want to have to create a new view every month). I thought it might be possible to create a view in code (rather than via the UI), but I've not been able to figure out how to get a dynamic value (a user-specific setting) into the CAML query. Any pointers gratefully appreciated! And by the way, I am aware of the dogma that folders are bad and that everything should just be a list. However, having considered the alternatives, I still favour using folders, if I can solve this problem. Thanks in advance.
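
    For reference, the "type=folder or month=[month]" condition from the custom view looks roughly like this in CAML (FSObjType = 1 is SharePoint's built-in marker for folders; the Month field's internal name is an assumption):

        <Where>
          <Or>
            <Eq><FieldRef Name='FSObjType' /><Value Type='Integer'>1</Value></Eq>
            <Eq><FieldRef Name='Month' /><Value Type='Text'>Mar 09</Value></Eq>
          </Or>
        </Where>

    The open problem the poster describes is substituting the Mar 09 literal per user at runtime.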

    Read the article

  • More localized, efficient Lowest Common Ancestor algorithm given multiple binary trees?

    - by mstksg
    I have multiple binary trees stored as an array. Each slot is either nil (or null; pick your language) or a fixed tuple storing two numbers: the indices of the two "children". No node will have only one child: it's either none or two. Think of each slot as a binary node that only stores pointers to its children, and no inherent value. Take this system of binary trees:

              0           1
             / \         / \
            2   3       4   5
           / \             / \
          6   7           8   9
             / \
            10  11

    The associated array would be:

        index:   0      1      2      3    4    5      6    7        8    9    10   11
        array: [ [2,3], [4,5], [6,7], nil, nil, [8,9], nil, [10,11], nil, nil, nil, nil ]

    I've already written simple functions to find the direct parent of a node (simply by searching from the front until there is a node that contains the child). Furthermore, let us say that at relevant times, all trees are anywhere between a few and a few thousand levels deep.

    I'd like to find a function P(m,n) that finds the lowest common ancestor of m and n. To put it more formally, the LCA is defined as the "lowest", or deepest, node which has both m and n as descendants (children, or children of children, etc.). If there is none, a nil would be a valid return. Some examples, given our tree:

        P( 6,11) # => 2
        P( 3,10) # => 0
        P( 8, 6) # => nil
        P( 2,11) # => 2

    The main method I've been able to find is one that uses an Euler trace, which turns the given tree (with a node A taken to be the invisible parent of 0 and 1, with a depth of -1) into:

        A-0-2-6-2-7-10-7-11-7-2-0-3-0-A-1-4-1-5-8-5-9-5-1-A

    From that, simply find the node between your given m and n that has the lowest number. For example, to find P(6,11), look for a 6 and an 11 on the trace; the lowest number between them is 2, and that's your answer. If A is between them, return nil.

        -- Calculating P(6,11) --
        A-0-2-6-2-7-10-7-11-7-2-0-3-0-A-1-4-1-5-8-5-9-5-1-A
              ^ ^        ^
              m lowest   n

    Unfortunately, I do believe that finding the Euler trace of a tree that can be several thousand levels deep is a bit machine-taxing, and because my tree is constantly being changed throughout the course of the program, every time I wanted to find the LCA I'd have to re-calculate the Euler trace and hold it in memory.

    Is there a more memory-efficient way, given the framework I'm using? One that maybe iterates upwards? One way I could think of would be to "count" the generation/depth of both nodes, climb the lower node until it matches the depth of the higher one, and then increment both until they meet. But that'd involve climbing up from level, say, 3025, back to 0, twice, just to count the generations, using a terribly inefficient climbing-up algorithm in the first place, and then re-climbing back up. Are there any other better ways?
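
    A sketch of the "iterate upwards" idea with the repeated front-to-back scanning factored out: one O(n) pass over the array builds a child-to-parent map, after which each query just climbs parent pointers (Python purely for illustration, since the question is language-agnostic):

        # one O(n) pass: invert the child lists into a child -> parent map
        def build_parents(nodes):
            parent = {}
            for i, children in enumerate(nodes):
                if children is not None:
                    for c in children:
                        parent[c] = i
            return parent

        def lca(parent, m, n):
            # collect all ancestors of m (including m), then walk up from n
            seen = {m}
            while m in parent:
                m = parent[m]
                seen.add(m)
            while n not in seen:
                if n not in parent:
                    return None  # n's root reached: different trees, no LCA
                n = parent[n]
            return n

        nodes = [[2,3],[4,5],[6,7],None,None,[8,9],None,[10,11],None,None,None,None]
        p = build_parents(nodes)
        print(lca(p, 6, 11))  # 2
        print(lca(p, 3, 10))  # 0
        print(lca(p, 8, 6))   # None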

    Read the article

  • Right rotate of tree in Haskell: how does it work?

    - by Roman
    I don't know Haskell syntax, but I know some FP concepts (like algebraic data types, pattern matching, higher-order functions, etc.). Can someone please explain what this code means:

        data Tree a = Leaf a | Fork a (Tree a) (Tree a)

        rotateR tree = case tree of
          Fork q (Fork p a b) c -> Fork p a (Fork q b c)

    As I understand it, the first line is something like a Tree-type declaration (but I don't understand it exactly). The second part includes pattern matching (and I don't understand why we need pattern matching here either). And the last line does something absolutely unreadable for a non-Haskell developer. I've found a definition of fork as fork (f,g) x = (f x, g x), but I can't move any further.
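
    A small runnable sketch that may help. It assumes the '?' characters in the original post are mis-encoded type variables (spelled a here), and note that Fork in this snippet is a data constructor of Tree, unrelated to the fork (f,g) x = (f x, g x) function found elsewhere:

        data Tree a = Leaf a | Fork a (Tree a) (Tree a) deriving Show

        rotateR :: Tree a -> Tree a
        rotateR tree = case tree of
          Fork q (Fork p a b) c -> Fork p a (Fork q b c)
          _                     -> tree  -- added here: leave non-rotatable trees as-is

        -- q's left child p is pulled up to the root; p's right subtree b
        -- becomes the left subtree of the demoted q
        example :: Tree Char
        example = Fork 'q' (Fork 'p' (Leaf 'a') (Leaf 'b')) (Leaf 'c')

        main :: IO ()
        main = print (rotateR example)
        -- Fork 'p' (Leaf 'a') (Fork 'q' (Leaf 'b') (Leaf 'c'))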

    Read the article
