Search Results

Search found 5490 results on 220 pages for 'shadow folders'.

Page 96/220 | < Previous Page | 92 93 94 95 96 97 98 99 100 101 102 103  | Next Page >

  • How do I get folder size with Exchange Web Services 2010 Managed API?

    - by Adam Tuttle
    I'm attempting to use EWS 2010 Managed API to get the total size of a user's mailbox. I haven't found a web service method to get this data, so I figured I would try to calculate it. I found one seemingly-applicable question on another site about finding mailbox sizes with EWS 2007, but either I'm not understanding what it's asking me to do, or that method just doesn't work with EWS 2010. Noodling around in the code insight, I was able to write what I thought was a method that would traverse the folder structure recursively and result in a combined total for all folders inside the Inbox: private int traverseChildFoldersForSize(Folder f) { int folderSizeSum = 0; if (f.ChildFolderCount > 0) { foreach (Folder c in f.FindFolders(new FolderView(10000))) { folderSizeSum += traverseChildFoldersForSize(c); } } folderSizeSum += (int)f.ManagedFolderInformation.FolderSize; return folderSizeSum; } (Assumes there aren't more than 10,000 folders inside a given folder. Figure that's a safe bet...) Unfortunately, this doesn't work. I'm initiating the recursion with this code: Folder root = Folder.Bind(svc, WellKnownFolderName.Inbox); int totalSize = traverseChildFoldersForSize(root); But a Null Pointer Exception is thrown, essentially saying that [folder].ManagedFolderInformation is a null object reference. For clarity, I also attempted to just get the size of the root folder: Console.Write(root.ManagedFolderInformation.FolderSize.ToString()); Which threw the same NPE exception, so I know that it's not just that once you get to a certain depth in the directory tree that ManagedFolderInformation doesn't exist. Any ideas on how to get the total size of the user's mailbox? Am I barking up the wrong tree?
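
    One approach that is often suggested for this question (a sketch, not taken from the post: it reads the PR_MESSAGE_SIZE_EXTENDED MAPI property, tag 0x0E08, since ManagedFolderInformation appears to be populated only for folders governed by a managed-folder policy, which would explain the null reference) is to request the folder size as an extended property over a deep folder traversal:

      // Sketch: read each folder's size from the PR_MESSAGE_SIZE_EXTENDED extended
      // property (tag 0x0E08) and sum it over a deep traversal of the mailbox.
      // "svc" is the ExchangeService instance from the post.
      ExtendedPropertyDefinition folderSizeProp =
          new ExtendedPropertyDefinition(0x0E08, MapiPropertyType.Long);

      FolderView view = new FolderView(1000)   // page this if the mailbox has more than 1000 folders
      {
          PropertySet = new PropertySet(BasePropertySet.IdOnly, folderSizeProp),
          Traversal = FolderTraversal.Deep
      };

      long totalBytes = 0;
      foreach (Folder f in svc.FindFolders(WellKnownFolderName.MsgFolderRoot, view))
      {
          long size;
          if (f.TryGetProperty(folderSizeProp, out size))
              totalBytes += size;
      }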

    Read the article

  • xcodebuild: error: 'file' is not a workspace file

    - by Vladimir Voitekhovski
    There was no problem before, but now my job in Jenkins CI fails. I tried renaming the Xcode workspace to RacingPost.xcworkspace with no change, and I also tried removing these options altogether; that job failed too. Configuration:

      XCODEPROJECTDIRECTORY=.
      TARGET_BUILD_DIR=${XCODEPROJECTDIRECTORY}/build
      XCODEWORKSPACE=RacingPost.xcodeproj
      XCODESCHEME=UnitTests
      XCODECONFIGURATION=ENTERPRISE-HD
      XCODESDK=iphoneos
      XCODEARGS="TEST_AFTER_BUILD=YES"
      XCODEBUILD_APP_NAME=RacingPost.app
      XCODETARGET=UnitTests
      XCODEPROJECT=RacingPost.xcodeproj
      IPA_PATH=${TARGET_BUILD_DIR}/RacingPost.ipa

      xcrun -sdk iphoneos PackageApplication -v "$(${XCTOOL_HOME}/xctool.sh -scheme RacingPost -project ${XCODEPROJECT} -configuration "${XCODECONFIGURATION}" -sdk "${XCODESDK}" -showBuildSettings -workspace ${WORKSPACE} | grep TARGET_BUILD_DIR | cut -d = -f 2 | cut -d . -f 1 | head -1 | sed 's/^[ ^t]*//')/${XCODEBUILD_APP_NAME}" -o "${IPA_PATH}"

    Output:

      17:33:25 ** BUILD SUCCEEDED ** (58104 ms)
      17:33:27 [RGP-ODC_RacingPost_iPad_staging] $ /bin/bash -xe /var/folders/df/575wx61n4dzdlw_48pgsjwk40000gn/T/hudson6052564280091633098.sh
      17:33:27 + cd .
      17:33:27 ++ /Users/epadmin/ci-tools/xctool/xctool.sh -scheme RacingPost -project RacingPost.xcodeproj -configuration ENTERPRISE-HD -sdk iphoneos -showBuildSettings -workspace /Users/epadmin/jenkins-slave/workspace/RGP-ODC_RacingPost_iPad_staging
      17:33:27 ++ grep TARGET_BUILD_DIR
      17:33:27 ++ cut -d = -f 2
      17:33:27 ++ cut -d . -f 1
      17:33:27 ++ head -1
      17:33:27 ++ sed 's/^[ ^t]*//'
      17:33:30 xcodebuild: error: '/Users/epadmin/jenkins-slave/workspace/RGP-ODC_RacingPost_iPad_staging' is not a workspace file.
      17:33:30 + xcrun -sdk iphoneos PackageApplication -v /RacingPost.app -o ./build/RacingPost.ipa
      17:33:30 error: Specified application doesn't exist or isn't a bundle directory : '/RacingPost.app'
      17:33:30 Build step 'Execute shell' marked build as failure

    My folder structure on the node:

      epadmin@epclus1macp02:~/jenkins-slave/workspace/RGP-ODC_RacingPost_iPad_staging$ ls
      drwxr-xr-x 21 epadmin staff 714B May 29 10:28 ./
      drwxr-xr-x 37 epadmin staff 1.2K May 29 10:33 ../
      drwxr-xr-x 7 epadmin staff 238B May 29 10:28 .svn/
      drwxr-xr-x 10 epadmin staff 340B Apr 18 09:02 RacingPost/
      drwxr-xr-x 10 epadmin staff 340B May 29 09:27 RacingPost.xcodeproj/
      drwxr-xr-x 10 epadmin staff 340B May 7 09:04 RacingPostUtilApp/
      drwxr-xr-x 3 epadmin staff 102B May 18 02:16 Source/
      drwxr-xr-x 75 epadmin staff 2.5K Apr 16 09:03 UnitTests/
      drwxr-xr-x 16 epadmin staff 544B May 29 09:05 _certs/
      drwxr-xr-x 3 epadmin staff 102B Apr 15 09:03 _doc/
      drwxr-xr-x 4 epadmin staff 136B Apr 15 09:03 _provisioning/
      drwxr-xr-x 3 epadmin staff 102B May 29 10:31 build/
      -rw-r--r-- 1 epadmin staff 9.9K May 19 04:39 build.xml

    Read the article

  • Relative path from an ASP.NET user control NavigateUrl

    - by Daniel Ballinger
    I have a user control that contains a GridView. The GridView has both a HyperLinkField column and a template column that contains a HyperLink control. The ASP.NET project is structured as follows, with the Default.aspx page in each case using the user control. Application Root Controls UserControl with GridView SystemAdminFolder Default.aspx Edit.aspx OrganisationAdminFolder Default.aspx Edit.aspx StandardUserFolder Default.aspx Edit.aspx Note: The folders are being used to ensure the user has the correct role. I need to be able to set the DataNavigateUrlFormatString for the HyperLinkField and the NavigateUrl for the HyperLink to resolve to the Edit.aspx page in the corresponding folder. If I set the navigate URL to "Edit.aspx" the URL in the browser appears as 'http://Application Root/Controls/Edit.aspx' regardless of the originating directory. I can't use the Web application root operator (~/) as the path needs to be relative to the current page, not the application root. How can I use the same user control in multiple folders and resolve the URL to another page in the same folder? Note: The question is strongly based off a similar question by azhar2000s on the asp.net forums that matches my problem.
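
    One workaround that is sometimes used for this (a sketch, not from the original question; GridView1, the column index, the EditLink control ID and itemId are assumed names) is to build the URL in the user control's code-behind from the directory of the hosting page, so that "Edit.aspx" resolves next to the current page rather than under /Controls:

      // Sketch: resolve "Edit.aspx" against the folder of the page that hosts the control.
      string pageFolder = VirtualPathUtility.GetDirectory(Page.AppRelativeVirtualPath); // e.g. "~/SystemAdminFolder/"
      string editUrl = ResolveUrl(pageFolder + "Edit.aspx?id={0}");

      // HyperLinkField column (assumed to be the first column of GridView1):
      var linkColumn = (HyperLinkField)GridView1.Columns[0];
      linkColumn.DataNavigateUrlFormatString = editUrl;

      // The HyperLink in the template column can reuse the same base URL, e.g. in RowDataBound:
      // ((HyperLink)e.Row.FindControl("EditLink")).NavigateUrl = String.Format(editUrl, itemId);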

    Read the article

  • How could installations/configurations be easier in Linux?

    - by ajsie
    Although you can do anything in Linux, it tends to require a lot of tweaking in config files and reading a lot of manuals/tutorials before you have it running the way you want. I know that it gets a lot easier over time, and the apt-get installations in Ubuntu/Debian are heading the right way. But how could Linux be more user-friendly in the future? I thought that more could be automated, like in an IDE environment: e.g. typing svn would show all the commands and a description of each as you move between them with your keyboard. That would be great, but that's just one example. Another is navigating between folders in the terminal; right now you have to type a lot just to jump from one folder to another, and some more automation here would be welcome too. I know these extra features would slow down the server, but it's 2010 now; they are not that heavy on the CPU, and they would make the system more user-friendly and encourage maintenance of a server rather than frighten you off. What do you think? Should/could we have a more user-friendly Linux environment on servers? Is there something that has annoyed you a lot? A lot of things are done the Unix way, but maybe we should reinvent the wheel in some areas, because apparently it is still repetitive and hard to do easy tasks. It should be easier, I think.

    Read the article

  • Silverlight 4 Drag and Drop Treeview

    - by Rich
    Does anyone have an example for any of the following scenarios? These are all dynamically populated trees, built not with a hierarchical data template but by iterating through object collections manually and appending children at the appropriate level. 1. TreeView1 has 3 levels, but items can only be reordered within their level. So, let's say we have Drives, Folders and Files. Drives can be rearranged in order, but not put into a Folder. When navigated down one level in a drive, the individual folders can be reordered but not dragged between drives, and the same goes for files: they can only be reordered, not moved to a different folder or drive. 2. I have two treeviews; TreeView1 is the same as #1 above and TreeView2 is like a picklist of available items. A user can drag an item from TreeView2 to TreeView1, but it can only be placed at TreeView1's file level. The dragged item cannot be a child of a file, nor placed at the folder level or the drive level. Also, how do I handle dropping above, on top of, or below an item? I have yet to come across examples of these.

    Read the article

  • Is there a detailed description of optimizations in the Android build process?

    - by Daniel Lew
    I've been curious as to all the optimizations that go into the building of an .apk. I'm curious because of two things I've tried in the past to bring down the size of my .apk: I have had a few large json assets in projects before, as well as a static sqlite database. I tried bringing down the size of the apk by gzipping them before the build process, but the resulting size is exactly the same. I just today tried pngcrush on my /drawable/ folders. The resulting build was exactly the same size as before. I would think that perhaps #1 could be explained by the zip process, but simply zipping the /drawable/ folders in #2 result in different-sized files. Perhaps the build process runs something akin to pngcrush? Regardless, I was wondering if anyone knew where to find a detailed description of all the optimizations in the Android build process. I don't want to waste my time trying to optimize what is already automated, and also I think it'd help my understanding of the resulting apk. Does anyone know if this is documented anywhere?

    Read the article

  • AjaxControlToolkit Resource Files Not Copied To Output in MSBuild Script

    - by Dario Solera
    I'm new to MSBuild, but I managed to setup the following simple script: <Project ToolsVersion="3.5" DefaultTargets="Compile" xmlns="http://schemas.microsoft.com/developer/msbuild/2003"> <PropertyGroup> <Configuration Condition="'$(Configuration)' == ''">Debug</Configuration> </PropertyGroup> <ItemGroup> <SolutionRoot Include=".." /> <BuildArtifacts Include=".\Artifacts\" /> <SolutionFile Include="..\SolutionName.sln" /> </ItemGroup> <Target Name="Clean"> <RemoveDir Directories="@(BuildArtifacts)" /> </Target> <Target Name="Init" DependsOnTargets="Clean"> <MakeDir Directories="@(BuildArtifacts)" /> </Target> <Target Name="Compile" DependsOnTargets="Init"> <MSBuild Projects="@(SolutionFile)" Properties="OutDir=%(BuildArtifacts.FullPath);Configuration=$(Configuration)" /> <MakeDir Directories="%(BuildArtifacts.FullPath)\_PublishedWebsites\RDE.XAP.UnifiedGui.Web\Temp" /> </Target> </Project> The solution has 23 projects, 4 of which are WebApps. Now, the script works fine and the output is generated correctly. The only problem I counter is with two WebApp projects in the solution that use the AJAX Control Toolkit. The toolkit has a set of folders (e.g. ar, it, es, fr) that contain localized resources. These folders are not copied in the bin directory of the WebApps when the solution is built in MSBuild, but they are copied when it is built in Visual Studio. How can I solve this in a clean manner? I know I could write a (quite convoluted) task that copies the directories after the compile, but it does not seem the right solution to me. Also, neither Google, SO and MSDN could provide more details on this kind of issue.

    Read the article

  • Sorting shell items like Windows Explorer

    - by Roy M Klever
    I am making a breadcrumb bar in Delphi and having some problems sorting the dropdown of the breadcrumbs. Strangely enough, even Vista is not consistent when showing these items. I have tried many ways to figure out which items are system folders, which are zip files and which are normal folders. It seems like an easy task, but so far I have not found any good way of doing it. One way is to use TypeDisplayName from TSHFileInfo, but those are localized names, so I cannot be sure they will be in the correct order in every language. Here is the code I use to fill the menu:

      bool := IsDesktop(SelectedPIDL);
      if bool then
        OleCheck(SHGetDesktopFolder(CurFolder))
      else
        OleCheck(DesktopShellFolder.BindToObject(SelectedPIDL, nil, IID_IShellFolder, Pointer(CurFolder)));
      if CurFolder.EnumObjects(0, SHCONTF_FOLDERS, EnumIDList) = NOERROR then
      begin
        while EnumIDList.Next(1, CurPidl, Fetched) = S_OK do
        begin
          FName := GetDisplayName(CurFolder, CurPidl, SHGDN_NORMAL);
          Text := GetPIDLNameForAddressBar(CurFolder, CurPidl);
          if bool then Text := PSpecialFolderItem(SpecialFolders[0]).Name + '\' + Text;
          if Text[Length(Text)] <> '\' then Text := Text + '\';
          NewPidl := ConcatPIDLs(SelectedPIDL, CurPidl);
          SHGetFileInfo(PChar(NewPidl), 0, SFI, SizeOf(SFI), SHGFI_ATTRIBUTES or SHGFI_PIDL or SHGFI_SYSICONINDEX or SHGFI_TYPENAME);
          n := SFI.dwAttributes;
          MenuList.Add(GetAttr(n) + FName);
          AddMenuItem(Text, FName, SFI.iIcon);
          CoTaskMemFree(CurPidl);
          CoTaskMemFree(NewPidl);
        end;
      end;
      CoTaskMemFree(SelectedPIDL);

    Is there any way to get the correct sorting order? It is strange that there is no flag in dwAttributes of TSHFileInfo to tell whether a folder is a system folder.

    Read the article

  • Weird Issues with WPMU I Can't Figure Out

    - by HollerTrain
    I know I should post this on the WPMU forum, but no one writes me back and I'm just trying to find a larger audience hoping you have run into this issue as well. I have built a WPMU site for a client, and I am able to upload media into the Media Library and within a Post or Page perfectly. I thought my job was finished, yet the client can't upload any media at all. I'm located in Kentucky, they are located in New England (if that even matters). I had the client record their process of uploading as I thought they were simply not following my instructions for uploading, yet they are doing everything correctly. When uploading a file it goes through the process of allowing them to select a file and it says it uploads it, yet when it is finished uploading nothing is in the Media Library or in the Post. Video of the client trying to upload in Media Manager (http://www.screencast.com/users/CatherineWeber/folders/Jing/media/945d33fa-a752-45fd-9bc1-f76fc5a1814a) Video of the client trying to upload within a Post (http://www.screencast.com/users/CatherineWeber/folders/Jing/media/b5c60e25-f0b5-40c0-a820-c2fc9eb00906) Asking the client to disable Flash Uploader didn't work :( Yet, I can login to the WPMU site, access their blog's backend and can easily upload a ton of files. I am so lost at to what the issue is here. I am running version 2.8.4a, and will try to upgrade to latest release hoping this will fix things.

    Read the article

  • What is the root directory OR how do I set the directory in DotNetZip

    - by Chris
    where does DotNetZip get it's root directory for saving. All the save examples don't show the directory. My goal is to recurse a folder and subfolders. In each folder I want to zip all the files into one zip and delete the source files. private void CopyFolder(string srcPath, string dstPath) { if (!Directory.Exists(dstPath)) Directory.CreateDirectory(dstPath); string[] files = Directory.GetFiles(srcPath); string msg; string zipFileName; using (ZipFile z = new ZipFile(Path.Combine(srcPath,String.Format("Archive{0:yyyyMMdd}.zip", DateTime.Now)))) { z.ReadProgress += new EventHandler<ReadProgressEventArgs>(z_ReadProgress); foreach (string file in files) { FileInfo fi = new FileInfo(file); AddLog(String.Format("Adding {0}", file)); z.AddFile(file); } //z.Save(Path.Combine(srcPath, String.Format("Archive{0:yyyyMMdd}.zip", DateTime.Now))); z.Save(); if (deleteSource) { foreach (string file in files) { File.Delete(file); } } zipFileName = z.Name; } if (!compressOnly) File.Copy(Path.Combine(srcPath,zipFileName), Path.Combine(dstPath, Path.GetFileName(zipFileName))); string[] folders = Directory.GetDirectories(sourcePath); foreach (string folder in folders) { string name = Path.GetFileName(folder); string dest = Path.Combine(dstPath, name); Console.WriteLine(ln); log.Add(ln); msg = String.Format("{3}{4}Start Copy: {0}{4}Directory: {1}{4}To: {2}", DateTime.Now.ToString("G"), name, dest, ln, Environment.NewLine); AddLog(msg); if (recurseFolders) CopyFolder(folder, dest); msg = String.Format("Copied Directory: {0}{4}To: {1}\nAt: {2}{3}", folder, dest, DateTime.Now.ToString("G"), Environment.NewLine); AddLog(msg); } }
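
    If I understand DotNetZip's documented behaviour correctly (a sketch, not the poster's final code), Save() with no argument writes to whatever path was passed to the ZipFile constructor, i.e. ZipFile.Name, so there is no separate root directory to configure; passing a full path to the constructor pins the output location:

      // Sketch: the constructor argument becomes ZipFile.Name, and z.Save() writes there.
      string zipPath = Path.Combine(srcPath, String.Format("Archive{0:yyyyMMdd}.zip", DateTime.Now));
      using (ZipFile z = new ZipFile(zipPath))
      {
          foreach (string file in Directory.GetFiles(srcPath))
              z.AddFile(file, "");   // "" keeps the entries at the root of the archive
          z.Save();                  // saves to zipPath (z.Name), not to the current working directory
      }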

    Read the article

  • Why are Mercurial subrepos behaving as unversioned files in Eclipse and TortoiseHG?

    - by noam
    I am trying to use the subrepo feature of mercurial, using the mercurial eclipse plugin\tortoiseHG. These are the steps I took: Created an empty dir /root cloned all repos that I want to be subrepos inside this folder (/root/sub1, /root/sub2) Created and added the .hgsub file in the root repo /root/.hgsub and put all the mappings of the sub repos in it using tortoiseHG, right clicked on /root and selected create repository here again with tortoise, selected all the files inside /root and added them to to the root repo commited the root repo pushed the local root repo into an empty repo I have set up on kiln Then, I pulled the root repo in eclipse, using import-mercurial. Now I see that all the subrepos appear as though they are unversioned (no "orange cylinder" icon next to their corresponding folders in the eclipse file explorer). Furthermore, when I right click on one of the subrepos, I don't get all the hg commands in the "team" menu as I usually get, with root projects - no "pull", "push" etc. Also, when I made a change to a file in a subrepo, and then "committed" the root project, it told me there were no changes found. I see the same behavior also in tortoiseHG - When I am browsing files under /root, the files belonging directly to the root repo have an small icon (a V sign) on them marking they are version controlled, while the subrepos' folders aren't marked as such. Am I doing something wrong, or is it a bug?

    Read the article

  • Add new SVN "repo" in poorly constructed repo/project setup

    - by Dave Masselink
    Unfortunately, the answer to this question isn't quite as simple as it sounds... but I hope it can still be relatively simple. Please read all the way through before telling me that the answer is: "svnadmin create... duh" I'm working for a company that set up their SVN server in an odd way (at least in terms of what I'm used to). We've all been there, right? Rather than giving each project a separate repository... they have a folder on the server called "/var/www/svn/repos/" which is the actual SVN repo (has conf/, db/, README.txt, etc. in it). Then they distinguish their projects by adding top level folders into the ONE repository (ex: Project1, Project2, etc.) I don't like this setup and might one day get around to converting the setup to what I'm used to, where each project is its own repository (with separate logs, dbs, etc.) But my question is this: What is the best way to add a new empty project to the current setup? Is there anyway to add a new top level folder/project to the repo through use of svnadmin? It can/should just be an empty folder that I'll start building a new project in. I know that I could do this by checking out the whole singular repository and then adding a new top level folder into my local checkout, then re-committing. But I'd really prefer not to do this because someone has created folders/projects that are just GBs of log data... and I don't want to wait through the download of this just to add a single empty folder. Let me know if there is any more info you'd need to know. I do have root/sudo access on the server in question. Thanks in advance for your help! Dave

    Read the article

  • Web development using Google Sites

    - by CHID
    Hi, I have hosted a website, but now my client asks to move it to http://sites.google.com. They have registered their domain with Google. I logged in to the site and saw the procedure for creating a website from scratch, but is there any way to move the existing site directly into Google Sites? For example, my site includes CSS files in a folder called stylesheets/css, accessed through the link tag, and there are several folders like that (images, scripts, etc.). 1. If I have to transfer it to Google Sites, where do I create the folders and related files? I have the privileges to log in to the admin part of the site. 2. Is there any way to create a database and access it in Google Sites? 3. I also see that only HTML pages can be created; is there a way to add PHP pages or other scripting languages? 4. Going forward, will Google Sites be useful for professional web design? Please give your thoughts on whether or not Google Sites is a good idea to go with.

    Read the article

  • Split large repo into multiple subrepos and preserve history (Mercurial)

    - by Andrew
    We have a large base of code that contains several shared projects, solution files, etc in one directory in SVN. We're migrating to Mercurial. I would like to take this opportunity to reorganize our code into several repositories to make cloning for branching have less overhead. I've already successfully converted our repo from SVN to Mercurial while preserving history. My question: how do I break all the different projects into separate repositories while preserving their history? Here is an example of what our single repository (OurPlatform) currently looks like: /OurPlatform ---- Core ---- Core.Tests ---- Database ---- Database.Tests ---- CMS ---- CMS.Tests ---- Product1.Domain ---- Product1.Stresstester ---- Product1.Web ---- Product1.Web.Tests ---- Product2.Domain ---- Product2.Stresstester ---- Product2.Web ---- Product2.Web.Tests ==== Product1.sln ==== Product2.sln All of those are folders containing VS Projects except for the solution files. Product1.sln and Product2.sln both reference all of the other projects. Ideally, I'd like to take each of those folders, and turn them into separate Hg repos, and also add new repos for each project (they would act as parent repos). Then, If someone was going to work on Product1, they would clone the Product1 repo, which contained Product1.sln and subrepo references to ReferenceAssemblies, Core, Core.Tests, Database, Database.Tests, CMS, and CMS.Tests. So, it's easy to do this by just hg init'ing in the project directories. But can it be done while preserving history? Or is there a better way to arrange this?

    Read the article

  • Cannot upload media via Wordpress uploader

    - by Justin Johnson
    This has to do with media uploading in WordPress. Every time WP creates a folder for new uploads (it organizes uploads by year and month: yyyy/mm), it creates it with the "apache:apache" user and group, with full access for all (777 or drwxrwxrwx). However, after that, WP cannot create a folder within that folder (e.g. mkdir 2011 succeeds, but mkdir 2011/01 fails). Also, uploads cannot be moved into these newly created folders even though the permissions are 777 (rwxrwxrwx). Once a month, I have to chown the newly created folders to the same user:group as the rest of the files. Once I do that, uploading works fine (which doesn't make sense to me). The really frustrating part is that this problem doesn't exist in other WP installs on other domains on the same server. * I wasn't sure if this should be here or on serverfault. Edit: The containing directory /.../httpdocs/blog/wp-content/uploads has the correct ownership:

      drwxrwxrwx 5 myuser psaserv 4096 Jun 3 18:38 uploads

    This is a Plesk/CentOS environment hosted by Media Temple (dv). I've written the following test script to simulate the problem:

      <?php
      $d = "d" . mt_rand(100, 500);
      var_dump(
          get_current_user(),
          $d,
          mkdir($d),
          chmod($d, 0777),
          mkdir("$d/$d"),
          chmod("$d/$d", 0777),
          fileowner($d),
          getmyuid()
      );

    The script always creates the first directory, mkdir($d), successfully. On domain A, where the WP problem is, it cannot create the nested directory mkdir("$d/$d"). However, on domain B, both directories are successfully created. I am running the script at /var/www/vhosts/domainA/httpdocs/tmp/t.php and /var/www/vhosts/domainB/httpdocs/tmp/t.php respectively. I checked the permissions on tmp, httpdocs, and domain[AB] and they are the same for each path. The only thing that differs is the user.

    Read the article

  • How to know if all the thread pool's threads are done with their tasks?

    - by mcxiand
    I have an application that recurses all folders in a given directory and looks for PDFs. If a PDF file is found, the application counts its pages using iTextSharp. I did this by using a thread to recursively scan all the folders for PDFs; when a PDF is found, it is queued to the thread pool. The code looks like this:

      // spawn a thread to handle the processing of pdf on each folder.
      var th = new Thread(() =>
      {
          pdfDirectories = Directory.GetDirectories(pdfPath);
          processDir(pdfDirectories);
      });
      th.Start();

      private void processDir(string[] dirs)
      {
          foreach (var dir in dirs)
          {
              pdfFiles = Directory.GetFiles(dir, "*.pdf");
              processFiles(pdfFiles);
              string[] newdir = Directory.GetDirectories(dir);
              processDir(newdir);
          }
      }

      private void processFiles(string[] files)
      {
          foreach (var pdf in files)
          {
              ThreadPoolHelper.QueueUserWorkItem(
                  new { path = pdf },
                  (data) => { processPDF(data.path); }
              );
          }
      }

    My problem is: how do I know that the thread pool's threads have finished processing all the queued items, so I can tell the user that the application is done with its intended task?
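
    One common pattern for this (a sketch, assuming the post's processPDF method; ThreadPoolHelper is replaced here by ThreadPool directly) is to keep a counter of outstanding work items and signal an event when the last one finishes:

      // Sketch: count queued items with Interlocked and wait on an event.
      int pending = 1;                          // 1 represents the scanning thread itself
      var allDone = new ManualResetEvent(false);

      Action<string> queuePdf = path =>
      {
          Interlocked.Increment(ref pending);
          ThreadPool.QueueUserWorkItem(_ =>
          {
              try { processPDF(path); }
              finally { if (Interlocked.Decrement(ref pending) == 0) allDone.Set(); }
          });
      };

      // ... call queuePdf(pdf) from processFiles, then, once processDir has returned:
      if (Interlocked.Decrement(ref pending) == 0) allDone.Set();
      allDone.WaitOne();                        // every queued PDF has been processed at this point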

    Read the article

  • Maven jaxb generate plugin to read xsd files from multiple directories

    - by ziggy
    I have XSD files in the following directories:

      src/main/resources/xsd
      src/main/resources/schema/common
      src/main/resources/schema/soap

    How can I instruct the Maven JAXB plugin to generate JAXB classes using all the schema files in the directories above? I can get it to generate the class files if I specify one of the folders, but I don't know how to include all three. Here is how I generate the files for one folder:

      <plugin>
        <groupId>org.jvnet.jaxb2.maven2</groupId>
        <artifactId>maven-jaxb2-plugin</artifactId>
        <executions>
          <execution>
            <goals>
              <goal>generate</goal>
            </goals>
          </execution>
        </executions>
        <configuration>
          <schemaDirectory>src/main/resources/xsd</schemaDirectory>
        </configuration>
      </plugin>

    I tried adding multiple entries in the element, but it just ignores all of them if I do that. Thanks

    Read the article

  • Permissions on Mac OSX

    - by Linda
    I think that this is a permissions issue but I am not sure and I am not sure how to repair the problem. I have a new MacBook. I have 2 external drives that were previously used on another MacBook. I have a lot of folders and XCode projects on the external drives. When I try to work on the projects, there is a message similar to this: "This file is not writable. You may not be able to save your changes, but you will be able to Save a Copy somewhere else. Do you want to edit this file anyway?" If I make changes and try to close the project I get this error: "The project and user files project.pbxproj and macbook.pbxuser for project “thirdtry.xcodeproj” are not writeable and cannot be saved. Your changes will be lost if you close the project. You may need to SCM edit these files to gain writability." I have tried just to rename the folder but that permission is not allowed either unless I individually change permissions for every file in an XCode project. As you can imagine, this could be time consuming for tons of files and projects. I can copy the project into internal memory and can run it then after renaming the folder that contains all of the files. This defeats the purpose of having all of the projects on an external drive. Also, in XCode, there is no "Build and Run" there is only "Build and Debug" now. I don't know if this is related or not. Suggestions for how to repair all permissions to all files and folders on my external drives? What about the "Build and Debug" and no "Build and Run" choice? Thanks, Linda

    Read the article

  • Is saving to database just to get an ID a bad hack?

    - by Narsil
    I hope the title is not too confusing. I am trying to make folders with linq-to-sql objects' IDs. Actually I have to create folders before I should save them. I will use them to keep user uploaded files. As you can see I have to create the folder with the FileID before I can save it there. So I just save a record which will be edited or maybe deleted File newFile = new File(); ...//add some values to fields so they don't throw rule violations db.AddFile(newFile); db.Save(); System.IO.Directory.CreateDirectory("..Uploads/"+newFile.FileId.ToString()); After that I will have to edit some fields and save again. Of course user might stop upload and I would have to delete it. I know I can write a stored procedure to get the next available FileID but some other upload happening at the same time would get the same number. So they would write in same directory which is a thing I don't want. Should I go on with this, would there be some problems? Can you think of a better way?
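
    One alternative that avoids the placeholder row entirely (a sketch, not from the post: StorageKey and uploadRoot are assumed names) is to key the folder on a GUID generated in code rather than on the database identity, and only save the record once the upload has succeeded:

      // Sketch: reserve a unique folder name without touching the database.
      File newFile = new File();
      newFile.StorageKey = Guid.NewGuid();      // assumed extra column used as the folder name

      string folder = System.IO.Path.Combine(uploadRoot, newFile.StorageKey.ToString("N"));
      System.IO.Directory.CreateDirectory(folder);

      // ... stream the uploaded file into "folder"; if the user cancels, just delete the folder.
      // Only when the upload completes does the record get saved:
      db.AddFile(newFile);
      db.Save();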

    Read the article

  • Cocoa Services Programming - How to identify whether the selected item is a folder or file with NSFi

    - by rockybalboa
    Hi, I have some queries regarding Services menu validation. I would like to enable different services provided by my app based on whether a file or a folder is selected in the Finder. I have set NSFilenamesPboardType as the send type for the services. I have gone through the - (id)validRequestorForSendType:(NSString *)sendType returnType:(NSString *)returnType method, but my issue is that the validation there seems to be done based only on the send type and return type. In my case, the pasteboard type for a selected file and a selected folder is the same, so I cannot determine whether the selected item in the Finder is a file or a folder during the validation process (this happens before the actual service gets invoked, i.e. when the Services menu is being shown to the user). So my question is: is there any way to get some information about the selected item in the Finder and validate the different service menu items offered by my application based on that information, rather than on the basic validation of the send and return types? I have not been able to find a way to do so, but the "Folder Actions" service in Snow Leopard gets enabled only for folders, so it can be done. I ran /System/Library/CoreServices/pbs -dump_pboard and it is also using NSFilenamePBoardType, yet it manages to activate only for folders. Thanks in advance for any help.

    Read the article

  • How to Create a Folder in the Current Document Library if it's not already present?

    - by Rosh Malai
    All I want to do is to create a folder "MetaFolder" inside a document library. User can be on any document library and I would like to create this folder after item is added (so on itemAdded event handler). I do NOT want workflow so please dont suggest workflow. This code works but I have to hardcode the url but need to get url from current url. also need to verify the folder uHippo does not exists in the current doc library... public override void ItemAdded(SPItemEventProperties properties) { base.ItemAdded(properties); using (SPSite currentSite = new SPSite(properties.WebUrl)) using (SPWeb currentWeb = currentSite.OpenWeb()) { // This code works and creates Folder in the "My TEST Doc library" //SPList docLib = currentWeb.Lists["My TEST Doc Library"]; //SPListItem folder = docLib.Folders.Add(docLib.RootFolder.ServerRelativeUrl, SPFileSystemObjectType.Folder, "My folder"); //folder.Update(); string doclibname = "Not a doclib"; //SPList doclibList = currentWeb.GetList(HttpContext.Current.Request.RawUrl); // NOT WORKING. Tried properties.weburl SPList doclibList = currentWeb.GetListFromUrl("https://mycompanyportal/sites/testsitecol/testwebsite/My%20TEST%20Doc%20Library/Forms/AllItems.aspx"); if (null != doclibList) { doclibname = doclibList.Title; } // this section also not working. // getting Object reference not set to an instance of an object or something like that. //if (currentWeb.GetFolder("uHippo").Exists == false) //{ SPListItem folder = doclibList.Folders.Add(doclibList.RootFolder.ServerRelativeUrl, SPFileSystemObjectType.Folder, "uHippo"); folder.Update(); //} } }
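
    A sketch of one way to do this without hardcoding the URL (assuming the standard SharePoint server object model; the folder name "uHippo" is taken from the post): SPItemEventProperties exposes the list the item was added to through ListId, and SPFolder.Exists covers the existence check:

      public override void ItemAdded(SPItemEventProperties properties)
      {
          base.ItemAdded(properties);
          using (SPWeb web = properties.OpenWeb())
          {
              // The library the item was added to, with no hardcoded URL.
              SPList docLib = web.Lists[properties.ListId];

              SPFolder existing = web.GetFolder(docLib.RootFolder.Url + "/uHippo");
              if (!existing.Exists)
              {
                  SPListItem folder = docLib.Folders.Add(
                      docLib.RootFolder.ServerRelativeUrl, SPFileSystemObjectType.Folder, "uHippo");
                  folder.Update();
              }
          }
      }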

    Read the article

  • File.Move does not inherit permissions from target directory?

    - by Joseph Kingry
    In case something goes wrong in creating a file, I've been writing to a temporary file and then moving to the destination. Something like: var destination = @"C:\foo\bar.txt"; var tempFile = Path.GetTempFileName(); using (var stream = File.OpenWrite(tempFile)) { // write to file here here } string backupFile = null; try { var dir = Path.GetDirectoryName(destination); if (!Directory.Exists(dir)) { Directory.CreateDirectory(dir); Util.SetPermissions(dir); } if (File.Exists(destination)) { backupFile = Path.Combine(Path.GetTempPath(), new Guid().ToString()); File.Move(destination, backupFile); } File.Move(tempFile, destination); if (backupFile != null) { File.Delete(backupFile); } } catch(IOException) { if(backupFile != null && !File.Exists(destination) && File.Exists(backupFile)) { File.Move(backupFile, destination); } } The problem is that the new "bar.txt" in this case does not inherit permissions from the "C:\foo" directory. Yet if I create a file via explorer/notepad etc directly in the "C:\foo" there's no issues, so I believe the permissions are correctly set on "C:\foo". Update Found Inherited permissions are not automatically updated when you move folders, maybe it applies to folders as well. Now looking for a way to force an update of file permissions. Is there a better way overall of doing this?
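
    One fix that is often suggested (a sketch under the assumption that the .NET Framework's System.Security.AccessControl API is acceptable here) is to reset the moved file's ACL so it contains only rules inherited from C:\foo, since a move within the same volume carries the temp directory's ACL along with the file:

      using System.Security.AccessControl;
      using System.Security.Principal;

      // Sketch: after the move, drop the explicit ACEs copied from the temp folder
      // and re-enable inheritance from the destination directory.
      File.Move(tempFile, destination);

      FileSecurity acl = File.GetAccessControl(destination);
      acl.SetAccessRuleProtection(false, false);               // inherit from C:\foo again
      foreach (FileSystemAccessRule rule in acl.GetAccessRules(true, false, typeof(NTAccount)))
          acl.RemoveAccessRule(rule);                           // remove the non-inherited temp-dir rules
      File.SetAccessControl(destination, acl);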

    Read the article

  • Can I version dotfiles within a project without merging their history into the main line?

    - by istrasci
    I'm sure this title is fairly obscure. I'm wondering if there is some way in git to tell it that you want a certain file to use different versions of a file when moving between branches, but to overall be .gitignored from the repository. Here's my scenario: I've got a Flash Builder project (for a Flex app) that I control with git. Flex apps in Flash Builder projects create three files: .actionScriptProperties, .flexProperties, and .project. These files contain lots of local file system references (source folders, output folders, etc.), so naturally we .gitignore them from our repo. Today, I wanted to use a new library in my project, so I made a separate git branch called lib, removed the old version of the library and put in the new one. Unfortunately, this Flex library information gets stored in one of those three dot files (not sure which offhand). So when I had to switch back to the first branch (master) earlier, I was getting compile errors because master was now linked to the new library (which basically negated why I made lib in the first place). So I'm wondering if there's any way for me to continue to .gitignore these files (so my other developers don't get them), but tell git that I want it to use some kind of local "branch version" so I can locally use different versions of the files for different branches.

    Read the article

  • Copy all files and folders from one directory to another directory in PHP

    - by santosh
    Hi, I have a directory called "mysourcedir" that has some files and folders, and I want to copy all of its content to another directory, "destinationfolder", on a Linux server using PHP.

      function full_copy( $source, $target ) {
        if ( is_dir( $source ) ) {
          @mkdir( $target );
          $d = dir( $source );
          while ( FALSE !== ( $entry = $d->read() ) ) {
            if ( $entry == '.' || $entry == '..' ) {
              continue;
            }
            $Entry = $source . '/' . $entry;
            if ( is_dir( $Entry ) ) {
              $this->full_copy( $Entry, $target . '/' . $entry );
              continue;
            }
            copy( $Entry, $target . '/' . $entry );
          }
          $d->close();
        } else {
          copy( $source, $target );
        }
      }

    I am trying this code, but there is a problem: it creates the directory "mysourcedir" at the destination location. I am expecting it to just copy all files and folders into the destination. Please suggest.

    Read the article
