Search Results

Search found 28760 results on 1151 pages for 'search folder'.

Page 264/1151 | < Previous Page | 260 261 262 263 264 265 266 267 268 269 270 271  | Next Page >

  • Using bash to copy a file to specific folders

    - by Franko
    I have a folder with a fair number of subfolders. Some of the subfolders contain a folder.jpg picture. I want to find those folders and copy the picture to every other subfolder that has the same artist and album information, then continue on to the next album, and so on. All the folders are named "artist - year - album - [encoding information]". I have made a really simple one-liner that finds the folders containing the file, but that is where I am stuck: ls -F | grep / | while read folders; do find "$folders" -name folder.jpg; done. Does anyone have a good tip, an idea for solving this, or a pointer on how to proceed? Edit: First of all, I'm really new to this (as you can probably tell), so please be patient. Let me break it down further. My folder structure looks like this: artist1 - year - album - [flac], artist1 - year - album - [mp3], artist1 - year - album - [AAC], artist2 - year - album - [flac], etc. I want to loop over each set of folders that share the same artist and album information and look for a folder.jpg file. When I find one, I want to copy it to all of the other folders in the same set. For example, if I find folder.jpg in artist1 - year - album - [flac], I want it copied to artist1 - year - album - [mp3] and artist1 - year - album - [AAC], but not to artist2 - year - album - [flac]. Then the loop continues until every set has been processed. I hope that makes it a bit easier to understand what I am trying to do :)
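
    A minimal bash sketch of one possible approach, assuming every top-level directory literally follows the "artist - year - album - [encoding]" naming pattern and that folder.jpg sits directly inside the folder that has the artwork:

      #!/bin/bash
      # Sketch only: group sibling folders by stripping the trailing "[encoding]" part.
      shopt -s nullglob
      for src in */folder.jpg; do
          dir=${src%/folder.jpg}
          key=${dir% - \[*\]}                 # e.g. "artist1 - year - album"
          for dest in "$key - ["*"]"; do
              [ "$dest" = "$dir" ] && continue
              cp -n "$src" "$dest/"           # -n: keep any existing folder.jpg
          done
      done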

    Read the article

  • I have a problem sharing a folder programmatically using C#

    - by moon
    Here is my code. It shares the folder, but the share does not work correctly: when I try to access it, it shows "access denied". Help required.

      private static void ShareFolder(string FolderPath, string ShareName, string Description)
      {
          try
          {
              // Create a ManagementClass object
              ManagementClass managementClass = new ManagementClass("Win32_Share");
              // Create ManagementBaseObjects for in and out parameters
              ManagementBaseObject inParams = managementClass.GetMethodParameters("Create");
              ManagementBaseObject outParams;
              // Set the input parameters
              inParams["Description"] = Description;
              inParams["Name"] = ShareName;
              inParams["Path"] = FolderPath;
              inParams["Type"] = 0x0; // Disk drive
              // Other types:
              //   DISK_DRIVE        = 0x0;
              //   PRINT_QUEUE       = 0x1;
              //   DEVICE            = 0x2;
              //   IPC               = 0x3;
              //   DISK_DRIVE_ADMIN  = 0x80000000;
              //   PRINT_QUEUE_ADMIN = 0x80000001;
              //   DEVICE_ADMIN      = 0x80000002;
              //   IPC_ADMIN         = 0x80000003;
              // inParams["MaximumAllowed"] = maxConnectionsNum;
              // Invoke the method on the ManagementClass object
              outParams = managementClass.InvokeMethod("Create", inParams, null);
              // Check to see if the method invocation was successful
              if ((uint)(outParams.Properties["ReturnValue"].Value) != 0)
              {
                  throw new Exception("Unable to share directory. The directory is already shared or does not exist.");
              }
          }
          catch (Exception ex)
          {
              MessageBox.Show(ex.Message, "error!");
          }
      }
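
    The "access denied" is often a share-level permission issue: Win32_Share.Create is being called without an Access security descriptor, so the share may default to restricted access. A hedged sketch of one way to grant Everyone full control at the share level (the WMI class and property names are standard, but the exact flag values are assumptions to verify; NTFS permissions on the folder still apply on top):

      // Build a security descriptor and pass it in the "Access" parameter
      // before calling managementClass.InvokeMethod("Create", inParams, null).
      ManagementObject trustee = new ManagementClass("Win32_Trustee").CreateInstance();
      trustee["Name"] = "Everyone";

      ManagementObject ace = new ManagementClass("Win32_ACE").CreateInstance();
      ace["AccessMask"] = 0x1F01FF;   // full control
      ace["AceFlags"] = 3;            // container + object inherit
      ace["AceType"] = 0;             // access allowed
      ace["Trustee"] = trustee;

      ManagementObject sd = new ManagementClass("Win32_SecurityDescriptor").CreateInstance();
      sd["ControlFlags"] = 4;         // SE_DACL_PRESENT
      sd["DACL"] = new ManagementBaseObject[] { ace };

      inParams["Access"] = sd;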

    Read the article

  • VirtualBox limits the size of .js files that can be included from the guest additions shared folder?

    - by c69
    This question might belong on Super User, but I'll try asking it here anyway, because I believe some web developers may have encountered this weird behavior. When testing a site for IE8/WinXP compatibility on VirtualBox, I ran into a weird "$ is undefined" issue, caused by jQuery (and jQuery UI) not being included when referenced by a relative path, which resolves to a file:/// URL. Seemingly this was because their size was too big (above 200KB). Simply replacing the links to those two big files with http:// ones solved the issue for me. But here is the question: why did this happen? Is it a misconfiguration? A bug? A known design decision? Details: VirtualBox 4.1.8; host OS: Win7 64-bit; guest OS: XP SP3 32-bit; guest additions installed; the page was launched from a VirtualBox shared folder; the bug manifested itself in all browsers (even in Opera, which ignores IE security settings, AFAIK); the IE configuration is default; the script was included like this: <script type="text/javascript" src="js/libs/jquery/jquery-1.7.2.js">. The exact size limit was not determined.

    Read the article

  • How can I intelligently group rows of integers for a faceted search?

    - by Alastair
    I'm not even quite sure what terms I should be using for what I want, so any advice on what I'm even asking for would be very welcome. Basically, my web site lists user-generated accommodations. Each has a rent price, which users will be able to query in our new faceted search box. Users search by city, and within each city I'd like to present a different rent grouping. That is to say that in City #1, if we have listings ranging from $200 - $1000, I'd like to present checkboxes for: less than $300; $301 - $500; $501 - $700; more than $700. However, if City #2 has values that range from $500 - $1500, I want the ranges above to change accordingly. So, if I say that I want 5 or 6 range options in each city, I think I have two options: (1) Take the min and max values and just split the difference. I don't like this idea because one listing with a rent of $10,000 will throw the whole scale off. (2) Intelligently calculate the ranges using means, medians etc. Number 2 is what I need help with. I'm a web developer that gets logic, but was never strong on math and statistics at school. Can anyone point me towards a guide that'll help me figure this out?
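
    One hedged sketch of option 2 (PHP is only an assumption about the stack): cut the city's rent list at percentiles rather than between min and max, so a single outlier cannot stretch the scale, then round the boundaries to friendly values:

      <?php
      // Sketch: derive ~5 rent buckets for one city from its listings.
      function rentBuckets(array $rents, $buckets = 5) {
          sort($rents);
          $n = count($rents);
          $edges = array();
          for ($i = 1; $i < $buckets; $i++) {
              $idx = (int) floor($i * $n / $buckets);   // percentile cut point
              $edges[] = $rents[min($idx, $n - 1)];
          }
          // Round each edge to the nearest $50 so the checkboxes look clean.
          return array_map(function ($e) { return (int) (round($e / 50) * 50); }, $edges);
      }

      // Example: boundaries for checkboxes "< e1", "e1 - e2", ..., "> e4"
      $edges = rentBuckets(array(200, 250, 300, 450, 500, 650, 700, 900, 1000));
      ?>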

    Read the article

  • I am unable to upload a file to my localhost folder in PHP

    - by Nauman khan
    Hi, I have the following code:

      <form enctype="multipart/form-data" action="upload.php" method="POST">
          Please choose a file: <input name="uploaded" type="file" /><br />
          <input type="submit" value="Upload" />
      </form>
      <?php
      $target = "upload/";
      $target = $target . basename($_FILES['uploaded']['name']);
      $ok = 1;
      if (move_uploaded_file($_FILES['uploaded']['tmp_name'], $target)) {
          echo "The file " . basename($_FILES['uploadedfile']['name']) . " has been uploaded";
      } else {
          echo "Sorry, there was a problem uploading your file.";
      }
      ?>

    My page is at http://localhost/nausal/upload.php. I am now getting the following errors, even though I have created a folder in the site named upload:

      Notice: Undefined index: uploaded in C:\wamp\www\Nausal\upload.php on line 15
      Notice: Undefined index: uploaded in C:\wamp\www\Nausal\upload.php on line 17
      Sorry, there was a problem uploading your file.

    Please help me, I am very new to PHP. :(
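
    The "Undefined index: uploaded" notice usually means the upload never reached PHP at all, for example because the form was not submitted as multipart/form-data or the file exceeded post_max_size. A hedged, more defensive variant of the handler that checks for this before touching $_FILES:

      <?php
      // Sketch only: verify the upload actually arrived and report PHP's own error code.
      $target = 'upload/';
      if (!isset($_FILES['uploaded'])) {
          // Typical causes: wrong form enctype, or the whole POST exceeded post_max_size.
          die('No file received - check the form and post_max_size/upload_max_filesize.');
      }
      if ($_FILES['uploaded']['error'] !== UPLOAD_ERR_OK) {
          die('Upload failed with error code ' . $_FILES['uploaded']['error']);
      }
      $target .= basename($_FILES['uploaded']['name']);
      if (move_uploaded_file($_FILES['uploaded']['tmp_name'], $target)) {
          echo 'The file ' . basename($_FILES['uploaded']['name']) . ' has been uploaded';
      } else {
          echo 'Sorry, there was a problem uploading your file.';
      }
      ?>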

    Read the article

  • Search dir using wildcard string, return filename, and loop.

    - by Charlie Murphy
    Hello all, I'm having some problems and looking for some help. I'm trying to create a photo gallery in JavaScript that will be able to 'update' itself automatically. I need to be able to search a directory and grab the files with a specific prefix, then output the following HTML code: <li><a href="images/resize_FILENAME.ext"><img src="images/thumb_FILENAME.ext"></a></li>. The 'resize_' and 'thumb_' files use a timestamp as their identifier, so they have the same ending, just a different prefix. So, for example, if I find an image in the directory with the 'resize_' prefix, I need to insert it into the a tag, then swap the 'resize_' prefix for the 'thumb_' prefix for the img tag. I then need to be able to do that for each image in the directory. Any help would be greatly appreciated. Oh, I should add: I'm assuming PHP would be easiest for this, but an alternative would work too. I'm also using jQuery if JavaScript would be better.
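
    A short sketch in PHP (the asker's suggested language); the images/ path and the assumption that thumb and resize files share the same suffix are things to adapt:

      <?php
      // Sketch: list every resized image and emit the matching thumbnail markup.
      foreach (glob('images/resize_*') as $resized) {
          $name  = basename($resized);                        // e.g. resize_1234.jpg
          $thumb = 'thumb_' . substr($name, strlen('resize_'));
          echo '<li><a href="images/' . htmlspecialchars($name) . '">'
             . '<img src="images/' . htmlspecialchars($thumb) . '"></a></li>' . "\n";
      }
      ?>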

    Read the article

  • How to do an additional search on an archive in Rails if a record is not found, by extending the model?

    - by Nick Gorbikoff
    Hello, I was wondering if somebody knows an elegant solution to the following: Suppose I have a table that holds orders, with a bunch of data. I'm at 1M records, and searches are beginning to take time. So I want to speed things up by archiving data that is more than 3 years old - saving it into a table called orders-archive and then purging it from the orders table. If we need to research something, or a customer wants to pull older information, they still can, but 99% of the lookups are done on orders no older than a year and a half, so there is no reason to keep looking through the older data all the time. These move and purge operations can then be run from cron on a weekly basis. I already did some tests and I know that I will cut my search times by about a factor of 4. So far so good, right? However, I was thinking about how to implement the archival lookups, and the only reasonable thing I can think of is some sort of if-else: if not found in orders, do a search in orders-archive. However, I have about 20 tables that I want to archive, and god knows how many searches/finds are done throughout the code that I don't want to modify. So I was wondering if there is an elegant, Rails-way solution to this problem, by extending a model somehow? Has anyone dealt with a similar case before? Thank you.
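
    A hedged Rails sketch of the "extend the model" idea (the Order/OrderArchive model names and the naming convention are assumptions; real code would also need to cover finds by conditions, not just by id):

      # Sketch: a shared module that falls back to the archive table when a lookup misses.
      module ArchivedLookup
        def find_with_archive(id)
          find(id)
        rescue ActiveRecord::RecordNotFound
          archive_class.find(id)
        end

        def archive_class
          # Order -> OrderArchive, Invoice -> InvoiceArchive, etc.
          "#{name}Archive".constantize
        end
      end

      class Order < ActiveRecord::Base
        extend ArchivedLookup
      end

      # Usage: Order.find_with_archive(42) looks in orders, then orders_archive.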

    Read the article

  • Where should test classes be stored in the project?

    - by limc
    I build all my web projects at work using RAD/Eclipse, and I'm interested to know where you normally store your test *.class files. All my web projects have two source folders: "src" for source and "test" for test cases. The generated *.class files for both source folders are currently placed under the WebContent/WEB-INF/classes folder. I want to separate the test *.class files from the src *.class files for two reasons: (1) There's no point storing them in WebContent/WEB-INF/classes and deploying them to production. (2) Sonar and some other static code analysis tools don't produce an accurate analysis, because they take account of my crappy yet correct test-case code. So, right now, I have the following output folders: the "src" source folder compiles to the WebContent/WEB-INF/classes folder, and the "test" source folder compiles to the target/test-classes folder. Now I'm getting this warning from RAD: "Broken single-root rule: A project may not contain more than one output folder." So it seems Eclipse-based IDEs prefer one project = one output folder, yet the "build path" dialog provides an option to set up a custom output folder for my additional source folder, and then it barks at me. I know I can just disable this warning myself, but I want to know how you handle this. Thanks.

    Read the article

  • In SharePoint, how do I update the name of a folder in a document library using the web service API?

    - by Jess
    I'm using the UpdateListItems method of the Lists web service. I can update an item in just about any kind of list, and folders in non-document-library lists, but I can't seem to update the name of a folder in a document library. I must use the web services API, as SharePoint is not local. If my update batch looks like this:

      <Batch OnError="Continue" PreCalc="TRUE" ListVersion="0" ViewName="">
          <Method ID="1" Cmd="Update">
              <Field Name="ID">2</Field>
              <Field Name="Title">MyUpdatedFolderName</Field>
              <Field Name="FileLeafRef">MyUpdatedFolderName</Field>
          </Method>
      </Batch>

    I get no exception, but the name is unchanged. If my update batch looks like this:

      <Batch OnError="Continue" PreCalc="TRUE" ListVersion="0" ViewName="">
          <Method ID="1" Cmd="Update">
              <Field Name="ID">2</Field>
              <Field Name="Title">MyUpdatedFolderName</Field>
              <Field Name="BaseName">MyUpdatedFolderName</Field>
          </Method>
      </Batch>

    I get an error result that the list item could not be found. I know the list item is there. Anyone have any ideas?

    Read the article

  • Very urgent: how to start a new session if a user opens a new tab in IE or Mozilla on WebSphere

    - by ha22109
    Hi, I have a "user search" portlet on the home page of an application running on WebSphere Portal Server. It displays the matching user records for the search criteria filled in on the search form. I have a requirement for a "back to search input" link on the results page which, on click, should show the previously filled form on the input JSP. The issue I am facing is that if I open the application in two different tabs of the same IE browser, enter some search criteria in one tab and submit, then search for some other input from the other tab (in the same browser), and then go back to the first tab and click the "back to search input" link, it shows me the input I entered in the second tab instead of the first. I am setting and getting the form bean through the portlet session, but in two different tabs of the same IE window it will be the same user session (and probably the same portlet session). Please suggest a solution for this. One thing to note is that I can access this "user search" application without logging in, so it must be using the default portlet session in that case. What will happen once I log in and then search - will it overwrite the portlet session and HTTP session, or how does that work?

    Read the article

  • Read all subdirectories within a certain folder to display a random image.

    - by Andy
    I have this code I have been using, but I need a conditional so that, on the homepage, it reads all the subdirectories of /bg to select an image, as opposed to the specific folder used when on a subpage. Here's my code so far, which works perfectly for displaying specific images on all subpages:

      // This would tell us it's on the homepage, if it helps: $this->level() == 0
      // This is the code so far
      $path = '/home/sites/mydomain.co.uk/public_html/public/images/bg/' . $this->slug;
      $homepagefile = URL_PUBLIC . 'public/images/bg/' . $this->slug . '/main.jpg';
      $bgimagearray = array();
      $iterator = new DirectoryIterator($path);
      foreach ($iterator as $fileinfo) {
          if ($fileinfo->isFile() && !preg_match('/\.jpg$/', $fileinfo->getFilename())) {
              $bgimagearray[] = "'" . $fileinfo->getFilename() . "'";
          }
      }
      $bgimage = array_rand($bgimagearray);
      ?>
      <div id="bg">
          <div>
              <table cellspacing="0" cellpadding="0">
                  <tr>
                      <td><img src="<?php echo $file . trim($bgimagearray[$bgimage], "'"); ?>" alt=""/></td>
                  </tr>
              </table>
          </div>
      </div>

    Any help would be appreciated. I'm sure it's not rocket science, but I've tried a few ways and can't get my head around it. Thanks in advance.
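
    A hedged sketch of the homepage branch, gathering candidates from every subdirectory under /bg with a recursive iterator (the paths mirror the question and may need adjusting):

      <?php
      // Sketch: on the homepage, pick a random .jpg from any subdirectory of images/bg.
      $base = '/home/sites/mydomain.co.uk/public_html/public/images/bg';
      $candidates = array();
      $iterator = new RecursiveIteratorIterator(new RecursiveDirectoryIterator($base));
      foreach ($iterator as $fileinfo) {
          if ($fileinfo->isFile() && preg_match('/\.jpg$/i', $fileinfo->getFilename())) {
              $candidates[] = $fileinfo->getPathname();
          }
      }
      if ($candidates) {
          $bgimage = $candidates[array_rand($candidates)];
      }
      ?>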

    Read the article

  • EXE stops working if containing folder is renamed. MSVCP90.dll

    - by John
    This popup comes up as soon as the app is started: The program can't start because MSVCP90.dll is missing from your computer. Before anyone says "install the VC++ runtimes", wait! If I rename the folder containing my .EXE then the app runs. If I rename it back, it breaks. The app has been running for weeks without any changes to my system/VS installation (2008 SP1), we suddenly spotted this error a few days ago. Lost as to why the name of the dir is causing issues... again this has not changed in months and all our resource paths are relative anyway, e.g "../someOtherDir/...." It doesn't just do this on my PC, we have the /bin dir (the one containing EXE) in SVN and suddenly everyone started seeing the same issue, even though the binaries themselves seem just fine. Is it possible some additional data got put into SVN and that's the cause? Since it's not just one PC, there must be something either in SVN or the EXE itself... Note this popup comes before our code even gets to run.

    Read the article

  • Fixing up Visual Studio's gitignore, using IFix

    - by terje
    Originally posted on: http://geekswithblogs.net/terje/archive/2014/06/13/fixing-up-visual-studiorsquos-gitignore--using-ifix.aspxDownload tool Is there anything wrong with the built-in Visual Studio gitignore ???? Yes, there is !  First, some background: When you set up a git repo, it should be small and not contain anything not really needed.  One thing you should not have in your git repo is binary files. These binary files may come from two sources, one is the output files, in the bin and obj folders.  If you have a  gitignore file present, which you should always have (!!), these folders are excluded by the standard included file (the one included when you choose Team Explorer/Settings/GitIgnore – Add.) The other source are the packages folder coming from your NuGet setup.  You do use NuGet, right ?  Of course you do !  But, that gitignore file doesn’t have any exclude clause for those folders.  You have to add that manually.  (It will very probably be included in some upcoming update or release).  This is one thing that is missing from the built-in gitignore. To add those few lines is a no-brainer, you just include this: # NuGet Packages packages/* *.nupkg # Enable "build/" folder in the NuGet Packages folder since # NuGet packages use it for MSBuild targets. # This line needs to be after the ignore of the build folder # (and the packages folder if the line above has been uncommented) !packages/build/ Now, if you are like me, and you probably are, you add git repo’s faster than you can code, and you end up with a bunch of repo’s, and then start to wonder: Did I fix up those gitignore files, or did I forget it? The next thing you learn, for example by reading this blog post, is that the “standard” latest Visual Studio gitignore file exist at https://github.com/github/gitignore, and you locate it under the file name VisualStudio.gitignore.  Here you will find all the new stuff, for example, the exclusion of the roslyn ide folders was commited on May 24th.  So, you think, all is well, Visual Studio will use this file …..     I am very sorry, it won’t. Visual Studio comes with a gitignore file that is baked into the release, and that is by this time “very old”.  The one at github is the latest.  The included gitignore miss the exclusion of the nuget packages folder, it also miss a lot of new stuff, like the Roslyn stuff. So, how do you fix this ?  … note .. while we wait for the next version… You can manually update it for every single repo you create, which works, but it does get boring after a few times, doesn’t it ? IFix Enter IFix ,  install it from here. IFix is a command line utility (and the installer adds it to the system path, you might need to reboot), and one of the commands is gitignore If you run it from a directory, it will check and optionally fix all gitignores in all git repo’s in that folder or below.  So, start up by running it from your C:/<user>/source/repos folder. To run it in check mode – which will not change anything, just do a check: IFix  gitignore --check What it will do is to check if the gitignore file is present, and if it is, check if the packages folder has been excluded.  If you want to see those that are ok, add the --verbose command too.  The result may look like this: Fixing missing packages Let us fix a single repo by adding the missing packages structure,  using IFix --fix We first check, then fix, then check again to verify that the gitignore is correct, and that the “packages/” part has been added. 
If we open up the .gitignore, we see that the block shown below has been added to the end of the .gitignore file.   Comparing and fixing with latest standard Visual Studio gitignore (from github) Now, this tells you if you miss the nuget packages folder, but what about the latest gitignore from github ? You can check for this too, just add the option –merge (why this is named so will be clear later down) So, IFix gitignore --check –merge The result may come out like this  (sorry no colors, not got that far yet here): As you can see, one repo has the latest gitignore (test1), the others are missing either 57 or 150 lines.  IFix has three ways to fix this: --add --merge --replace The options work as follows: Add:  Used to add standard gitignore in the cases where a .gitignore file is missing, and only that, that means it won’t touch other existing gitignores. Merge: Used to merge in the missing lines from the standard into the gitignore file.  If gitignore file is missing, the whole standard will be added. Replace: Used to force a complete replacement of the existing gitignore with the standard one. The Add and Replace options can be used without Fix, which means they will actually do the action. If you combine with --check it will otherwise not touch any files, just do a verification.  So a Merge Check will  tell you if there is any difference between the local gitignore and the standard gitignore, a Compare in effect. When you do a Fix Merge it will combine the local gitignore with the standard, and add what is missing to the end of the local gitignore. It may mean some things may be doubled up if they are spelled a bit differently.  You might also see some extra comments added, but they do no harm. Init new repo with standard gitignore One cool thing is that with a new repo, or a repo that is missing its gitignore, you can grab the latest standard just by using either the Add or the Replace command, both will in effect do the same in this case. So, IFix gitignore --add will add it in, as in the complete example below, where we set up a new git repo and add in the latest standard gitignore: Notes The project is open sourced at github, and you can also report issues there.

    Read the article

  • How big can my SharePoint 2010 installation be?

    - by Sahil Malik
    Ad:: SharePoint 2007 Training in .NET 3.5 technologies (more information). 3 years ago, I had published “How big can my SharePoint 2007 installation be?” Well, SharePoint 2010 has significant under the covers improvements. So, how big can your SharePoint 2010 installation be? There are three kinds of limits you should know about Hard limits that cannot be exceeded by design. Configurable that are, well configurable – but the default values are set for a pretty good reason, so if you need to tweak, plan and understand before you tweak. Soft limits, you can exceed them, but it is not recommended that you do. Before you read any of the limits, read these two important disclaimers - 1. The limit depends on what you’re doing. So, don’t take the below as gospel, the reality depends on your situation. 2. There are many additional considerations in planning your SharePoint solution scalability and performance, besides just the below. So with those in mind, here goes.   Hard Limits - Zones per web app 5 RBS NAS performance Time to first byte of any response from NAS must be less than 20 milliseconds List row size 8000 bytes driven by how SP stores list items internally Max file size 2GB (default is 50MB, configurable). RBS does not increase this limit. Search metadata properties 10,000 per item crawled (pretty damn high, you’ll never need to worry about it). Max # of concurrent in-memory enterprise content types 5000 per web server, per tenant Max # of external system connections 500 per web server PerformancePoint services using Excel services as a datasource No single query can fetch more than 1 million excel cells Office Web Apps Renders One doc per second, per CPU core, per Application server, limited to a maximum of 8 cores.   Configurable Limits - Row Size Limit 6, configurable via SPWebApplication.MaxListItemRowStorage property List view lookup 8 join operations per query Max number of list items that a single operation can process at one time in normal hours 5000 Configurable via SPWebApplication.MaxItemsPerThrottledOperation   Also you get a warning at 3000, which is configurable via SPWebApplication.MaxItemsPerThrottledOperationWarningLevel   In addition, throttle overrides can be requested, throttle overrides can be disabled, and time windows can be set when throttle is disabled. Max number of list items for administrators that a single operation can process at one time in normal hours 20000 Configurable via SPWebApplication.MaxItemsPerThrottledOperationOverride Enumerating subsites 2000 Word and Powerpoint co-authoring simultaneous editors 10 (Hard limit is 99). # of webparts on a page 25 Search Crawl DBs per search service app 10 Items per crawl db 25 million Search Keywords 200 per site collection. There is a max limit of 5000, which can then be modified by editing the web.config/client.config. Concurrent # of workflows on a content db 15. Workflows running in the timer service are not counted in this limit. Further workflows are queued. Can be configured via the Set-SPFarmConfig powershell commandlet. Number of events picked by the workflow timer job and delivered to workflows 100. You can increase this limit by running additional instances of the workflow timer service. Visio services file size 50MB Visio web drawing recalculation timeout 120 seconds Configurable via – Powershell commandlet Set-SPVisioPerformance Visio services minimum and maximum cache age for data connected diagrams 0 to 24 hours. Default is 60 minutes. 
Configurable via – Powershell commandlet Set-SPVisioPerformance   Soft Limits - Content Databases 300 per web app Application Pools 10 per web server Managed Paths 20 per web app Content Database Size 200GB per Content DB Size of 1 site collection 100GB # of sites in a site collection 250,000 Documents in a library 30 Million, with nesting. Depends heavily on type and usage and size of documents. Items 30 million. Depends heavily on usage of items. SPGroups one SPUser can be in 5000 Users in a site collection 2 million, depends on UI, nesting, containers and underlying user store AD Principals in a SPGroup 5000 SPGroups in a site collection 10000 Search Service Instances 20 Indexed Items in Search 100 million Crawl Log entries 100 million Search Alerts 1 million per search application Search Crawled Properties 1/2 million URL removals in search 100 removals per operation User Profiles 2 million per service application Social Tags 500 million per social database Comment on the article ....

    Read the article

  • NTFS Issues in Windows 7 and 2008 R2 - 'Is it a Bug?'

    - by renewieldraaijer
    I have been using the various versions of the Microsoft Windows product line since NT4 and I really thought I knew the ins and outs about the NTFS filesystem by now. There were always a few rules of thumb to understand what happens if you move data around. These rules were: "If you copy data, the copied data will inherit the permissions of the location it is being copied to. The same goes for moving data between disk partitions. Only when you move data within the same partition, the permissions are kept."  Recently I was asked to assist in troubleshooting some NTFS related issues. This forced me to have another good look at this theory. To my surprise I found out that this theory does not completely stand anymore. Apparently some things have changed since the release of Windows Vista / Windows 2008. Since the release of these Operating Systems, a move within the same disk partition results in the data inheriting the permissions of the location it is being copied into. A major change in the NTFS filesystem you would think!  Not quite! The above only counts when the move operation is being performed by using Windows Explorer. A move by using the 'move' command from within a cmd prompt for example, retains the NTFS permissions, just like before in Windows XP and older systems. Conclusion: The Windows Explorer is responsible for changing the ACL's of the moved data. This is a remarkable change, but if you follow this theory, the resulting ACL after a move operation is still predictable.  We could say that since Windows Vista and Windows 2008, a new rule set applies: "If you copy data, the copied data will inherit the permissions of the location it is being copied to. Same goes for moving data between disk partitions and within disk partitions. Only when you move data within the same partition by using something else than the Windows Explorer, the permissions are kept." The above behavior should be unchanged in Windows 7 / Windows 2008 R2, compared to Windows Vista / 2008. But somehow the NTFS permissions are not so predictable in Windows 7 and Windows 2008 R2. Moving data within the same disk partition the one time results in the permissions being kept and the next time results in inherited permissions from the destination location. I will try to demonstrate this in a few examples: Example 1 (Incorrect behavior): Consider two folders, 'Folder A' and 'Folder B' with the following permissions configured.                    Now we create the test file 'test file 1.txt' in 'Folder A' and afterwards move this file to 'Folder B' using Windows Explorer.                       According to the new theory, the file should inherit the permissions of 'Folder B' and therefore 'Group B' should appear in the ACL of 'test file 1.txt'. In the screenshot below the resulting permissions are displayed. The permissions from the originating location are kept, while the permissions of 'Folder B' should be inherited.                   Example 2 (Correct behavior): Again, consider the same two folders. This time we make a small modification to the ACL of 'Folder A'. We add 'Group C' to the ACL and again we create a file in 'Folder A' which we name 'test file 2.txt'.                    Next, we move 'test file 2.txt' to 'Folder B'.                       Again, we check the permissions of 'test file 2.txt' at the target location. We can now see that the permissions are inherited. This is what should be happening, and can be considered 'correct behavior' for Windows Vista / 2008 / 7 / 2008 R2. 
It remains uncertain why this behavior is so inconsistent. At this time it is under investigation with Microsoft Support. The investigation has been going on for the last two weeks, and it is beginning to look like there is no rational reason for this other than a bug in Windows Explorer in Windows 7 and 2008 R2. As soon as there is any certainty on this, I will note it here in this blog. The examples above are harmless tests run on my own laptop. If you create the same set of folders and groups and configure exactly the same permissions, you will see exactly the same behavior. Be sure to use Windows 7 or Windows 2008 R2. Initially the problem arose at a customer site, where move operations by users on fileserver data produced unpredictable results. This resulted in the wrong set of people having access permissions on data they should not have permissions to. Of course this is something we want to prevent at all costs. I have also done several tests with move operations using the move command in a cmd prompt. That way the behavior is always consistent. The inconsistent behavior is only exposed when using Windows Explorer to initiate the move operation, and only on Windows 7 or Windows 2008 R2 systems. It is evident that this behavior changes when the ACL of a folder has been changed, for example by adding an extra entry. The reason for this remains uncertain, though. To be continued…. A Dutch version of this post can be found at: http://blogs.platani.nl/?p=612

    Read the article

  • ReSharper File Location

    - by Ben Griswold
    By default, the ReSharper cache is stored in the solution folder.  It’s one extra folder and one extra .user file.  It’s no big deal but it does clutter up your solution a bit – especially since the files provide no real value. I prefer to store the ReSharper cache in the system Temp folder.  This setting is available by visiting ReSharper > Options > Environment > General. Just update where you’d like to store the ReSharper cache and you’re good to go.  Note, the .user file continues to linger around the solution folder but at least the _ReSharper.SolutionName folder is moved out of sight.

    Read the article

  • How to pass GET parameters to rewritten URL?

    - by Jakobud
    I have an .htaccess rewrite rule like this:

      RewriteCond %{SCRIPT_FILENAME} !-d
      RewriteCond %{SCRIPT_FILENAME} !-f
      RewriteRule ^search/(.*)$ search.php?q=$1

    What this does is: if someone visits http://www.mysite.com/search/test, the URI that is really processed is http://www.mysite.com/search.php?q=test. Now, if I try to pass an extra GET parameter to my rewritten URL, the parameter is ignored. So if I visit here: http://www.mysite.com/search/whatever?extra=true the parameter extra is ignored. It doesn't seem to get passed at all. Can this problem be fixed? If so, how?
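
    Apache's QSA (query string append) flag exists for exactly this case: without it, the query string generated by the rule replaces the one from the original request. A sketch of the adjusted rule (the L flag is an assumption that no later rules should run):

      RewriteCond %{SCRIPT_FILENAME} !-d
      RewriteCond %{SCRIPT_FILENAME} !-f
      RewriteRule ^search/(.*)$ search.php?q=$1 [QSA,L]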

    Read the article

  • Mass Metadata Updates with Folders

    - by Kyle Hatlestad
    With the release of WebCenter Content PS5, a new folder architecture called 'Framework Folders' was introduced.  This is meant to replace the folder architecture of 'Folders_g'.  While the concepts of a folder structure and access to those folders through Desktop Integration Suite remain the same, the underlying architecture of the component has been completely rewritten.  One of the main goals of the new folders is to scale better at large volumes and remove the limitations of 1000 content items or sub-folders within a folder.  Along with the new architecture, it has a new look and a few additional features have been added.  One of those features is Query Folders.  These are folders that are populated simply by a query rather than by literally putting items within the folders.  This is something that the Library has provided, but it has always taken an administrator to define them through the Web Layout Editor.  Now users can quickly define query folders anywhere within the standard folder hierarchy. [ Read More ]

    Read the article

  • SEO With a Linkwheel

    You will be amazed to know that more than 68% of internet users do not go further than the first page of search results. That means you have to be among the top-ranked pages to get any business online. If you are not among those listed on the first page of search results, you are missing out on most of the users searching for products, information and content on Google and other search engines. A linkwheel is a wonderful strategy for search engine optimization (SEO). If done well, your search engine rankings will skyrocket in a very short time.

    Read the article

  • How do I make Nautilus windows stick for drag & drop?

    - by e-satis
    When you drag and drop a folder with Nautilus, you must carefully arrange both windows on non-overlapping areas of your screen; otherwise, selecting one window will bring it to the front, hiding the second one. On Windows, doing so keeps the explorer.exe window at the back and lets you drag and drop the folder. I suppose it detects a long click to decide whether or not to bring the window to the front. Is that possible with Ubuntu? Now, I know Nautilus has split panes (press F3), but that's not handy. Most of the time you open a folder, THEN decide to copy. With a split pane, you must decide, THEN split the pane and go to the right folder.

    Read the article

  • How can I create a symlink to the location that Ubuntu 10.10 mounts a CD?

    - by Michael Curran
    In Ubuntu 10.10, when I insert a CD or DVD into my optical drive, the system mounts it in a folder called /media/XYZ, where XYZ is the disc's label. This has caused problems with Wine: in order for an application to verify that its CD is present, Wine uses a symlink pointing to the mounted CD's folder. In this case, that folder must be /media/XYZ, but with a different disc the folder would be different. I would like to know if there is a way to create a symlink that will always point to the mounted folder for a given /dev/cdrom* device, or how to force the system to always mount CDs to the same location (i.e. /media/cdrom).
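
    One possible route (a sketch, not verified on 10.10): give the drive a fixed mount point in /etc/fstab so it stops being auto-mounted under the volume label. The device name below is an assumption; check yours with ls -l /dev/cdrom:

      # create the mount point once:  sudo mkdir -p /media/cdrom
      # then add a line like this to /etc/fstab:
      /dev/sr0   /media/cdrom   udf,iso9660   user,noauto   0   0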

    Read the article

  • Be careful when Git suppresses bin Folders

    - by Marko Apfel
    Initial situation: For Visual Studio projects, the typical content of a .gitignore file often contains the line bin or [B|b]in. It is used to stop Git from tracking compile output as repository-relevant data.

    Problem: But keep in mind that this will also suppress the bin folders of additional stuff like frameworks and toolsets. For instance, Microsoft.SDKs contains a folder named Bin with a lot of programs, and Simian contains a folder named bin with the programs themselves. If you also store such artifacts in the repository - according to the principle of a "self-containing project" - you could lose the content of those bin folders!

    Solution: So far I don't have a good idea. So I check each newly added toolset or framework for whether it has such a bin folder. If it does, I add that bin folder manually to the repository so that Git tracks it.
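
    A hedged .gitignore sketch of one way to keep vendored tool binaries while still ignoring build output (the Tools/... paths are only examples; a negation like this only works as long as no parent directory of the path is itself ignored):

      # ignore build output
      [Bb]in/
      [Oo]bj/

      # re-include specific vendored bin folders that should stay in the repo
      !Tools/Simian/bin/
      !Tools/Microsoft.SDKs/Bin/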

    Read the article

  • How to add another OS entry in Wubi grub

    - by Amey Jah
    I am trying to install another linux distro besides ubuntu. However, I want to retain my existing windows based loader. Currently, as per my knowledge, MsDos loads grub which then loads Ubuntu (with loop back trick). Now, I have a new linux distro installed on /dev/sda8 (/boot for new distro) where as /root for that OS is installed on /dev/sda9. I tried following steps 1. Add entry into 40_custom of ubuntu grub 2. update grub But upon booting via that entry, it is not able to load the new OS and shows me blank screen. What could be the problem? Additional data: grub.cfg file of ubuntu menuentry 'Ubuntu' --class ubuntu --class gnu-linux --class gnu --class os $menuentry_id_option 'gnulinux-simple-fc296be2-8c59-4f21-a3f8-47c38cd0d537' { gfxmode $linux_gfx_mode insmod gzio insmod ntfs set root='hd0,msdos5' if [ x$feature_platform_search_hint = xy ]; then search --no-floppy --fs-uuid --set=root --hint-bios=hd0,msdos5 --hint-efi=hd0,msdos5 --hint-baremetal=ahci0,msdos5 01CD7BB998DB0870 else search --no-floppy --fs-uuid --set=root 01CD7BB998DB0870 fi loopback loop0 /ubuntu/disks/root.disk set root=(loop0) linux /boot/vmlinuz-3.5.0-19-generic root=UUID=01CD7BB998DB0870 loop=/ubuntu/disks/root.disk ro quiet splash $vt_handoff initrd /boot/initrd.img-3.5.0-19-generic } submenu 'Advanced options for Ubuntu' $menuentry_id_option 'gnulinux-advanced-fc296be2-8c59-4f21-a3f8-47c38cd0d537' { menuentry 'Ubuntu, with Linux 3.5.0-19-generic' --class ubuntu --class gnu-linux --class gnu --class os $menuentry_id_option 'gnulinux-3.5.0-19-generic-advanced-fc296be2-8c59-4f21-a3f8-47c38cd0d537' { gfxmode $linux_gfx_mode insmod gzio insmod ntfs set root='hd0,msdos5' if [ x$feature_platform_search_hint = xy ]; then search --no-floppy --fs-uuid --set=root --hint-bios=hd0,msdos5 --hint-efi=hd0,msdos5 --hint-baremetal=ahci0,msdos5 01CD7BB998DB0870 else search --no-floppy --fs-uuid --set=root 01CD7BB998DB0870 fi loopback loop0 /ubuntu/disks/root.disk set root=(loop0) echo 'Loading Linux 3.5.0-19-generic ...' linux /boot/vmlinuz-3.5.0-19-generic root=UUID=01CD7BB998DB0870 loop=/ubuntu/disks/root.disk ro quiet splash $vt_handoff echo 'Loading initial ramdisk ...' initrd /boot/initrd.img-3.5.0-19-generic } menuentry 'Ubuntu, with Linux 3.5.0-19-generic (recovery mode)' --class ubuntu --class gnu-linux --class gnu --class os $menuentry_id_option 'gnulinux-3.5.0-19-generic-recovery-fc296be2-8c59-4f21-a3f8-47c38cd0d537' { insmod gzio insmod ntfs set root='hd0,msdos5' if [ x$feature_platform_search_hint = xy ]; then search --no-floppy --fs-uuid --set=root --hint-bios=hd0,msdos5 --hint-efi=hd0,msdos5 --hint-baremetal=ahci0,msdos5 01CD7BB998DB0870 else search --no-floppy --fs-uuid --set=root 01CD7BB998DB0870 fi loopback loop0 /ubuntu/disks/root.disk set root=(loop0) echo 'Loading Linux 3.5.0-19-generic ...' linux /boot/vmlinuz-3.5.0-19-generic root=UUID=01CD7BB998DB0870 loop=/ubuntu/disks/root.disk ro recovery nomodeset echo 'Loading initial ramdisk ...' 
initrd /boot/initrd.img-3.5.0-19-generic } } ### END /etc/grub.d/10_lupin ### menuentry 'Linux, with Linux core repo kernel' --class arch --class gnu-linux --class gnu --class os $menuentry_id_option 'gnulinux-core repo kernel-true-0f490b6c-e92d-42f0-88e1-0bd3c0d27641'{ load_video set gfxpayload=keep insmod gzio insmod part_msdos insmod ext2 set root='hd0,msdos8' if [ x$feature_platform_search_hint = xy ]; then search --no-floppy --fs-uuid --set=root --hint-bios=hd0,msdos8 --hint-efi=hd0,msdos8 --hint-baremetal=ahci0,msdos8 0f490b6c-e92d-42f0-88e1-0bd3c0d27641 else search --no-floppy --fs-uuid --set=root 0f490b6c-e92d-42f0-88e1-0bd3c0d27641 fi echo 'Loading Linux core repo kernel ...' linux /boot/vmlinuz-linux root=UUID=0f490b6c-e92d-42f0-88e1-0bd3c0d27641 ro quiet echo 'Loading initial ramdisk ...' initrd /boot/initramfs-linux.img } menuentry 'Linux, with Linux core repo kernel (Fallback initramfs)' --class arch --class gnu-linux --class gnu --class os $menuentry_id_option 'gnulinux-core repo kernel-fallback-0f490b6c-e92d-42f0-88e1-0bd3c0d27641' { load_video set gfxpayload=keep insmod gzio insmod part_msdos insmod ext2 set root='hd0,msdos8' if [ x$feature_platform_search_hint = xy ]; then search --no-floppy --fs-uuid --set=root --hint-bios=hd0,msdos8 --hint-efi=hd0,msdos8 --hint-baremetal=ahci0,msdos8 0f490b6c-e92d-42f0-88e1-0bd3c0d27641 else search --no-floppy --fs-uuid --set=root 0f490b6c-e92d-42f0-88e1-0bd3c0d27641 fi echo 'Loading Linux core repo kernel ...' linux /boot/vmlinuz-linux root=UUID=0f490b6c-e92d-42f0-88e1-0bd3c0d27641 ro quiet echo 'Loading initial ramdisk ...' initrd /boot/initramfs-linux-fallback.img } lsblk NAME MAJ:MIN RM SIZE RO TYPE MOUNTPOINT sda 8:0 0 931.5G 0 disk +-sda1 8:1 0 39.2M 0 part +-sda2 8:2 0 19.8G 0 part +-sda3 8:3 0 205.1G 0 part +-sda4 8:4 0 1K 0 part +-sda5 8:5 0 333.7G 0 part /host +-sda6 8:6 0 233.4G 0 part +-sda7 8:7 0 100.4G 0 part +-sda8 8:8 0 100M 0 part +-sda9 8:9 0 14.7G 0 part +-sda10 8:10 0 21.4G 0 part +-sda11 8:11 0 3G 0 part sr0 11:0 1 1024M 0 rom loop0 7:0 0 29G 0 loop / blkid /dev/loop0: UUID="fc296be2-8c59-4f21-a3f8-47c38cd0d537" TYPE="ext4" /dev/sda1: SEC_TYPE="msdos" LABEL="DellUtility" UUID="5450-4444" TYPE="vfat" /dev/sda2: LABEL="RECOVERY" UUID="78C4FAC1C4FA80A4" TYPE="ntfs" /dev/sda3: LABEL="OS" UUID="DACEFCF1CEFCC6B3" TYPE="ntfs" /dev/sda5: UUID="01CD7BB998DB0870" TYPE="ntfs" /dev/sda6: UUID="01CD7BB99CA3F750" TYPE="ntfs" /dev/sda7: LABEL="Windows 8" UUID="01CDBFB52F925F40" TYPE="ntfs" /dev/sda8: UUID="cdbb5770-d29c-401d-850d-ee30a048ca5e" TYPE="ext2" /dev/sda9: UUID="0f490b6c-e92d-42f0-88e1-0bd3c0d27641" TYPE="ext2" /dev/sda10: UUID="2e7682e5-8917-4edc-9bf9-044fea2ad738" TYPE="ext2" /dev/sda11: UUID="6081da70-d622-42b9-b489-309f922b284e" TYPE="swap Any help is appreciated. Please let me know if you need any extra data.
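
    A hedged sketch of a 40_custom entry for the new distro, based on the partitions shown above (sda8 = /boot, sda9 = /). If sda8 really is a dedicated /boot partition, the kernel and initramfs live at the root of that partition, so the paths should not start with /boot/ - that is one difference from the custom entry shown above. The kernel and initramfs file names are assumptions; check what actually exists on sda8, and run sudo update-grub after editing /etc/grub.d/40_custom:

      menuentry 'New distro (custom)' {
          insmod part_msdos
          insmod ext2
          set root='hd0,msdos8'
          # UUID below is sda9, the new distro's / partition (from blkid)
          linux  /vmlinuz-linux root=UUID=0f490b6c-e92d-42f0-88e1-0bd3c0d27641 ro quiet
          initrd /initramfs-linux.img
      }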

    Read the article

  • batch file to deploy files

    - by Martin Michalak
    hi I have created batch file which pulls info from *.txt file and deploy code from the source to destination: SET Source=%1 if exist %Source% ( ECHO Source for WEB exists ) else ( ECHO Wrong build%Source% doesn't exist GOTO Menu ) SET Server=%2 SET AppPool=%3 SET Destination=%4 SET Folder=%5 SET ENV=%6 SET AppName=%7 SET Envlog=%8 ECHO Deployment of WEB > %Envlog% %Date% %Time% echo. @ECHO Stopping App Pools @ECHO Stopping App Pools >> %Envlog% %Date% %Time% D:\ICTTools\PSEXEC.EXE -d \\%Server% cmd.exe /c c:\windows\system32\inetsrv\appcmd STOP apppool /apppool.name:%AppPool% echo. @ECHO App Pools will be stopped in the background @ECHO App Pools will be stopped in the background >> %Envlog% %Date% %Time% Pause echo. IF EXIST "%Destination%" ( ECHO Deleting %AppName% %Folder% RMDIR %Destination% /s /q ECHO Destination Folder %Folder% Deleted ECHO Destination Folder %Folder% Deleted >> %Envlog% %Date% %Time% ) else ( ECHO Destination Folder %Destination% does not exist, please check ECHO Destination Folder %Destination% does not exist, please check >> %Envlog% %Date% %Time% Pause ) echo. @ECHO Starting Robocopy for %AppName% @ECHO Starting Robocopy for %AppName% >> %Envlog% %Date% %Time% echo. START /WAIT /MIN ROBOCOPY.EXE %Source% %Destination% *.* /S /NP /R:3 /W:5 /LOG:"Logs\Robo%AppName%%ENV%.log" D:\Tools\Windiff\windiff.exe %Source% %Destination% echo. @ECHO Finished with Robocopy @ECHO Finished with Robocopy >> %Envlog% %Date% %Time% echo. @ECHO Checking if App pools stopped: @ECHO Checking if App pools stopped: >> %Envlog% %Date% %Time% D:\ICTTools\PSEXEC.EXE \\%Server% c:\windows\system32\inetsrv\appcmd LIST apppool /apppool.name:%AppPool% @echo off set /p ask=All app pools stopped? (y/n) if %ask%==y (echo Great, please continue with deployemnt) else echo Before continuing please check why app pools did not stop @echo App pools stopped?: %ask% >> %Envlog% %Date% %Time% DEL %Source%\web.config echo. @ECHO Production Config check if exist "%Destination%\%ENV%-Web.config" ( echo. ECHO The Application production configuration file does exist. ECHO The Application production configuration file does exist. >> %Envlog% %Date% %Time% COPY %Destination%\%ENV%-Web.config web.config echo. ECHO Production %ENV%-Web.config has been renamed to web.config ECHO Production %ENV%-Web.config has been renamed to web.config >> %Envlog% %Date% %Time% ) else ( ECHO The Application production configuration file is missing in Production %AppName% ECHO The Application production configuration file is missing in Production %AppName% >> %Envlog% %Date% %Time% explorer %Destination% Pause ) echo. @ECHO Confirm that configs were renamed correclty, if yes please hit any key to START APP Pools @ECHO Confirm that configs were renamed correclty, if yes please hit any key to START APP Pools >> %Envlog% %Date% %Time% Pause echo. @ECHO Start %AppName% Application Pool >> %Envlog% %Date% %Time% D:\ICTTools\PSEXEC.EXE \\%Server% c:\windows\system32\inetsrv\appcmd START apppool /apppool.name:%AppPool% @echo off set /p ask=All app pools started? (y/n) if %ask%==y (echo Great, please continue with deployemnt) else echo Before continuing please check why app pools did not start @echo App pools started?: %ask% >> %Envlog% %Date% %Time% Pause echo. @ECHO Build Version for %AppName% @ECHO Build Version for %AppName% >> %Envlog% %Date% %Time% type %Destination%\buildinfo.xml echo. ECHO ............................................... @ECHO ...........Deployment Compelted................ 
@ECHO ...........Deployment Compelted................>> %Envlog% %Date% %Time% ECHO ............................................... here are my issues: Lets say I am running code for 3 servers, then for each instance: For all three servers I am performing destination folder delete even so destination folder is always the same, the code should only delete it in the 1st instance (when code is deployed to first server) then I would prefer if script would check if the code from the source and destination is the same and if it is it should delete the folder or not. Then based on 1: a) deleting web.config and renaming should only happen if code in destination is new b) Robocopy should not override files if they are the same I think there is /Xo option to do that any idea how to achieve that? :)
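
    Two hedged sketches for the issues listed above. For (1), one simple approach is an extra argument that marks the first server so the delete runs only once; for (2), robocopy already skips files whose size and timestamp match, and /XO additionally skips files that are older at the source than at the destination:

      REM (1) only clear the destination for the first server - assumes the
      REM     caller passes FIRST as a ninth argument for that server only
      IF /I "%9"=="FIRST" (
          IF EXIST "%Destination%" RMDIR "%Destination%" /S /Q
      )

      REM (2) identical files are skipped by default; /XO also skips older source files
      START /WAIT /MIN ROBOCOPY.EXE %Source% %Destination% *.* /S /XO /NP /R:3 /W:5 /LOG:"Logs\Robo%AppName%%ENV%.log"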

    Read the article

  • Places Folders Open in Archive Manager

    - by PansophySR
    I am relatively new to Ubuntu. I am currently running 10.10, which was an upgrade from 10.04, which was an upgrade from 9.10, which was a fresh install. I had never compressed anything in Ubuntu, but because I wanted to use the contents of a large folder on a Windows machine, I installed 7zip. Using Places, I navigated to the folder I wanted to compress, right-clicked, chose Compress, selected 7-zip and started the compression. This took many minutes to complete (the final 7z file is over 2.2 GB), but when I copied it to the Windows machine, 7-Zip handled it fine. However, now when I open Places, the Home Folder, user folder, Documents, Music, Pictures, Videos and Downloads all open in Archive Manager, which gives an error message that it "Could not create the archive" because the "Archive type not supported." If I open Places/Computer, choose the usr folder from the places list on the left and right-click/Properties on any of the folders, Music for instance, there is no place to change the "open with" application. Does anyone know how to get this working again?
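
    A hedged suggestion (a sketch of the usual fix, not verified against this exact setup): the folder MIME type has most likely been re-associated with Archive Manager, and resetting the default handler for directories from a terminal should restore it. The desktop file name is an assumption and may differ on 10.10 (for example nautilus-folder-handler.desktop instead of nautilus.desktop):

      # make Nautilus the default application for folders again
      xdg-mime default nautilus.desktop inode/directory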

    Read the article
