Search Results

Search found 65497 results on 2620 pages for 'list files'.


  • User unable to delete folder / files "File in use by another user" Server 2003

    - by Az
    I am administering a standalone Windows 2003 Terminal Server with no domain membership. Occasionally (about once a week or so) a user will attempt to delete a sub-folder in a shared folder and gets denied with "File in use by another user". I checked the Shared Folders snap-in and that folder is not open. She has full control and is the owner of the folder as well. I even checked "Effective Permissions" for some of the folders/files she can't delete, and she truly has full control. I am able to delete the folder as Administrator with no problem. Another odd thing: she can delete the files IN the folder most of the time (this issue happens on both folders and files in the share). Sometimes merely waiting a day or two will allow her to delete the folder or files. I am curious as to why she gets the message that the folder is in use when she is the creator/owner with full control, yet I don't get it simply as a member of the Admin group. If anyone out there has any ideas I'd love to hear them! THANK YOU.

    Read the article

  • Updating files with a Perforce trigger before submit [migrated]

    - by phantom-99w
    I understand that this question has, in essence, already been asked, but that question did not have an unequivocal answer, so please bear with me. Background: In my company, we use Perforce submission numbers as part of our versioning. Regardless of whether this is a correct method or not, that is how things are. Currently, many developers do separate submissions for code and documentation: first the code and then the documentation, to update the client-facing docs with what the new version numbers should be. I would like to streamline this process. My thoughts are as follows: create a Perforce trigger (which runs on the server side) which scans the submitted documentation files (such as .txt) for a unique term (such as #####PERFORCE##CHANGELIST##NUMBER###ROFL###LOL###WHATEVER#####) and then replaces it with the value of what the changelist number will be when submitted. I already know how to determine this value. What I cannot figure out is how or where to update the files. I have already determined that using the change-content trigger (whether possible or not), which "fire[s] after changelist creation and file transfer, but prior to committing the submit to the database", is the way to go. At this point the files need to exist somewhere on the server. How do I determine the (temporary?) location of these files from within, say, a Python script, so that I can update them, or run sed, to replace the placeholder value with the intended value? The online Perforce documentation I have found so far has not been very explicit on whether this is possible or how the mechanics of a submission at this stage would work.

    Read the article

  • How to automate downloading files?

    - by Damon
    I got a book which had a pass to access digital versions of hi-res scans of much of the artwork in the book. Amazing! Unfortunately, the presentation of all of these is 177 pages of 8 images each, with links to zip files of jpgs. It is extremely tedious to browse, and I would love to be able to get all the files at once rather than sitting and clicking through each one separately. The pages run from archive_bookname/index.1.htm to archive_bookname/index.177.htm, and each of those pages has 8 links to files such as <snip>/downloads/_Q6Q9265.jpg.zip, <snip>/downloads/_Q6Q7069.jpg.zip and <snip>/downloads/_Q6Q5354.jpg.zip, which don't quite go in order. I cannot get a directory listing of the parent /downloads/ folder. Also, the files are behind a login wall, so using a non-browser tool might be difficult without knowing how to recreate the session info. I've looked into wget a little, but I'm pretty confused and have no idea if it will help me with this. Any advice on how to tackle this? Can wget do this for me automatically?
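    A minimal sketch of the wget approach, assuming you can export your browser's session cookies to a cookies.txt file (to get past the login wall) and that the index pages link straight to the zips; the host name below is a placeholder:

        # loop over the 177 index pages and keep only the linked *.jpg.zip files
        for i in $(seq 1 177); do
            wget --load-cookies cookies.txt \
                 --recursive --level=1 \
                 --accept '*.jpg.zip' \
                 "http://example.com/archive_bookname/index.$i.htm"
        done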

    Read the article

  • How to avoid duplicates when copying files that have been renamed at the destination

    - by Benoitt
    I have to get pictures from a folder – with subfolders which are updated automatically – keeping their extensions. These files have to be copied into a folder where a PHP-based website will edit them (by renaming them and creating an XML file) so that they are downloadable and integrated into an XML feed. Because of the rename function of the script, when I perform the copy again, all the files are duplicated, because the script has already renamed the original ones. I've tried a few things with rsync, but I'm looking for something more powerful because I can't copy files against an external "history".

        #!/bin/bash
        find '/home/name/picture' -name '*.jpg' | while read FILE ; do
            rsync --backup --backup-dir=incremental --suffix=.old "$FILE" /var/www/media ;
        done
        wget --spider 'http://myscript.php' ;
        #exit 0

    PS: As a little addition, I'd like to replace '.' with a space just after the *.jpg copy. My PHP script has some problems identifying files with extra dots in the name, because of the extension. I'm thinking about a command with find – like I did before – combined with sed? Is that a good idea?
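    A hedged sketch of that last step, assuming the copies land in /var/www/media as in the script above: replace the dots in each basename with spaces while keeping the .jpg extension. The loop below is one way to do it, not the only one:

        for f in /var/www/media/*.jpg; do
            base=$(basename "$f" .jpg)                      # name without the extension
            mv -n "$f" "/var/www/media/${base//./ }.jpg"    # replace dots with spaces
        done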

    Read the article

  • Non-restored Files Corrupted on System Restore

    - by Yar
    I restored OS X 10.6.2 today (it was 10.6.3 and not booting) by copying the system over from a backup. The data directories were not touched. In the data directories, I'm seeing some files as 0 bytes and getting permission-denied errors when copying, even when using sudo cp or the Finder itself. Other programs (such as zip) take the files at face value and see no permission problems, but they also see the files as zero bytes, which would be game over for recovery.

        cp: .git/objects/fe/86b676974a44aa7f128a55bf27670f4a1073ca: could not copy extended attributes to /eraseme/blah/.git/objects/fe/86b676974a44aa7f128a55bf27670f4a1073ca: Operation not permitted

    I have tried sudo chown, sudo chmod -R 777 and sudo chflags -R nouchg, none of which change the end result. Strangely, this only affects my .git directories (perhaps because they start with a period, though renaming them -- which works -- does not change anything). What else can I do to take ownership of these files? Edit: This question comes from Stack Overflow because I originally thought it was a Git problem. It's definitely not (just) Git. Anyway, this is to help put some of the comments in context.

    Read the article

  • Copy/paste speed very slow for a large number of files on Windows [closed]

    - by Arno2501
    I've run the following test. I created a folder containing 15,000 files of 400 bytes using this batch script:

        @ECHO off
        SET times=15000
        FOR /L %%i IN (1,1,%times%) DO (
            fsutil file createnew filename%%i.txt 400
        )

    Then I copy/paste it on my Windows computer using this command:

        robocopy LargeNumberOfFiles\ LargeNumberOfFiles2\

    After it has completed I can see that the transfer rate was 915,810 bytes/sec, which is less than 1 MB/s. It took several seconds to copy 7 MB, which is very slow. I've tried the same with a folder containing a single file of 50 MB, and the transfer rate is 1,219,512,195 bytes/sec (yes, GB/s) -- effectively instantaneous. Why does copying a large number of files take so much time and so many resources on a Windows filesystem? Please note that I've tried the same on a Linux system which runs on the same computer in a virtual machine (VMware Player) with an ext3 filesystem; I use the cp command and the copy is instantaneous. Please also note the following: there is no antivirus; I've tested this behaviour on multiple Windows computers (always NTFS) and I always get comparable results (transfer rate under 1 MB/s, on average 7-8 seconds to copy 7 MB); I've tested on multiple Linux ext3 systems and the copy is always instantaneous for that amount (15,000 files of 400 bytes). The question is about understanding what makes the Windows filesystem so slow at copying a large number of files compared to, for instance, a Linux one.

    Read the article

  • Memory Usage by Mapped Files (Win7 64Bit)

    - by Dexter
    When copying files from/to my external USB3 HDD, memory usage in Win7 goes up to 100% and remains there. I'm not sure whether this is a problem caused by faulty drivers or not, but I already have the current version of them (Etron USB3 controller on a Gigabyte 990FXA board). Using RAMMap it becomes obvious that the files that are to be copied are mapped into memory. Clicking on Empty > Empty System Working Set seems to temporarily fix the problem (without causing any trouble with the file copy process), but it needs to be done every few seconds. Is there any way to schedule this operation to happen every 10 or so seconds on its own? What underlying system command is RAMMap using? Or, alternatively, is there any way to limit how much RAM mapped files may use in Windows 7? I know mapped files would usually be removed from memory if other programs need the memory, but while memory usage is at 100% the system starts freezing up for half a second or so every time I click anything, so the automatic removal of unused memory contents seems to be failing here.

    Read the article

  • Distinctly LINQ – Getting a Distinct List of Objects

    - by David Totzke
    Let’s say that you have a list of objects that contains duplicate items and you want to extract a subset of distinct items. This is pretty straightforward in the trivial case where the duplicate objects are considered the same, such as in the following example:

        List<int> ages = new List<int> { 21, 46, 46, 55, 17, 21, 55, 55 };
        IEnumerable<int> distinctAges = ages.Distinct();
        Console.WriteLine("Distinct ages:");
        foreach (int age in distinctAges)
        {
            Console.WriteLine(age);
        }
        /*
         This code produces the following output:

         Distinct ages:
         21
         46
         55
         17
        */

    What if you are working with reference types instead? Imagine a list of search results where items in the results, while unique in and of themselves, also point to a parent. We’d like to be able to select a bunch of items in the list but then see only a distinct list of parents. Distinct isn’t going to help us much on its own as all of the items are distinct already. Perhaps we can create a class with just the information we are interested in, like the Id and Name of the parents.

        public class SelectedItem
        {
            public int ItemID { get; set; }
            public string DisplayName { get; set; }
        }

    We can then use LINQ to populate a list containing objects with just the information we are interested in and then get rid of the duplicates.

        IEnumerable<SelectedItem> list =
            (from item in ResultView.SelectedRows.OfType<Contract.ReceiptSelectResults>()
             select new SelectedItem { ItemID = item.ParentId, DisplayName = item.ParentName })
            .Distinct();

    Most of you will have guessed that this didn’t work. Even though some of our objects are now duplicates, because we are working with reference types, it doesn’t matter that their properties are the same; they’re still considered unique. What we need is a way to define equality for the Distinct() extension method.

    IEqualityComparer<T>

    Looking at the Distinct method we see that there is an overload that accepts an IEqualityComparer<T>. We can simply create a class that implements this interface and that allows us to define equality for our SelectedItem class.

        public class SelectedItemComparer : IEqualityComparer<SelectedItem>
        {
            public new bool Equals(SelectedItem abc, SelectedItem def)
            {
                return abc.ItemID == def.ItemID && abc.DisplayName == def.DisplayName;
            }

            public int GetHashCode(SelectedItem obj)
            {
                string code = obj.DisplayName + obj.ItemID.ToString();
                return code.GetHashCode();
            }
        }

    In the Equals method we simply do whatever comparisons are necessary to determine equality and then return true or false. Take note of the implementation of the GetHashCode method. GetHashCode must return the same value for two different objects if our Equals method says they are equal. Get this wrong and your comparer won’t work. Even though the Equals method returns true, mismatched hash codes will cause the comparison to fail. For our example, we simply build a string from the properties of the object and then call GetHashCode() on that. Now all we have to do is pass an instance of our IEqualityComparer<T> to Distinct and all will be well:

        IEnumerable<SelectedItem> list =
            (from item in ResultView.SelectedRows.OfType<Contract.ReceiptSelectResults>()
             select new SelectedItem { ItemID = item.dahfkp, DisplayName = item.document_code })
            .Distinct(new SelectedItemComparer());

    Enjoy.

    Dave
    Just because I can…

    Technorati Tags: LINQ,C#

    Read the article

  • NSFetchedResultsController fetch request - updating predicate and UITableView

    - by Macatomy
    In my iPhone Core Data app I have it configured in a master-detail view setup. The master view is a UITableView that lists objects of the List entity. The List entity has a to-many relationship with the Task entity (called "tasks"), and the Task entity has an inverse to-one relationship with List called "list". When a List object is selected in the master view, I want the detail view (another UITableView) to list the Task objects that correspond to that List object. What I've done so far is this. In the detail view controller I've declared a property for a List object:

        @property (nonatomic, retain) List *list;

    Then in the master view controller I use this table view delegate method to set the list property of the detail view controller when a list is selected:

        - (void)tableView:(UITableView *)aTableView didSelectRowAtIndexPath:(NSIndexPath *)indexPath {
            NSManagedObject *selectedObject = [[self fetchedResultsController] objectAtIndexPath:indexPath];
            detailViewController.list = (List *)selectedObject;
        }

    Then, I've overridden the setter for the list property in the detail view controller like this:

        - (void)setList:(List *)newList {
            if (list != newList) {
                [list release];
                list = [newList retain];

                NSPredicate *newPredicate = [NSPredicate predicateWithFormat:@"(list == %@)", list];
                [NSFetchedResultsController deleteCacheWithName:@"Root"];
                [[[self fetchedResultsController] fetchRequest] setPredicate:newPredicate];

                NSError *error = nil;
                if (![[self fetchedResultsController] performFetch:&error]) {
                    NSLog(@"Unresolved error %@, %@", error, [error userInfo]);
                    abort();
                }
            }
        }

    What I'm doing here is setting a predicate on the fetched results to filter out the objects so that I only get the ones that belong to the selected List object. The fetchedResultsController getter for the detail view controller looks like this:

        - (NSFetchedResultsController *)fetchedResultsController {
            if (fetchedResultsController == nil) {
                NSFetchRequest *fetchRequest = [[NSFetchRequest alloc] init];
                NSEntityDescription *entity = [NSEntityDescription entityForName:@"Task" inManagedObjectContext:managedObjectContext];
                [fetchRequest setEntity:entity];

                NSPredicate *predicate = [NSPredicate predicateWithFormat:@"FALSEPREDICATE"];
                [fetchRequest setPredicate:predicate];

                NSSortDescriptor *sortDescriptor = [[NSSortDescriptor alloc] initWithKey:@"name" ascending:YES];
                NSArray *sortDescriptors = [[NSArray alloc] initWithObjects:sortDescriptor, nil];
                [fetchRequest setSortDescriptors:sortDescriptors];

                NSFetchedResultsController *aFetchedResultsController =
                    [[NSFetchedResultsController alloc] initWithFetchRequest:fetchRequest
                                                        managedObjectContext:managedObjectContext
                                                          sectionNameKeyPath:nil
                                                                   cacheName:@"Root"];
                aFetchedResultsController.delegate = self;
                self.fetchedResultsController = aFetchedResultsController;

                [aFetchedResultsController release];
                [fetchRequest release];
                [sortDescriptor release];
                [sortDescriptors release];
            }
            return fetchedResultsController;
        }

    It's almost unchanged from the default in the Core Data project template; the change I made is to add a predicate that always returns false, the reason being that when there is no List selected I don't want any items to be displayed in the detail view (if a list is selected, the predicate is changed in the setter for the list property). However, when I select a list item, nothing really happens. Nothing in the table view changes; it stays empty. I'm sure my logic is flawed in several places; advice is appreciated. Thanks

    Read the article

  • Git: Removing carriage returns from source-controlled files

    - by Blixt
    I've got a Git repository that has some files with DOS format (\r\n line endings). I would like to just run the files through dos2unix (which would change all files to UNIX format, with \n line endings), but how badly would this affect history, and is it recommended at all? I assume that the standard is to always use UNIX line endings for source-controlled files, and optionally switch to OS-specific line endings locally?
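    A minimal sketch of the usual approach, assuming you convert the working tree in a single commit rather than rewriting history (blame for every touched line then points at that commit, but older history is left alone) and then let .gitattributes keep the endings normalized; dos2unix skips binary files by default:

        git ls-files -z | xargs -0 dos2unix
        echo "* text=auto" >> .gitattributes
        git add -A
        git commit -m "Normalize line endings to LF"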

    Read the article

  • Linq to List and IEnumerable issues

    - by Otaku
    I am querying an HTML file with LINQ. It looks something like this:

        <html>
        <body>
        <div class="Players">
            <div class="role">Goalies</div>
            <div class="name">John Smith</div>
            <div class="name">Shawn Xie</div>
            <div class="role">Right Wings</div>
            <div class="name">Jack Davis</div>
            <div class="name">Carl Yuns</div>
            <div class="name">Wayne Gortonia</div>
            <div class="role">Centers</div>
            <div class="name">Lutz Gaspy</div>
            <div class="name">John Jacobs</div>
        </div>
        </body>
        </html>

    What I'm trying to do is create a list of these folks, as in a list of a structure called Players:

        Structure Players
            Public Name As String
            Public Position As String
        End Structure

    But I've quickly found out I don't really know what I'm doing when it comes to LINQ. I've got this far with my queries:

        Dim goalieList = From d In player.Elements _
                         Where d.Value = "Goalies" _
                         Select From g In d.ElementsAfterSelf _
                                Take While (g.@class <> "role") _
                                Select New Players With {.Position = "Goalie", _
                                                         .Name = g.Value}

        Dim centersList = From d In player.Elements _
                          Where d.Value = "Centers" _
                          Select From g In d.ElementsAfterSelf _
                                 Take While (g.@class <> "role") _
                                 Select New Players With {.Position = "Centers", _
                                                          .Name = g.Value}

    This gets me down to the players by position, but then I can't do much with it afterwards; the result type is System.Collections.Generic.IEnumerable(Of System.Collections.Generic.IEnumerable(Of Player)). What I want to do is add these two results to a new list, like:

        Dim playersList As List(Of Players) = Nothing
        playersList.AddRange(centersList)
        playersList.AddRange(goalieList)

    so that I can then query the list and use it. But it kicks the error:

        Unable to cast object of type 'WhereSelectEnumerableIterator`2[System.Xml.Linq.XElement,System.Collections.Generic.IEnumerable`1[Players]]' to type 'System.Collections.Generic.IEnumerable`1[Players]'

    As you can see, I may really have no idea how to work with all these objects/classes. Does anyone have any insight on what I may be doing wrong and how I can resolve it? RESOLVED: The LINQ query needs to return a single IEnumerable, like this:

        Dim goalieList = From l In _
                         (From d In players.Elements _
                          Where d.Value = "Goalies" _
                          Select d.ElementsAfterSelf.TakeWhile(Function(f) f.@class <> "role")) _
                         Select New Players With {.Position = "Goalie", .Name = l.Value}

    and then use goalieList.ToList.

    Read the article

  • How to decrypt encrypted files using a PEM private key

    - by Phil Cole
    I have files which have either been encrypted with a public key and the Blowfish algorithm, or with a public key and the AES-256 algorithm. I'm looking to put together a Perl script that would be able to use the private keys (which I do have) to decrypt the files. The public and private key files are all in PEM format, and while I can find ways of reading the PEM files, and ways of decrypting data with a key, I haven't yet found a way of going from a PEM file to a usable key. Any suggestions?
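    One possibility, assuming the files are S/MIME envelopes produced by something like "openssl smime -encrypt" (file names below are placeholders, and if a custom hybrid format was used this won't apply; in that case you would need to unpack the envelope yourself, e.g. with Crypt::OpenSSL::RSA plus a symmetric cipher module in Perl):

        # decrypt an S/MIME-enveloped file with the PEM private key
        # (adjust -inform to PEM or SMIME depending on how the files were written)
        openssl smime -decrypt -binary -inform DER \
            -in encrypted.dat \
            -inkey private_key.pem \
            -out decrypted.dat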

    Read the article

  • C# XMLSerializer fails with List<T>

    - by Redshirt
    Help... I'm using a singleton class to save all my settings info. It's first utilized by calling Settings.ValidateSettings(@"C:\MyApp"). The problem I'm having is that 'List<ContactInfo> Contacts' is causing the XmlSerializer to fail to write the settings file, or to load said settings. If I comment out the List then I have no problems saving/loading the XML file. What am I doing wrong... Thanks in advance.

        // The actual settings to save
        public class MyAppSettings
        {
            public bool FirstLoad { get; set; }
            public string VehicleFolderName { get; set; }
            public string ContactFolderName { get; set; }

            public List<ContactInfo> Contacts
            {
                get
                {
                    if (contacts == null)
                        contacts = new List<ContactInfo>();
                    return contacts;
                }
                set { contacts = value; }
            }
            private List<ContactInfo> contacts;
        }

        // The class in which the settings are manipulated
        public static class Settings
        {
            public static string SettingPath;
            private static MyAppSettings instance;

            public static MyAppSettings Instance
            {
                get
                {
                    if (instance == null)
                        instance = new MyAppSettings();
                    return instance;
                }
                set { instance = value; }
            }

            public static void InitializeSettings(string path)
            {
                SettingPath = Path.GetFullPath(path + "\\MyApp.xml");
                if (File.Exists(SettingPath))
                {
                    LoadSettings();
                }
                else
                {
                    Instance.FirstLoad = true;
                    Instance.VehicleFolderName = "Cars";
                    Instance.ContactFolderName = "Contacts";
                    SaveSettingsFile();
                }
            }

            // load the settings from the xml file
            private static void LoadSettings()
            {
                XmlSerializer ser = new XmlSerializer(typeof(MyAppSettings));
                TextReader reader = new StreamReader(SettingPath);
                Instance = (MyAppSettings)ser.Deserialize(reader);
                reader.Close();
            }

            // Save the settings to the xml file
            public static void SaveSettingsFile()
            {
                XmlSerializer ser = new XmlSerializer(typeof(MyAppSettings));
                TextWriter writer = new StreamWriter(SettingPath);
                ser.Serialize(writer, Settings.Instance);
                writer.Close();
            }

            public static bool ValidateSettings(string initialFolder)
            {
                try
                {
                    Settings.InitializeSettings(initialFolder);
                }
                catch (Exception e)
                {
                    return false;
                }
                // Do some validation logic here
                return true;
            }
        }

        // A utility class to contain each contact detail
        public class ContactInfo
        {
            public string ContactID;
            public string Name;
            public string PhoneNumber;
            public string Details;
            public bool Active;
            public int SortOrder;
        }

    Read the article

  • git difftool, open all diff files immediately, not in serial

    - by Seba Illingworth
    The default git difftool behavior is to open each diff file serially (it waits for the previous file to be closed before opening the next one). I'm looking for a way to open all the files at once - in Beyond Compare, for example, this would open all the files in tabs within the same BC window. This would make it easier to review a complex set of changes: flick back and forth between the diff files and ignore unimportant files.
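    One possible approach, assuming a git recent enough to support directory diffs (1.7.11 or later) and Beyond Compare 3 installed and resolvable as the "bc3" tool: --dir-diff copies both sides of the diff into temporary directories and opens a single directory comparison instead of launching the viewer once per file.

        git config diff.tool bc3
        git difftool --dir-diff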

    Read the article

  • Finding duplicate files by content across multiple directories

    - by gagneet
    I have downloaded some files from the internet related to a particular topic. Now I wish to check whether the files have any duplicates. The issue is that the names of the files are different, but the content may match. Is there any way to implement some code which will iterate through the multiple folders and report which of the files are duplicates?
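    A minimal sketch using content checksums (GNU coreutils assumed; the folder names are placeholders): files whose MD5 sums collide are printed together as duplicate groups. A dedicated tool such as fdupes -r does the same job if it is installed.

        find folder1 folder2 -type f -exec md5sum {} + | sort | uniq -w32 --all-repeated=separate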

    Read the article

  • Commenting C code, header and source files

    - by pygabriel
    I'm looking for a "best practice" for documenting my C code. Like in any project, I have some header files (".h") and the respective source files (".c"). What kind of comments do you put in the header file? And in the source files? The question arises because, since I commented my header files well, the .c files look like a mess. What are your best practices for keeping code well commented?

    Read the article

  • Why doesn't git commit -a add new files?

    - by splintor
    I'm a bit new to git, and I fail to understand why git commit -a only stages changed and deleted files but not new files. Can anyone explain why it is like this, and why there is no other commit flag to enable adding files and committing in one command? By the way, hg commit -A adds both new and deleted files to the commit.
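    For reference, a minimal workaround that gets close to the hg behaviour: stage everything explicitly, then commit.

        git add -A            # stages new, modified and deleted files
        git commit -m "..."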

    Read the article

  • How do web browser games access temporary files

    - by Phenom
    There is a web game that I play, and I used Fiddler to see what temporary files it downloaded. While I was playing, I deleted all those temporary files, including the sounds and Flash files, but it didn't affect the game at all. Why is that? I checked in Fiddler and it doesn't look like the files were re-downloaded.

    Read the article

  • where to find Ubercart translation files

    - by ernie
    I am trying to update language-specific text files (".po" files) for Ubercart, but it is unclear by whom and where these files are maintained. Several places are cited, but I am not sure which is maintained:

        http://ftp.drupal.org/files/translations/6.x/ubercart/
        http://l10n.privnet.biz/translation_group/

    I would also like a description of how to do this in Drupal. In Drupal (link: admin/build/translate/import) there are several text groups to select. Do I have to repeat the update for each group?

    Read the article

  • not able to get a processed list in jquery variable

    - by Pradyut Bhattacharya
    I have an HTML list:

        <ol id="newlist">
            <li>Test
                <ol>
                    <li>1</li>
                    <li>2</li>
                    <li>3</li>
                </ol>
            </li>
            <li>Another test
                <ol>
                    <li>1</li>
                </ol>
            </li>
            <li>Cool Test
                <ol>
                    <li>1</li>
                    <li>2</li>
                </ol>
            </li>
        </ol>

    Now I have hidden the list using this CSS:

        #newlist li {
            display: none;
            list-style: none;
        }

    I want to display the list and only the descendants which have more than one descendant. The output should be:

        Test
            1
            2
            3
        Another test
        Cool Test
            1
            2

    I have used jQuery and am able to get the output. The code I used:

        $("ol#newlist > li").show();
        for (var i = 0; i < $("ol#newlist > li").length; i++) {
            if ($("ol#newlist > li:eq(" + i + ") ol > li").length > 1)
                $("ol#newlist > li:eq(" + i + ") ol > li").show();
        }

    the sample page here. Now I want the whole list in a single variable, like I can get the lis in a variable:

        var $li = $("ol#newlist > li");

    but the code

        $li.add($("ol#newlist > li:eq(" + i + ") ol > li"));

    is not working... the sample page here. Please help... Thanks Pradyut India

    Read the article

  • Log4Net & RollingFileAppender to generate Xml files

    - by SaguiItay
    I've managed to configure Log4Net with a RollingFileAppender in order to generate XML files. However, the generated files are not valid XML until a "roll" is performed - the XML doesn't have a closing tag. Basically, this prevents the files from being read until they are "closed"/"rolled". Has anyone else encountered this issue? In my previous (custom) solution I had to write the closing tag after writing each entry and overwrite it with the next entry... :(

    Read the article

  • How to search Jar files using Windows Search?

    - by Marcus
    I believe back when we were on Win2K, Windows Search would search through Jar files to locate specific classes, but this doesn't appear to work in XP. Does anyone know how to enable this in XP? Note: to do the search in Win2K we just entered *.jar for the files and "ClassABC" for the search text string, and the search would return any jar files containing class files whose name contained "ClassABC".
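    As a hedged alternative when a Unix-style shell is available on the box (Cygwin or Git Bash, for instance), since jars are just zip archives you can list each one's contents and grep for the class name:

        # print the path of every jar whose listing mentions ClassABC
        find . -name '*.jar' -print0 | while IFS= read -r -d '' jar; do
            unzip -l "$jar" | grep -q 'ClassABC' && echo "$jar"
        done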

    Read the article

  • How to remove files and directories quickly

    - by None
    I am using a Mac. When I use the "rm" command it can only remove files, and the "rmdir" command only removes empty folders. If you have a directory with files and folders, which themselves contain files and folders, and so on, is there any way to delete all the files and folders without all the strenuous command typing? Remember, I am using the Mac bash shell from Terminal, not Microsoft DOS or Linux.
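    A minimal answer in bash (it works the same on macOS and Linux): the -r flag recurses into directories and -f suppresses the prompts. Double-check the path before running; there is no undo.

        rm -rf /path/to/directory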

    Read the article

  • Creating .lib files in CUDA Toolkit 5

    - by user1683586
    I am taking my first faltering steps with CUDA Toolkit 5.0 RC using VS2010. Separate compilation has me confused. I tried to set up a project as a Static Library (.lib), but when I try to build it, it does not create a device-link.obj and I don't understand why. For instance, there are 2 files. A caller function that uses a function f:

        #include "thrust\host_vector.h"
        #include "thrust\device_vector.h"
        using namespace thrust::placeholders;

        extern __device__ double f(double x);

        struct f_func
        {
            __device__ double operator()(const double& x) const
            {
                return f(x);
            }
        };

        void test(const int len, double * data, double * res)
        {
            thrust::device_vector<double> d_data(data, data + len);
            thrust::transform(d_data.begin(), d_data.end(), d_data.begin(), f_func());
            thrust::copy(d_data.begin(), d_data.end(), res);
        }

    And a library file that defines f:

        __device__ double f(double x)
        {
            return x + 2.0;
        }

    If I set the option "generate relocatable device code" to No, the first file will not compile due to an unresolved extern function f. If I set it to -rdc, it will compile, but does not produce a device-link.obj file and so the linker fails. If I put the definition of f into the first file and delete the second, it builds successfully, but now it isn't separate compilation anymore. How can I build a static library like this with separate source files? [Updated here] I called the first caller file "caller.cu" and the second "libfn.cu". The compiler lines that VS2010 outputs (which I don't fully understand) are, for caller:

        nvcc.exe -ccbin "C:\Program Files (x86)\Microsoft Visual Studio 10.0\VC\bin" -I"C:\Program Files\NVIDIA GPU Computing Toolkit\CUDA\v5.0\include" -I"C:\Program Files\NVIDIA GPU Computing Toolkit\CUDA\v5.0\include" -G --keep-dir "Debug" -maxrregcount=0 --machine 32 --compile -g -D_MBCS -Xcompiler "/EHsc /W3 /nologo /Od /Zi /RTC1 /MDd " -o "Debug\caller.cu.obj" "G:\Test_Linking\caller.cu" -clean

    and the same for libfn, then:

        nvcc.exe -gencode=arch=compute_20,code=\"sm_20,compute_20\" --use-local-env --cl-version 2010 -ccbin "C:\Program Files (x86)\Microsoft Visual Studio 10.0\VC\bin" -rdc=true -I"C:\Program Files\NVIDIA GPU Computing Toolkit\CUDA\v5.0\include" -I"C:\Program Files\NVIDIA GPU Computing Toolkit\CUDA\v5.0\include" -G --keep-dir "Debug" -maxrregcount=0 --machine 32 --compile -g -D_MBCS -Xcompiler "/EHsc /W3 /nologo /Od /Zi /RTC1 /MDd " -o "Debug\caller.cu.obj" "G:\Test_Linking\caller.cu"

    and again for libfn.

    Read the article
