Search Results

Search found 49453 results on 1979 pages for 'memory mapped files'.


  • ASP.NET files necessary for development

    - by apollo-creed
    I am just getting started with ASP.NET and have some existing projects to maintain. I have read that ASP.NET projects include a folder called App_Data, a code-behind DLL, .sln solution files, .proj files, etc. Which of these files are necessary for the continued development of an ASP.NET website? Also, are there others which are key to building ASP.NET applications?

    Read the article

  • Windows 7 Sub-Folders hidden in "Program Files" directory

    - by ron tornambe
    I have been Google-searching for an hour now and I am confounded. I am using Inno Setup to install a .NET WinForms application that creates directories and folders on the fly. (I have set the folder options to display hidden files and folders.) Although the files that are added to the "created" folders appear within the application, they do not show up when using Windows Explorer, or even when issuing a dir from a command prompt. I have also modified the application to display (and delete) the contents of these (seemingly imaginary) folders, so I am sure they exist. What am I missing?

    Read the article

  • Webserver sending corrupt or corrupting served files

    - by NotIan
    EDIT: Looks like the problem was a rootkit that corrupted a bunch of low-level Linux commands, including top, ps, ifconfig, netstat and others. The problem was resolved by taking all web files off the server and wiping it. A dedicated server we operate is having a strange issue: files are not being sent complete, or are showing up with garbage data. Example: http://sustainablefitness.com/images/banner_bootcamps.jpg To make matters more confusing, this corruption does NOT happen when the files are served over https. (I would post a link, but I don't have enough rep points; just add an 's' after "http" in the link above.) When I throw load at the server, I get dozens of (swapd)s in top; this is the only thing that really jumps out. I can't post images, but ( imgur.com / ZArSq.png ) is a screenshot of top. I have tried a lot of stuff so far; I am willing to try anything.

    Read the article

  • Encrypt shared files on AD Domain.

    - by Walter
    Can I encrypt shared files on Windows Server and allow only authenticated domain users access to these files? The scenario is as follows: I have a software development company, and I would like to protect my source code from being copied by my programmers. One problem is that some programmers use their own laptops to develop the company's software, so in this scenario it's impossible to prevent developers from copying the source code to their laptops. I thought about the following solution, but I don't know if it's possible to implement: encrypt the source code so that it is accessible (decrypted) only while developers are logged into the AD domain, i.e. if they are not logged into the AD domain, the source code stays encrypted and is useless. Can this be implemented, for example with EFS? What technology should be used?

    Read the article

  • Which memory-related Tomcat JVM startup parameters are worth tuning?

    - by knorv
    I'm trying to understand the fine art of tuning Tomcat memory settings. In this quest I have the following three questions: Which memory-related JVM startup parameters are worth setting when running Tomcat, and why? What are useful rules of thumb when fine-tuning the memory settings for a Tomcat installation? How do you monitor the memory consumption of your live Tomcat installation?
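    A minimal sketch of the usual starting point, assuming a Unix-style Tomcat installation (the values are illustrative, not recommendations):

        # $CATALINA_HOME/bin/setenv.sh -- sourced automatically by catalina.sh
        # -Xms/-Xmx bound the Java heap; setting them equal avoids resize pauses.
        # -XX:MaxPermSize sizes the (pre-Java-8) permanent generation, where
        # class metadata for Tomcat and its webapps lives.
        # -verbose:gc logs each collection, which helps with the monitoring question.
        CATALINA_OPTS="-Xms512m -Xmx512m -XX:MaxPermSize=128m -verbose:gc"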

    Read the article

  • Storage setup for large files

    - by Mecca
    I need to store over 200TB of data (all types, the biggest being video files) and be able to access it over a local network. The files will be accessed for editing or searches. I don't need versioning, but a setup that would keep me safe from hard-drive failures would be nice. Right now the content is spread across different hard drives, some external, some regular. I don't exclude the possibility of buying new/extra drives if necessary. If they are ever exposed to the web, it won't be to the public, just to a couple of people. I have no idea what to buy to make this happen. I see some NAS solutions on the internet, like this: http://www.bestbuy.com/site/a/2266043.p?id=1218317764591&skuId=2266043 but the storage is not enough, plus it doesn't seem to be scalable. What do you recommend? Thanks

    Read the article

  • Xcode 4.3: XIB files and localisation

    - by Fabrice MAUPIN
    I have a problem with XIB files and localizations (Xcode 4.3, Mac OS X 10.7.4). My application supports French and English localizations. For my test, I decided to change the language and region from "French" to "English" in System Preferences. When I launch my application, it always displays the old (French) XIB files and not the XIB files attached to my new localization! I followed all the recommendations I was able to find: I cleared all caches, cleaned up the project, and so on. The problem is still here and persists. Could it be that Xcode 4 has other files to delete somewhere else? Is it possible to use another means to test my new localization? If you have an idea... Thanks in advance. FM.

    Read the article

  • Windows 7 Locking Executable Files

    - by James Burgess
    Since I've been using Windows 7 RTM (as opposed to the Beta and RCs), I've been having a peculiar issue with executable files. I first noticed it whilst using Visual Studio: when building a project, it would often fail saying that the output file was locked. But the problem extends further: when I've executed an application, closed it (cleanly), and attempted to delete/move/rename/overwrite said file, Windows 7 tells me that the file is locked/access is denied. I've made use of software like LockHunter/Unlocker, but it is seemingly unable to remove these locks (most of the time, it shows no locks at all). After about 5-10 minutes the respective files are unlocked again, but needless to say this is a bit of a workflow-breaker (as it's not simply constrained to VS). I've done the usual virus/malware scans and turned up absolutely nothing. I've got no peculiar services running, and the problem was not present before I installed the Windows 7 RTM version. Any help is greatly appreciated.

    Read the article

  • Google Drive desktop client not updating existing files from other users

    - by cqm
    I've looked around and there doesn't really seem to be any troubleshooting information for the Google Drive desktop client; it all assumes you are using Google Docs on the web. Anyway, my team is trying to use Google Drive like Dropbox, where multiple people edit files shared amongst them through the desktop, such as images. Dropbox is really good at noticing when a checksum for a file has changed and syncing it. Google Drive's desktop client seems not to do this at all: it only syncs newly created files, gives no notification at all that a modified version exists, and never syncs the modification, even though going online and opening that file shows the modified version. Is there any way to fix this? The answer has nothing to do with proxy or firewall configurations. The team is using computers running OS X and Windows.

    Read the article

  • Grep all files in a directory and print matches with file name

    - by javanix
    I have a list of log files that I create as part of a video encoding script I wrote. I would like to search all of them and print out certain statistics from the encode: how fast they were encoded, what settings were used, etc. I can search for the average framerate in one file via this one-liner:

        cat ${filename} | grep average

    which outputs:

        work: average encoding speed for job is 23.211176 fps

    and search for the ratefactor:

        cat ${filename} | grep RF

    I would like to search all files in the directory and print out one, or preferably both, pieces of information along with the filename. Is there any way I can use find or grep to get this in a one-liner, or do I need to write a script? I would like output like this:

        /home/javanix/filename.log
        <RF line>
        <average line>

    I would like this to work on either FreeBSD 9 or Ubuntu 12.04.
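    A sketch of one way to get exactly that output without a separate script file, reusing the path and patterns above (both FreeBSD and GNU grep support -E):

        for f in /home/javanix/*.log; do
            echo "$f"                    # file name on its own line
            grep -E 'RF|average' "$f"    # then the matching lines
        done

    If filename:match pairs are acceptable instead, grep -H 'average' /home/javanix/*.log does it in a single command.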

    Read the article

  • Considering modified files for rebuild

    - by harik
    I have a C++ project. I am using Bakefile for the build process; Makefiles are generated for msvc, mingw, gnu etc. for cross-platform support. Now the problem is that if I change any .h files (which are included in other .cpp files), performing a rebuild does not recompile the .cpp files that include them, whereas any changed .cpp file does get recompiled. Based on the modified time-stamp, I expect any file included in the project to be considered for rebuild. Am I missing something which needs to be added as a tag in the .bkl files? Please help.
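    For comparison, a minimal hand-written GNU make sketch of automatic header dependency tracking with gcc, which is the effect the generated makefiles appear to be missing (file names are illustrative):

        CXXFLAGS += -MMD -MP        # have g++ emit a .d file listing each object's headers

        OBJS := main.o util.o

        app: $(OBJS)
        	$(CXX) -o $@ $(OBJS)

        %.o: %.cpp
        	$(CXX) $(CXXFLAGS) -c -o $@ $<

        -include $(OBJS:.o=.d)      # stale objects rebuild when an included header changes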

    Read the article

  • Vagrant is creating files and folders in my project

    - by SERPRO
    Recently I updated Vagrant (v1.6.3) and I noticed that the folder of my project contains some new folders and files like:

        d20140610-11944-1j6n1cz/
        d20140610-15421-1pkz3t8/
        vagrant20140610-11944-p76ezc
        vagrant20140610-11944-p76ezc2
        vagrant20140610-11944-yt3bhz
        vagrant20140610-11944-yt3bhz1
        vagrant20140610-15421-mfqrig
        vagrant20140610-15421-mfqrig1
        vagrant20140610-15421-y3r71a
        vagrant20140610-15421-y3r71a2
        vagrant20140610-15421-y3r71a2.lock

    Most of the files are empty; others have text like this:

        source "https://rubygems.org"
        source "http://gems.hashicorp.com"

        gem "vagrant", "= 1.6.3"

        group :plugins do
          gem "vagrant-login", nil, {}
          gem "vagrant-share", nil, {}
        end

    The directories have a file named config with this info:

        BUNDLE_PATH: "/home/user/.vagrant.d/gems"

    Is this some kind of debug option? How can I disable it?

    Read the article

  • Vantec NexStar NAS Encloser - Writing large files

    - by peter
    I have one of these Vantec NexStar LX (NST-475LX-BK) drive enclosures. It is a NAS device. When I write a file to the device using eSATA or an SMB share, I cannot write files over 4GB. I think this is because the drive is formatted with FAT32. But when I access the device using FTP it doesn't matter: I can write files of any size, e.g. I wrote one there last night which was 30GB. Does this make any sense? Why? I guess the most important thing for me is data integrity.

    Read the article

  • Dreamweaver Files uploaded to Win 2008 server cause login prompt

    - by Lil
    I have a customer who uses a four-year-old version of Dreamweaver to edit her web pages. My hosting reseller account is with a company that uses Windows Server 2008. Every time my customer edits a page and uploads it, I have to manually set the permissions for that file to be readable from the site's control panel. The customer is furious with me because her files cause the login prompt. I am able to upload files myself that remain readable on the site with both FileZilla and FrontPage. I am assuming that her Dreamweaver settings are the cause of the problem, but I don't have that program myself and don't know what to advise her. Any suggestions?

    Read the article

  • Tool for advanced ID3 tag handling and audio file ordering

    - by Juhele
    I have the following problem – some of my files do not have complete ID3 tags, and some have typos or small differences in spelling – so my portable player sees "Mr. President" as a different artist from "Mr President", and so on. I need a tool which can find similar tags and then allow me to correct the typos, or for example override "artist" in all selected files with manually entered text. The same goes for empty tag fields – sometimes the track name, album etc. are OK, but the artist is missing. Without touching the audio quality, of course (but this should be no problem, I think). I have already tried the tools in Winamp, Songbird and other players, and currently the most advanced free tool I have tried is TagScanner. However, it is not able to solve the problem with similar tags. Do you know of such a tool? Preferably free and for Windows, if possible. However, if you know a commercial app able to do this, please let me know.

    Read the article

  • Automatically copy files without overwriting, but creating numbered ones instead

    - by user1688322
    I need to copy files at a regular interval, e.g. once an hour, so I tried setting up an xcopy batch file saying it should copy the files to another folder. Now when it copies, it overwrites the existing files, which is not what it is supposed to do. When a file is copied, it should create a new file instead, named something like File.txt, File-COPY1.txt, File-COPY2.txt or similar. Is there any way to do that? Thanks in advance.
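    Plain xcopy always overwrites (or skips) an existing name, so the numbering has to be scripted around it. A minimal batch sketch for a single file, with illustrative paths – wrap it in a for loop to cover a whole folder:

        @echo off
        set "src=C:\source\File.txt"
        set "dstdir=D:\backup"
        rem the first copy keeps the plain name
        if not exist "%dstdir%\File.txt" copy "%src%" "%dstdir%\File.txt" & goto :eof
        rem otherwise find the next free File-COPYn.txt
        set /a n=1
        :next
        if exist "%dstdir%\File-COPY%n%.txt" set /a n+=1 & goto :next
        copy "%src%" "%dstdir%\File-COPY%n%.txt"

    Running it once an hour is then a job for Task Scheduler rather than the script itself.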

    Read the article

  • Memory efficient collection class

    - by Joe
    I'm building an array of dictionaries (called keys) in my iPhone application to hold the section names and row counts for a table view. The code looks like this:

        [self.results removeAllObjects];
        [self.keys removeAllObjects];
        NSUInteger i, j = 0;   // note: this initializes j only; i starts out undefined
        NSString *key = [NSString string];
        NSString *prevKey = [NSString string];
        if ([self.allResults count] > 0) {
            prevKey = [NSString stringWithString:
                [[[self.allResults objectAtIndex:0] valueForKey:@"name"] substringToIndex:1]];
            for (NSDictionary *theDict in self.allResults) {
                key = [NSString stringWithString:
                    [[theDict valueForKey:@"name"] substringToIndex:1]];
                if (![key isEqualToString:prevKey]) {
                    // close out the previous section
                    NSDictionary *newDictionary = [NSDictionary dictionaryWithObjectsAndKeys:
                        [NSNumber numberWithInt:i], @"count",
                        prevKey, @"section",
                        [NSNumber numberWithInt:j], @"total", nil];
                    [self.keys addObject:newDictionary];
                    prevKey = [NSString stringWithString:key];
                    i = 1;
                } else {
                    i++;
                }
                j++;
            }
            // close out the final section
            NSDictionary *newDictionary = [NSDictionary dictionaryWithObjectsAndKeys:
                [NSNumber numberWithInt:i], @"count",
                prevKey, @"section",
                [NSNumber numberWithInt:j], @"total", nil];
            [self.keys addObject:newDictionary];
        }
        [self.tableview reloadData];

    The code works fine the first time through, but I sometimes have to rebuild the entire table, so I re-run this code. That works fine on the simulator, but on my device the program bombs when I execute the reloadData line:

        malloc: *** mmap(size=3772944384) failed (error code=12)
        *** error: can't allocate region
        *** set a breakpoint in malloc_error_break to debug
        malloc: *** mmap(size=3772944384) failed (error code=12)
        *** error: can't allocate region
        *** set a breakpoint in malloc_error_break to debug
        Program received signal: "EXC_BAD_ACCESS".

    If I remove the reloadData line, the code works on the device. I'm wondering if this has something to do with the way I've built the keys array (i.e. using autoreleased strings and dictionaries).

    Read the article

  • Visual Studio 2008: version info for all files updated from one place

    - by ravenspoint
    The version information displayed when the mouse cursor hovers over a file in Windows Explorer is set, for files built by Visual Studio, in the VERSION resource. I would like to set the version in one place for all the files built by a solution, preferably when I change the version in the installer properties. Is there a way to do this? The motivation is that if the version is not updated for a file, the installer will leave previous versions of files instead of replacing them with new ones. This happens even when the 'RemovePreviousVersions' property is set. To save the tedious and error-prone task of updating the version in every file built and installed, I currently remove the version resource from all files – which is not elegant.
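    One widely used way to get this is a single header that every project's .rc file includes, since the resource compiler runs the C preprocessor. A sketch with hypothetical names and numbers:

        /* version.h -- the one place the version is ever edited */
        #define VER_FILEVERSION      1,2,34,0
        #define VER_FILEVERSION_STR  "1.2.34.0"

        /* in each project's .rc file: */
        #include "version.h"
        VS_VERSION_INFO VERSIONINFO
         FILEVERSION    VER_FILEVERSION
         PRODUCTVERSION VER_FILEVERSION
        BEGIN
          /* the usual StringFileInfo/VarFileInfo blocks, using VER_FILEVERSION_STR */
        END

    Bumping the numbers in version.h then changes the VERSION resource of every binary on the next build, which should be enough for RemovePreviousVersions to replace them.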

    Read the article

  • Files slow to save sometimes after Ubuntu upgrade

    - by Matchu
    I haven't quite been able to track down why this happens sometimes in Ubuntu 10.04 and not other times. I'll go into gedit or OpenOffice.org and try to save files, and, during some sessions, it will take up to 10 seconds to save the file, sometimes causing the program to become unresponsive. But during these same sessions, the files sometimes save instantly. This didn't start happening until after the 10.04 (Lucid) upgrade. I suspect that something is reading all the changes I make, or that there's some other big file action going on, or something like that. I disabled Tracker a while back, before the upgrade, and don't see it under the settings - could it be back under a different name under Lucid? You probably don't already know the answer, but how can I go about finding the cause of this problem?

    Read the article

  • Virtual hosting in Varnish with individual vcl files for configuration

    - by Michael Sørensen
    I wish to use Varnish in front of an Apache and a Tomcat on the same server. Depending on the IP requested, it goes to a different backend; this works. Now, for most of the sites the default Varnish logic will work just fine, but for some specific sites I wish to use custom VCL code. I can test for the host name and include config files for the specific domains, but this only works inside the individual subroutines (vcl_recv etc.). Is there a way to include a complete set of instructions, in one file, per domain, without having to manage separate files for subdomain_recv, subdomain_fetch etc.? And preferably without running separate instances of Varnish. When I try to include a file at the "root level" of default.vcl, I get a compilation error. Best regards, Michael
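    VCL has no per-host include of a whole configuration, but it does allow user-defined subroutines, so a common workaround is to keep each domain's logic in one file of custom subs and dispatch to them from default.vcl. A sketch with hypothetical names (Varnish 3-era syntax):

        # example_com.vcl -- everything for this one domain lives in this file
        sub example_com_recv {
            # domain-specific vcl_recv logic here
        }
        sub example_com_fetch {
            # domain-specific vcl_fetch logic here
        }

        # default.vcl
        include "example_com.vcl";

        sub vcl_recv {
            if (req.http.host ~ "(^|\.)example\.com$") {
                call example_com_recv;
            }
        }
        sub vcl_fetch {
            if (req.http.host ~ "(^|\.)example\.com$") {
                call example_com_fetch;
            }
        }

    The include still sits at the root level, but it contains only sub definitions, which is why it compiles where a file of bare statements does not.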

    Read the article

  • Best Place to Store Config Files and Log Files on Windows for My Program?

    - by Dave
    I need to store log files and config files for my application. Where is the best place to store them? Right now I'm just using the current directory, which ends up putting them in the Program Files directory where my program lives. The log files will probably be accessed by the user somewhat regularly, so %APPDATA% seems a little hard to get to. Is a directory under %USERPROFILE%\My Documents best? It needs to work on all versions of Windows from 2000 onward.
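    A sketch of the conventional lookup for a native program, assuming C – the SHGetFolderPath/CSIDL API is the one that spans Windows 2000 through 7 (the "MyApp" subfolder name is illustrative):

        #include <windows.h>
        #include <shlobj.h>
        #include <string.h>
        #include <stdio.h>

        int main(void) {
            char dir[MAX_PATH];
            /* CSIDL_APPDATA is per-user and roams with the profile (config);
               CSIDL_COMMON_APPDATA is per-machine (often better for logs). */
            if (SUCCEEDED(SHGetFolderPathA(NULL, CSIDL_APPDATA, NULL,
                                           SHGFP_TYPE_CURRENT, dir))) {
                strcat(dir, "\\MyApp");      /* hypothetical application name */
                CreateDirectoryA(dir, NULL); /* ERROR_ALREADY_EXISTS is harmless */
                printf("config/log directory: %s\n", dir);
            }
            return 0;
        }

    Keeping logs in one of these locations, with a shortcut or menu item in the app to open the folder, avoids cluttering My Documents with files the user didn't create.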

    Read the article

  • How to delete files quicker than rm -rf?

    - by Byakugan
    Is there any way to delete folders/files quicker than with the command rm -rf? It seems my disk is filled with billions of files (PHP5 sessions) which were not deleted by cron, so I need to delete them manually, but it takes hours and still isn't reducing the amount. Thank you. My command:

        rm -rf /var/lib/php5/*

    I also tried these commands:

        find /var/lib/php5 -name "sess_*" -exec rm {} \;

    and

        perl -e 'chdir "/var/lib/php5/" or die; opendir D, "."; while ($n = readdir D) { unlink $n }'
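    Two variants that are usually faster than spawning one rm per file, assuming GNU findutils or FreeBSD find (both support these): -delete unlinks from inside find itself, and -exec ... {} + batches many paths into each rm invocation:

        # unlink inside find -- no child processes at all
        find /var/lib/php5 -type f -name 'sess_*' -delete

        # or batch thousands of names into each rm call instead of one each
        find /var/lib/php5 -type f -name 'sess_*' -exec rm -f {} +

    Much of the time in the original commands goes to the shell expanding /var/lib/php5/* into millions of arguments, or to fork/exec overhead from one rm per file, rather than to the unlinking itself.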

    Read the article

  • Search Files (Preferably with index) on Windows 2000 Server

    - by ThinkBohemian
    I have many files on a Windows 2000 Server machine that is set up to act as a networked disk drive. Is there any way I can index the files and make that index available as a search to more people than just me? Bonus points if the index can look inside documents such as readme.txt. If there is no easy way to do this globally (for all users), is there a way I could generate and store an index locally on my computer? If this is the wrong place to ask this question, any advice on a community more suited?

    Read the article
