Search Results

Search found 62606 results on 2505 pages for 'sql files'.


  • Considering modified files for rebuild

    - by harik
    I have a C++ project and I am using Bakefile for the build process; Makefiles are generated for msvc, mingw, gnu, etc. for cross-platform support. The problem is that if I change any .h file (one that is included by other .cpp files), performing a rebuild does not recompile the files affected by the change, but changing any .cpp file does get it recompiled. I expect any file included in the project to be considered for rebuild based on its modification time-stamp. Am I missing something that needs to be added as a tag in the .bkl files? Please help.
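
    For comparison, this is how hand-written GNU Makefiles usually get that behaviour: compiler-generated dependency files. A minimal sketch, outside Bakefile itself (file names hypothetical; recipe lines must be indented with real tabs):

        # let the compiler emit .d files so header edits recompile dependents
        SRCS := $(wildcard *.cpp)
        OBJS := $(SRCS:.cpp=.o)

        app: $(OBJS)
        	$(CXX) $(OBJS) -o $@

        %.o: %.cpp
        	$(CXX) -MMD -MP -c $< -o $@   # -MMD writes foo.d beside foo.o

        -include $(OBJS:.o=.d)   # pull the generated header dependencies in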


  • Windows 7 Locking Executable Files

    - by James Burgess
    Since I've been using the Windows 7 RTM (as opposed to the Beta and RCs), I've been having a peculiar issue with executable files. I first noticed it in Visual Studio: when building a project, the build would often fail saying that the output file was locked. But the problem extends beyond that. When I've executed an application, closed it (cleanly), and attempted to delete/move/rename/overwrite the file, Windows 7 tells me that the file is locked and access is denied. I've tried software like LockHunter/Unlocker, but it is seemingly unable to remove these locks (most of the time it shows no locks at all). After about 5-10 minutes the respective files are unlocked again, but needless to say this is a bit of a workflow-breaker (and it's not confined to VS). I've done the usual virus/malware scans and turned up absolutely nothing. I've got no peculiar services running, and the problem was not present before I installed the Windows 7 RTM. Any help is greatly appreciated.
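
    For anyone reproducing this, one way to see which process (if any) still holds the handle is Sysinternals handle.exe; a sketch, assuming it is on the PATH and using a hypothetical file path:

        REM list every process with an open handle to the locked file
        handle.exe "C:\build\Debug\MyApp.exe"

        REM or search by name fragment across all processes
        handle.exe MyApp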


  • Vantec NexStar NAS Enclosure - Writing large files

    - by peter
    I have one of these 'Vantec NexStar LX - NST-475LX-BK' drive enclosures; it is a NAS device. When I write a file to the device over eSATA or an SMB share, I cannot write files over 4GB. I think this is because the drive is formatted with FAT32. But when I access the device over FTP it doesn't matter: I can write files of any size, e.g. I wrote a 30GB file there last night. Does this make any sense? Why? I guess the most important thing for me is data integrity.


  • MySQL ADO.NET Connector & MSSQL Integration Services

    - by user1114330
    Here I am, day three... attempting to sync a data view on a Windows Vista box (64-bit) running MSSQL 2012 and Visual Studio 2010. Sanity is slipping and hunger for progress fills my attention. I went through hell trying to get the MySQL ODBC drivers to do the job, but to no avail... everyone seems to be lost, and all the threads I can find offer solutions that do not work for me. The problem: System DSNs not being seen by SSIS ("SSIS DSN Not Showing as ODBC Data Source"). So I decided to try the ADO.NET connector... and to my surprise it actually appears in the data-source selection list in SSIS. So I take off running: I create a Data Flow Task, create an ADO.NET Source (a local MSSQL DB)... all is good as usual. Then I move swiftly to creating an ADO.NET Destination and enter my credentials... wow, I am finally selecting a database on my Linux server! I am happy, thinking I have finally figured out a way to get the job done. Then I move to mappings... nope, something is wrong... I am getting an error that hurts my eyes: "Pipeline component has returned HRESULT error code 0xC0208457 from a method call. Error at Data Flow Task [ADO NET Destination [81]]: Failed to get properties of external columns. The table name you entered may not exist or you do not have SELECT permission on the table object, and an alternative attempt to get column properties through connection has failed. Detailed error messages are: You have an error in your SQL syntax; check the manual that corresponds to your MySQL server version for the right syntax to use near '"database".tablename' at line 1. The descriptor files on path C:\Program Files (x86)\Microsoft SQL Server\110\DTS\ProviderDescriptors\ do not contain schema information for connection of type MySQL.Data.MySqlClient.MySqlConnection." So it looks like it can't get the schema information, and therefore I cannot map the tables properly. Any ideas would be ultra helpful... thanks in advance to all!


  • Grep all files in a directory and print matches with file name

    - by javanix
    I have a list of log files that I create as part of a video-encoding script I wrote. I would like to search all of them and print out certain statistics from the encode: how fast they were encoded, what settings were used, etc. I can find the average framerate in one file via this one-liner:

        cat ${filename} | grep average

    which outputs:

        work: average encoding speed for job is 23.211176 fps

    and find the ratefactor:

        cat ${filename} | grep RF

    I would like to search all files in the directory and print one, or preferably both, pieces of information along with the filename. Is there any way I can use find or grep to get this in a one-liner, or do I need to write a script? I would like output like this:

        /home/javanix/filename.log
        <RF line>
        <average line>

    I would like this to work on either FreeBSD 9 or Ubuntu 12.04.
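
    A sketch of one grep-only way, assuming the logs share a directory and a .log suffix (both hypothetical): -H prints the matching file's name, and -E matches both patterns in one pass:

        grep -H -E 'RF|average' /home/javanix/logs/*.log

        # or, grouped per file with the name on its own line:
        for f in /home/javanix/logs/*.log; do
            echo "$f"
            grep -E 'RF|average' "$f"
        done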


  • Files slow to save sometimes after Ubuntu upgrade

    - by Matchu
    I haven't quite been able to track down why this happens sometimes in Ubuntu 10.04 and not other times. I'll go into gedit or OpenOffice.org and try to save files, and, during some sessions, it will take up to 10 seconds to save the file, sometimes causing the program to become unresponsive. But during these same sessions, the files sometimes save instantly. This didn't start happening until after the 10.04 (Lucid) upgrade. I suspect that something is reading all the changes I make, or that there's some other big file action going on, or something like that. I disabled Tracker a while back, before the upgrade, and don't see it under the settings - could it be back under a different name under Lucid? You probably don't already know the answer, but how can I go about finding the cause of this problem?
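
    One way to catch a background reader or indexer in the act during a slow save; a sketch assuming the iotop package is installed:

        # show only processes actually doing I/O, live
        sudo iotop -o

        # or capture a few batch-mode samples while reproducing the stall
        sudo iotop -o -b -n 10 > io.log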


  • Automatically copy files without overwriting, but creating numbered ones instead

    - by user1688322
    I need to copy files at a regular interval, e.g. once an hour, so I set up an xcopy batch file that copies the files to another folder. The problem is that the copy overwrites existing files, which is not what it is supposed to do. When a file already exists at the destination, a new file should be created instead, named something like File.txt, File-COPY1.txt, File-COPY2.txt, and so on. Any way to do that? Thanks in advance.
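
    xcopy itself has no rename-on-collision mode, but the behaviour is easy to script. A sketch in PowerShell (3.0 or later for -File; the source/destination paths and the -COPYn naming are hypothetical):

        $src = 'C:\Source'
        $dst = 'C:\Backup'
        Get-ChildItem -Path $src -File | ForEach-Object {
            $target = Join-Path $dst $_.Name
            $i = 1
            # bump the suffix until we find a name that is free
            while (Test-Path $target) {
                $target = Join-Path $dst ('{0}-COPY{1}{2}' -f $_.BaseName, $i, $_.Extension)
                $i++
            }
            Copy-Item -Path $_.FullName -Destination $target
        }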


  • visual-studio-2008 versioninfo for all files updated from one place

    - by ravenspoint
    The version information displayed when the mouse cursor hovers over a file in Windows Explorer is set, for a file built by Visual Studio, in the VERSION resource. I would like to set the version in one place for all the files built by a solution, preferably when I change the version in the install properties. Is there a way to do this? The motivation is that if the version is not updated for a file, the installer will leave the previous version of that file in place instead of replacing it with the new one. This happens even when the 'RemovePreviousVersions' property is set. To avoid the tedious and error-prone task of updating the version in every file built and installed, I currently remove the version resource from all files, which is not elegant.
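
    One common approach, sketched here rather than taken from the question: keep the numbers in a single header that every project's .rc file includes, so a one-line edit re-versions every binary in the solution (file name hypothetical):

        // version.h -- #include this from every project's .rc file
        #define VER_MAJOR 1
        #define VER_MINOR 2
        #define VER_PATCH 3
        #define VER_BUILD 4

        #define STR_HELPER(x) #x
        #define STR(x) STR_HELPER(x)

        // numeric form for the FILEVERSION/PRODUCTVERSION statements
        #define VER_FILEVERSION      VER_MAJOR,VER_MINOR,VER_PATCH,VER_BUILD
        // string form for the "FileVersion" value block
        #define VER_FILEVERSION_STR  STR(VER_MAJOR) "." STR(VER_MINOR) "." \
                                     STR(VER_PATCH) "." STR(VER_BUILD)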


  • How to delete files quicker than rm -rf?

    - by Byakugan
    Is there any way to delete folders/files quicker than with rm -rf? It seems my disc is filled with billions of files (PHP 5 sessions) that were not deleted by cron, so I need to delete them manually, but it takes hours and still isn't reducing the amount. Thank you. My command:

        rm -rf /var/lib/php5/*

    I also tried these commands:

        find /var/lib/php5 -name "sess_*" -exec rm {} \;

    and

        perl -e 'chdir "/var/lib/php5/" or die; opendir D, "."; while ($n = readdir D) { unlink $n }'
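
    Two variants that usually beat forking one rm process per file; a sketch against the same path:

        # -delete removes matches inside find itself, no rm processes spawned
        find /var/lib/php5 -name 'sess_*' -type f -delete

        # the rsync trick: mirror an empty directory over the full one
        mkdir /tmp/empty
        rsync -a --delete /tmp/empty/ /var/lib/php5/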


  • Virtual hosting in Varnish with individual vcl files for configuration

    - by Michael Sørensen
    I wish to use Varnish in front of an Apache and a Tomcat on the same server. Depending on the IP requested, requests go to a different backend; this works. Now, for most of the sites the default Varnish logic will work just fine. However, for some specific sites I wish to use custom VCL code. I can test the host name and include config files for the specific domains, but this only works inside the individual subroutines (vcl_recv etc.). Is there a way to include a complete set of instructions, in one file per domain, without having to manage separate files for subdomain_recv, subdomain_fetch, etc.? And preferably without running separate instances of Varnish. When I try to include a file at the "root level" of default.vcl, I get a compilation error. Best regards, Michael
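
    A sketch of one layout that keeps a domain's logic in a single file: the per-domain file defines complete named subroutines (top-level includes of whole subroutine definitions do compile), and the shared hooks dispatch to them. Domain and file names are hypothetical:

        # example-com.vcl -- everything for this one domain
        sub example_com_recv {
            # custom request handling for example.com
        }
        sub example_com_fetch {
            # custom backend-response handling for example.com
        }

        # default.vcl
        include "example-com.vcl";

        sub vcl_recv {
            if (req.http.host ~ "(^|\.)example\.com$") {
                call example_com_recv;
            }
        }
        sub vcl_fetch {
            if (req.http.host ~ "(^|\.)example\.com$") {
                call example_com_fetch;
            }
        }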


  • Missing drive space in Server 2003

    - by Tim Brigham
    I have two drives used for SQL backups which for the last week have been acting strange: the free space indicated by Windows is far off from what WinDirStat etc. indicate. There should only be about 60 GB of drive space used, but about 160 GB is, which would match the utilization if the last two backup files were still residing on disk. SQL Server is 2000, the OS Server 2003 x64, running on a VMware 5.0 cluster. OSSEC and McAfee show this system clean. My current plan is to temporarily attach one of these drives to another VM for analysis. Is there anything more I should be looking at? There were a lot of pages on the net when I was looking for documentation on this issue, but I haven't found this case described. EDIT: Unfortunately even a full reboot did not clear this behavior. I also used Process Explorer to look for open file handles. No dice.
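
    One thing worth ruling out before moving the drive: Volume Shadow Copies can hold the blocks of deleted backup files while staying invisible to WinDirStat. A sketch of the check:

        REM list existing shadow copies and the space reserved/used per volume
        vssadmin list shadows
        vssadmin list shadowstorage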


  • Dreamweaver Files uploaded to Win 2008 server cause login prompt

    - by Lil
    I have a customer who uses a 4-year-old version of Dreamweaver to edit her webpages. My hosting reseller account is with a company that uses Windows Server 2008. Every time my customer edits a page and uploads it, I have to set the permissions on that file to readable, manually, from the site's control panel. The customer is furious with me because her files cause the login prompt. Files I upload myself, with either FileZilla or FrontPage, remain readable on the site. I am assuming that her Dreamweaver settings are the cause of the problem, but I don't have that program myself and don't know what to advise her. Any suggestions?
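
    As a stop-gap, the permission fix-up can at least be scripted instead of clicked through the control panel; a sketch assuming shell access, a hypothetical site root, and that anonymous access runs as IUSR:

        REM re-grant inheritable read on everything under the web root
        icacls "C:\inetpub\wwwroot\customersite" /grant "IUSR:(OI)(CI)R" /T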


  • tool for advanced ID3 tags handling and audio files ordering

    - by Juhele
    I have the following problem: some of my files do not have complete ID3 tags, and some have typos or small differences in spelling, so my portable player sees "Mr. President" as a different artist from "Mr President", and so on. I need a tool that can find similar tags and then let me correct the typos or, for example, overwrite "artist" in all selected files with manually entered text. The same goes for empty tag fields: sometimes the track name, album etc. are OK but the artist is missing. Without touching the audio quality, of course (but that should be no problem, I think). I have already tried the tools in Winamp, Songbird and other players; the most advanced free tool I have tried so far is TagScanner, but it is not able to solve the problem with similar tags. Do you know such a tool? Preferably free and for Windows, if possible. However, if you know a commercial app that can do this, please let me know.


  • Best Place to Store Config Files and Log Files on Windows for My Program?

    - by Dave
    I need to store log files and config files for my application. Where is the best place to store them? Right now I'm just using the current directory, which ends up putting them in the Program Files directory where my program lives. The log files will probably be accessed by the user somewhat regularly, so %APPDATA% seems a little hard to get to. Is a directory under %USERPROFILE%\My Documents the best? It needs to work for all versions of Windows from 2000 forward.
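
    A sketch of resolving the usual candidates through the .NET special-folder API, which maps correctly on everything from Windows 2000 up; the company/app names are hypothetical:

        using System;
        using System.IO;

        static class AppPaths
        {
            // per-user and reasonably discoverable: good for config
            public static readonly string ConfigDir = Path.Combine(
                Environment.GetFolderPath(Environment.SpecialFolder.ApplicationData),
                @"Contoso\MyApp");

            // machine-wide, shared by all users: good for logs
            public static readonly string LogDir = Path.Combine(
                Environment.GetFolderPath(Environment.SpecialFolder.CommonApplicationData),
                @"Contoso\MyApp\Logs");
        }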


  • Full-text search locks up database - error 0x8001010e

    - by Stewart May
    Hi. We have a full-text catalog that is populated via a job every 15 minutes, like so:

        ALTER FULLTEXT INDEX ON [dbo].[WorkItemLongTexts] START INCREMENTAL POPULATION

    We have encountered a problem where the database containing this catalog locks up. There are a couple of scenarios: we either see the job execute and the process hang with a wait type of UNKNOWN TOKEN, or we see another process hang with a wait type of MSSEARCH. Once this happens, the job continues to run but informs us that the request to start a full-text index population was ignored because a population is already active. Looking in the full-text log files, we see the following error each time these problems occur:

        2010-04-21 08:15:00.76 spid21s The full-text catalog health monitor reported a failure for full-text catalog "XXXFullTextCatalog" (5) in database "YYY" (14). Reason code: 0. Error: 0x8001010e (The application called an interface that was marshalled for a different thread.). The system will restart any in-progress population from the previous checkpoint. If this message occurs frequently, consult SQL Server Books Online for troubleshooting assistance. This is an informational message only. No user action is required.

    The only solution is to restart the SQL Server service and then the full-text service. This is now occurring on a daily basis, so any help would be appreciated.


  • Looping through a directory on the web and displaying its contents (files and other directories) via

    - by al jaffe
    In the same vein as http://stackoverflow.com/questions/2593399/process-a-set-of-files-from-a-source-directory-to-a-destination-directory-in-pyth, I'm wondering if it is possible to create a function that, when given a web directory, will list out the files in said directory. Something like:

        files = []
        for file in urllib.listdir(dir):
            if file.isdir:
                # handle this as directory
            else:
                # handle as file

    I assume I would need to use the urllib library, but there doesn't seem to be an easy way of doing this, that I've seen at least.
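
    HTTP has no directory-listing operation, so nothing like urllib.listdir exists; this only works when the server exposes an auto-generated index page. A sketch that scrapes the links out of such a page (URL hypothetical; modern urllib.request shown):

        import re
        import urllib.request

        def listdir(url):
            """List entries from an Apache-style auto-index page."""
            html = urllib.request.urlopen(url).read().decode("utf-8", "replace")
            # crude: collect relative href targets, skipping sort/parent links
            return [h for h in re.findall(r'href="([^"?]+)"', html)
                    if not h.startswith(("/", "../"))]

        for name in listdir("http://example.com/files/"):
            if name.endswith("/"):
                print("directory:", name)
            else:
                print("file:", name)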


  • Search Files (Preferably with index) on Windows 2000 Server

    - by ThinkBohemian
    I have many files on a Windows Server 2000 machine that is set up to act as a networked disk drive. Is there any way I can index the files and make that index available as a search to more people than just me? Bonus if the index can look inside documents such as readme.txt. If there is no easy way to do this globally (for all users), is there a way I could generate and store an index locally on my computer? If this is the wrong place to ask this question, any advice on a community more suited?


  • Haskell lazy I/O and closing files

    - by Jesse
    I've written a small Haskell program to print the MD5 checksums of all files in the current directory (searched recursively). Basically a Haskell version of md5deep. All is fine and dandy except if the current directory has a very large number of files, in which case I get an error like:

        <program>: <currentFile>: openBinaryFile: resource exhausted (Too many open files)

    It seems Haskell's laziness is causing it not to close files, even after the corresponding line of output has been completed. The relevant code is below; the function of interest is getList.

        import qualified Data.ByteString.Lazy as BS
        -- imports implied by the snippet; hash is assumed to be the MD5
        -- function from the Crypto package's Data.Digest.MD5
        import Control.Monad (liftM)
        import Data.Word (Word8)
        import Text.Printf (printf)
        import Data.Digest.MD5 (hash)

        main :: IO ()
        main = putStr . unlines =<< getList "."

        getList :: FilePath -> IO [String]
        getList p =
            let getFileLine path = liftM (\c -> (hex $ hash $ BS.unpack c) ++ " " ++ path)
                                         (BS.readFile path)
            in mapM getFileLine =<< getRecursiveContents p

        hex :: [Word8] -> String
        hex = concatMap (\x -> printf "%0.2x" (toInteger x))

        getRecursiveContents :: FilePath -> IO [FilePath]
        -- ^ Just gets the paths to all the files in the given directory.

    Are there any ideas on how I could solve this problem? The entire program is available here: http://haskell.pastebin.com/PAZm0Dcb
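
    For reference, the usual shape of the fix: read each file strictly so its handle is closed before the next file is opened. A sketch reusing the question's helpers, with strict ByteStrings swapped in:

        import qualified Data.ByteString as SBS  -- strict, not lazy

        getFileLine :: FilePath -> IO String
        getFileLine path = do
            c <- SBS.readFile path   -- whole file read here; handle closed
            return (hex (hash (SBS.unpack c)) ++ " " ++ path)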


  • Ignoring generated files when using "Treat warnings as errors"

    - by krystan honour
    We have started a new project, but we also have this problem in an existing one. When we compile at warning level 4, we also want to switch on 'Treat all warnings as errors'. We are unable to do this at the moment because generated files (in particular Reference.cs files) are missing things like XML comments, and this generates a warning. We do not want to suppress the XML-comment warnings across all files, just in specific kinds of files (namely generated code). I have thought of a way this could be achieved but am not sure whether it is the best approach or indeed where to start :) My thinking is that we need to do something with T4 templates so that the generated code gets its XML documentation filled in. Does anyone have any ideas? Currently I'm at well over 2k warnings (it's a big project) :(
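
    A narrower alternative to filling in the comments, sketched here: have the generator (or a post-generation step) wrap each generated file in a pragma that disables only CS1591, so treat-warnings-as-errors stays on everywhere else:

        // emitted at the top of each generated file, e.g. Reference.cs
        #pragma warning disable 1591 // missing XML comment on public member

        // ... generated types ...

        #pragma warning restore 1591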


  • Rename files and directories using substitution and variables

    - by rednectar
    I have found several similar questions with solutions, except they don't involve variables. I have a particular pattern in a tree of files and directories: the word TEMPLATE. I want a script to rename all of the files and directories by replacing the word TEMPLATE with some other name contained in the variable ${newName}. If I knew that the value of ${newName} was, say, "Fred lives here", then this command would do the job:

        find . -name '*TEMPLATE*' -exec bash -c 'mv "$0" "${0/TEMPLATE/Fred lives here}"' {} \;

    However, if my script is:

        newName="Fred lives here"
        find . -name '*TEMPLATE*' -exec bash -c 'mv "$0" "${0/TEMPLATE/${newName}}"' {} \;

    then the word TEMPLATE is replaced by null rather than "Fred lives here". I need the "" around $0 because there are spaces in the path name, so I can't do something like:

        find . -name '*TEMPLATE*' -exec bash -c 'mv "$0" "${0/TEMPLATE/"${newName}"}"' {} \;

    Can anyone help me get this script to work, so that all files and directories containing the word TEMPLATE have it replaced by the value of ${newName}? E.g., if newName="A different name" and I had a directory /foo/bar/some TEMPLATE directory/with files, then the directory would be renamed to /foo/bar/some A different name directory/with files, and a file called some TEMPLATE file would be renamed to some A different name file.
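
    A sketch of the two usual fixes. The single quotes are right (the child bash must do the expansion), but the child never sees ${newName} unless it is exported or passed in as an argument:

        newName="Fred lives here"

        # 1) export it so the child bash inherits it from the environment
        export newName
        find . -depth -name '*TEMPLATE*' \
            -exec bash -c 'mv "$0" "${0/TEMPLATE/$newName}"' {} \;

        # 2) or pass it as a positional parameter, and only rewrite the
        #    final path component; with -depth, children are renamed
        #    before their parent directory, so paths stay valid
        find . -depth -name '*TEMPLATE*' -exec bash -c '
            dir=${1%/*}; base=${1##*/}
            mv -- "$1" "$dir/${base//TEMPLATE/$2}"
        ' _ {} "$newName" \;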


  • Where to store site settings: DB? XML? CONFIG? CLASS FILES?

    - by Emin
    I am re-building a news portal which already has a large number of visits every day. One of the major concerns when re-building this site was to maximize performance and speed; we have done many things, from caching to all sorts of other measures, to ensure that. Now, towards the end of the project, I have a dilemma: where should I store my site settings so that performance is least affected? The site settings include things such as: Domain, DefaultImgPath, Google Analytics code, default emails of editors, as well as more dynamic design/display settings such as the background color of specific DIVs and the default color for links. As far as I know, I have four choices for storing all this information.

    Database: storing general settings in the DB and caching them may be a solution; however, I want to limit database access to only the necessary and essential functions of the project, which generally are insert/update/delete of news items, author articles, etc.
    XML: I could store these settings in an XML file, but I have not done this sort of thing before, so I don't know what kind of problems (if any) I might face in the future.
    CONFIG: I could store these settings in web.config.
    CLASS FILE: I could hard-code all these settings in a SiteSettings class, but since the site admin himself will be able to edit these settings, that may not be the best solution.

    Currently I am closest to choosing web.config, but letting people fiddle with it too often is something I do not want. E.g., if somehow I miss a validation for something and web.config breaks, the whole site goes down. My concern, basically, is that I cannot foresee all the possible consequences of the methods above (or is there any other?), so I was hoping to put this question to more experienced people who can hopefully help me decide.
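
    One detail worth weighing for the CONFIG option, sketched with hypothetical file and key names: appSettings can be redirected to an external file, and edits to that external file do not trigger the app-domain restart that editing web.config itself does. Values are then read the usual way via ConfigurationManager.AppSettings:

        <!-- web.config -->
        <configuration>
          <appSettings file="SiteSettings.config" />
        </configuration>

        <!-- SiteSettings.config: the part the site admin edits -->
        <appSettings>
          <add key="DefaultImgPath" value="/img/" />
          <add key="GoogleAnalyticsCode" value="UA-0000000-1" />
        </appSettings>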


  • How bad is opening and closing a SQL connection for several times? What is the exact effect?

    - by Eren
    For example, I need to fill lots of DataTables with SqlDataAdapter's Fill() method:

        DataAdapter1.Fill(DataTable1);
        DataAdapter2.Fill(DataTable2);
        DataAdapter3.Fill(DataTable3);
        DataAdapter4.Fill(DataTable4);
        DataAdapter5.Fill(DataTable5);
        ...

    Even though all the data-adapter objects use the same SqlConnection, each Fill call will open and close the connection unless the connection state is already open before the method call. What I want to know is how unnecessarily opening and closing SqlConnections affects the performance of the application. How far does it need to scale before the bad effects show (hundreds of thousands of concurrent users?)? For a mid-size website (50,000 users daily), is it worth bothering to find all the Fill() calls, keep them together in the code, and open the connection before any Fill() call and close it afterwards?
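
    Worth factoring in: ADO.NET pools connections by default, so Close/Dispose returns the underlying connection to the pool, and a subsequent Open usually reuses it rather than performing a fresh handshake. A sketch of the explicit-open pattern the question describes, assuming every adapter was built over the same shared SqlConnection:

        using (SqlConnection conn = sharedConnection) // the adapters' connection
        {
            conn.Open();                    // one pooled open
            DataAdapter1.Fill(DataTable1);  // Fill sees an open connection
            DataAdapter2.Fill(DataTable2);  // and leaves it open
            DataAdapter3.Fill(DataTable3);
        }                                   // closed once, back to the pool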


  • Cannot chown my own files from NFS

    - by valpa
    We have an NFS server providing home directories for many accounts, which come from a NIS server. I have accounts A and B. In /home/A, I try to copy with "cp -a /home/B/somedir ~/", and then find that in /home/A/somedir all files are owned by user A. If I then do "chown -R B:B somedir", I get an "Operation not permitted" error. I am user A; "cp -a" didn't preserve the original owner (B), and now I cannot chown my own files. Any suggestions? I worked around it with "chmod 777 /home/A", "su - B", "cp -a somedir /home/A/", then back as A, "chmod 755 /home/A". But it is not a good solution.

