Search Results

Search found 39784 results on 1592 pages for 'ignore files'.

Page 174/1592 | < Previous Page | 170 171 172 173 174 175 176 177 178 179 180 181  | Next Page >

  • Parsing log files in a folder in ColdFusion

    - by Simon Guo
    The problem: there is a folder ./log/ containing files like jan2010.xml, feb2010.xml, mar2010.xml, jan2009.xml, feb2009.xml, mar2009.xml, and so on. Each XML file looks like:

        <root>
            <record name="bob" spend="20"></record>
            ...(more records)
        </root>

    I want to write a piece of ColdFusion code (log.cfm) that simply parses those XML files. For the front end I would let the user choose a year and then click a submit button. All the content for that year would then be shown in a separate table per month, where each table shows the total money spent by each person, like:

        person   cost
        bob      200
        mike     300
        Total    500

    Thanks.
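
    A minimal sketch of one way to do the aggregation (untested; the folder layout and attribute names are taken from the description above). It loops over the months of the selected year, parses each file that exists, and totals the spend per person:

        <!--- log.cfm: assumes form.year was posted from the year selector --->
        <cfparam name="form.year" default="2010">
        <cfset months = ["jan","feb","mar","apr","may","jun","jul","aug","sep","oct","nov","dec"]>
        <cfloop array="#months#" index="m">
            <cfset logFile = expandPath("./log/#m##form.year#.xml")>
            <cfif fileExists(logFile)>
                <cfset doc = xmlParse(fileRead(logFile))>
                <cfset totals = structNew()>
                <cfloop array="#doc.root.xmlChildren#" index="rec">
                    <cfset person = rec.xmlAttributes.name>
                    <cfif NOT structKeyExists(totals, person)>
                        <cfset totals[person] = 0>
                    </cfif>
                    <cfset totals[person] = totals[person] + rec.xmlAttributes.spend>
                </cfloop>
                <!--- render one HTML table for this month from the totals struct,
                      adding a final row that sums all values for the "Total" line --->
            </cfif>
        </cfloop>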

    Read the article

  • How to merge elements from two XML files?

    - by Googler
    Hi all, I need C# code to merge two XML files into one, keeping only the specified content.

    XML FILE 1:

        <exchange-documents>
          <documentlegal>
            <bibliographic-data>
              <applicants>
                <applicant-name>
                  <name>CENTURY PRODUCTS CO [US]</name>
                </applicant-name>
              </applicants>
            </bibliographic-data>
          </documentlegal>
        </exchange-documents>

    XML FILE 2:

        <exchange-documents>
          <documentpatent>
            <bibliographic-data>
              <applicants>
                <applicant-name>
                  <name>CENTURY PRODUCTS CO [US]</name>
                </applicant-name>
              </applicants>
            </bibliographic-data>
          </documentpatent>
        </exchange-documents>

    I need to read the above two XML files and write the selected elements into a new XML file.

    OUTPUT XML:

        <documentlegal>
          <bibliographic-data>
            <applicants>
              <applicant-name>
                <name>CENTURY PRODUCTS CO [US]</name>
              </applicant-name>
            </applicants>
          </bibliographic-data>
        </documentlegal>
        <documentpatent>
          <bibliographic-data>
            <applicants>
              <applicant-name>
                <name>CENTURY PRODUCTS CO [US]</name>
              </applicant-name>
            </applicants>
          </bibliographic-data>
        </documentpatent>

    I don't need the exchange-documents element. Can anyone provide me C# code to achieve the above scenario?
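
    A rough sketch using LINQ to XML (an assumption that .NET 3.5+ is available; the file names are made up). Note the desired output above has two top-level elements, which is not well-formed XML on its own, so this sketch wraps them in a hypothetical <merged-documents> root:

        using System.Xml.Linq;

        class MergeXml
        {
            static void Main()
            {
                XDocument doc1 = XDocument.Load("file1.xml");
                XDocument doc2 = XDocument.Load("file2.xml");

                // Take the children of <exchange-documents> from each file,
                // dropping the wrapper element itself.
                var merged = new XElement("merged-documents",
                    doc1.Root.Elements(),
                    doc2.Root.Elements());

                merged.Save("output.xml");
            }
        }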

    Read the article

  • How to consolidate multiple LOG files into one .LDF file in SQL2000

    - by John Galt
    Here is what sp_helpfile says about my current database (recovery model is Simple) in SQL 2000:

        name                     fileid  filename                                size        maxsize    growth      usage
        MasterScratchPad_Data    1       C:\SQLDATA\MasterScratchPad_Data.MDF    6041600 KB  Unlimited  5120000 KB  data only
        MasterScratchPad_Log     2       C:\SQLDATA\MasterScratchPad_Log.LDF     2111304 KB  Unlimited  10%         log only
        MasterScratchPad_X1_Log  3       E:\SQLDATA\MasterScratchPad_X1_Log.LDF  191944 KB   Unlimited  10%         log only

    I'm trying to prepare this for a detach and then an attach to a SQL 2008 instance, but I don't want to have the 2nd .LDF file (I'd like to have just one file for the log). I have backed up the database. I have issued BACKUP LOG MasterScratchPad WITH TRUNCATE_ONLY. I have run multiple DBCC SHRINKFILE commands on both of the LOG files. How can I accomplish this goal of having just one .LDF? I cannot find anything on how to delete the one with fileid 3 and/or how to consolidate multiple files into one log file.
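
    A possible sequence (a sketch, not verified against SQL 2000; the secondary file can only be dropped once it holds no active log records, so the backup/shrink steps may need to be repeated before the removal succeeds):

        -- Run in the context of the database whose log is being consolidated
        USE MasterScratchPad;
        GO
        BACKUP LOG MasterScratchPad WITH TRUNCATE_ONLY;   -- clear inactive log records
        DBCC SHRINKFILE (MasterScratchPad_X1_Log);        -- shrink the secondary log file
        GO
        -- Drop the now-empty secondary log file (fileid 3); this fails if it still
        -- contains active log, in which case back up / shrink again and retry.
        ALTER DATABASE MasterScratchPad REMOVE FILE MasterScratchPad_X1_Log;
        GO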

    Read the article

  • How can you configure or extend BITS (Background Intelligent Transfer Service) to read files from a database?

    - by Mark
    I have an ASP.NET load-balanced application (web service and website) running on SQL Server. I need to be able to provide large files for download. However, because of the load balancing, the files are stored in the SQL database rather than on the file system. BITS seems to be the best approach, and I have full control of the client. What I don't know is how to configure BITS to read the file from the database. I know how to write the C# code for that, but I don't know how to get BITS to hook into it instead of reading the file from the file system. Any ideas?
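
    One possible angle (an assumption on my part, not something stated in the post): BITS downloads are ordinary HTTP GETs with Range headers, so pointing the BITS job at an ASP.NET handler that streams the BLOB and honors Range requests can stand in for a file on disk. A rough sketch with a hypothetical Files(Id, Content) table and placeholder connection string; error handling and multi-range support are omitted:

        using System;
        using System.Data.SqlClient;
        using System.Web;

        public class DbFileHandler : IHttpHandler
        {
            public bool IsReusable { get { return true; } }

            public void ProcessRequest(HttpContext context)
            {
                int id = int.Parse(context.Request.QueryString["id"]);
                byte[] data;
                using (var conn = new SqlConnection("...connection string..."))
                using (var cmd = new SqlCommand("SELECT Content FROM Files WHERE Id = @id", conn))
                {
                    cmd.Parameters.AddWithValue("@id", id);
                    conn.Open();
                    data = (byte[])cmd.ExecuteScalar();
                }

                long start = 0, end = data.Length - 1;
                string range = context.Request.Headers["Range"];   // e.g. "bytes=0-1048575"
                if (!string.IsNullOrEmpty(range) && range.StartsWith("bytes="))
                {
                    string[] parts = range.Substring(6).Split('-');
                    start = long.Parse(parts[0]);
                    if (parts.Length > 1 && parts[1].Length > 0) end = long.Parse(parts[1]);
                    context.Response.StatusCode = 206;              // Partial Content
                    context.Response.AddHeader("Content-Range",
                        string.Format("bytes {0}-{1}/{2}", start, end, data.Length));
                }

                context.Response.AddHeader("Accept-Ranges", "bytes");
                context.Response.ContentType = "application/octet-stream";
                context.Response.AddHeader("Content-Length", (end - start + 1).ToString());
                context.Response.OutputStream.Write(data, (int)start, (int)(end - start + 1));
            }
        }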

    Read the article

  • WiX, how to prevent files from uninstalling though we forgot to set Permanent="yes"

    - by Doc Brown
    We have a product installer created with WiX, containing a program package ("V1") and some configuration files. Now we are going to make a major upgrade with a new product code, where the old version of the product is uninstalled and "V2" is installed. What we want is to keep one of the configuration files from being uninstalled, since it is needed for V2, too. Unfortunately, we forgot to set the Permanent="yes" option when we delivered V1 (read this question for more information). Here comes the question: is there an easy way to prevent the file from being uninstalled anyway? Of course, we could add a custom action to the script to back up the file before uninstallation, and another custom action to restore it afterwards, but IMHO that seems to be overkill for this task, and it might interfere with other parts of the MSI registration process.

    Read the article

  • Problem with referencing CSS and Javascript files relatively

    - by Markus
    I have an IIS web site. This web site contains other web sites, so the structure is like this:

        \
        MainWebSite\
        App1\
        App2\

    All sites are ASP.NET MVC web applications. In the master page of App1, I reference the script files like this:

        <script type="text/javascript" src="../../Scripts/jquery-ui-1.8.custom.min.js"></script>

    The problem is that the browser now tries to find the file at http://server/MainWebSite/Scripts/... How can I work around that? Should I put all my scripts and CSS files into the root directory? Is that a preferred solution?
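
    One approach (my suggestion, not from the post) is to let ASP.NET resolve the application root instead of relying on ../.. paths, e.g. in the master page:

        <%-- "~" resolves to the root of the current web application, so the same
             markup works whether the app runs at /App1 or at the site root --%>
        <script type="text/javascript"
                src="<%= Url.Content("~/Scripts/jquery-ui-1.8.custom.min.js") %>"></script>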

    Read the article

  • Including javascript files from other projects inside the same solution

    - by gnomixa
    I have a project called WebResources where I keep all the JS files that I intend to use in two other projects (all three projects sit in the same solution). I just spent two hours playing around with the file paths, and VS just doesn't see the JS file (unless I specify the full path with the drive letter). Is there a trick to including JavaScript files relatively from other projects? My file structure is something like:

        Project1
            Default.aspx
        Project2
        WebResourcesProject
            /js
                testToInclude.js

    No matter how I try to include testToInclude.js inside Default.aspx, VS doesn't see it. Any ideas?
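
    One workaround (an assumption on my part, not something from the post) is to copy the shared scripts into each consuming project with a post-build event, so pages can reference them with normal application-relative paths:

        rem Post-build event on Project1 (and Project2); $(SolutionDir) ends with a backslash
        xcopy /y /i "$(SolutionDir)WebResourcesProject\js\*.js" "$(ProjectDir)Scripts\"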

    Read the article

  • How to know which files or folders have changed before committing

    - by Pedro
    My problem is how to know which files or folders have changed before doing a commit. I can add all the new files in my working copy before committing, and the repository changes, but if, for example, I delete one file from the working copy, I don't know how to record that change before committing. When you use TortoiseSVN, for example, before committing the program shows all the changes in the working copy and you can choose which changes to commit and which not to. Is there some way to do this using SharpSvn? Thanks for your answers!
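
    A rough sketch with SharpSvn (untested, and the workflow around deletions is my assumption): SvnClient.GetStatus reports additions, modifications and locally deleted (missing) files, which can then be scheduled before the commit:

        using System;
        using System.Collections.ObjectModel;
        using SharpSvn;

        class ShowPendingChanges
        {
            static void Main()
            {
                using (SvnClient client = new SvnClient())
                {
                    Collection<SvnStatusEventArgs> changes;
                    client.GetStatus(@"C:\path\to\workingcopy", out changes);

                    foreach (SvnStatusEventArgs change in changes)
                    {
                        // LocalContentStatus is e.g. Added, Modified, Missing or Deleted
                        Console.WriteLine("{0}: {1}", change.LocalContentStatus, change.Path);

                        // A file deleted on disk shows up as Missing; schedule the
                        // deletion so the next commit records it.
                        if (change.LocalContentStatus == SvnStatus.Missing)
                            client.Delete(change.Path, new SvnDeleteArgs { Force = true });
                    }
                }
            }
        }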

    Read the article

  • System.Web.HttpException in asp.net mvc 2 on images and javascript files

    - by Rippo
    Hi, I am getting the following errors, reported by ELMAH, on my ASP.NET MVC 2 site for JavaScript files, images, etc.:

        System.Web.HttpException: The remote host closed the connection

    I have done some research and it appears that the user/bot is clicking a link on the site before the page has fully loaded. This error never occurs on a controller action, but always on a file that is on disk, e.g.:

        /Content/CmsImages/logo.png
        /Content/CmsImages/MemberImages/Photo-001605.jpg
        /Content/jquery.tickertype.js

    So this means that all static files are being routed through the MVC pipeline. What options do I have?
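
    One option (my assumption, not something from the post) is to filter out this particular benign exception so ELMAH stops logging it; this requires ELMAH's ErrorFilter module to be registered, plus a handler in Global.asax.cs:

        // Dismiss "The remote host closed the connection" (HRESULT 0x800704CD is the
        // code commonly reported for this message) before ELMAH writes it to the log.
        void ErrorLog_Filtering(object sender, Elmah.ExceptionFilterEventArgs e)
        {
            var httpException = e.Exception.GetBaseException() as System.Web.HttpException;
            if (httpException != null && httpException.ErrorCode == unchecked((int)0x800704CD))
                e.Dismiss();
        }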

    Read the article

  • Why is lua crashing after extracting zip files?

    - by Brian T Hannan
    I have the following code, but it crashes every time it reaches the end of the function, even though it successfully extracts all the files and puts them in the right location.

        require "zip"

        function ExtractZipAndCopyFiles(zipPath, zipFilename, destinationPath)
            local zfile, err = zip.open(zipPath .. zipFilename)
            -- iterate through each file inside the zip file
            for file in zfile:files() do
                local currFile, err = zfile:open(file.filename)
                local currFileContents = currFile:read("*a") -- read entire contents of current file
                -- write the current file inside the zip to a file outside the zip
                local hBinaryOutput = io.open(destinationPath .. file.filename, "wb")
                if (hBinaryOutput) then
                    hBinaryOutput:write(currFileContents)
                    hBinaryOutput:close()
                end
            end
            zfile:close()
        end

        -- call the function
        ExtractZipAndCopyFiles("C:\\Users\\bhannan\\Desktop\\LUA\\", "example.zip", "C:\\Users\\bhannan\\Desktop\\ZipExtractionOutput\\")

    Why does it crash every time it reaches the end?

    Read the article

  • Document Management System - Where to Store Files?

    - by Diego AC
    Hey, Stack! I'm in charge of building an ASP.NET MVC document management system. It has to be able to do basic document management tasks like adding, editing and searching entries, and also perform versioning. I'm targeting PDF, Office and many image formats as the file attached to each document entry in the database. My question is: what design guidelines do the pros follow when building the storage mechanism? Do they store the document files in the file system or in the database? How is file uploading handled? I used to upload the files to a temporary location while the user was editing the data and move them to permanent storage when the user confirmed the entry creation. Is this good? Any suggestions for improvement?
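
    For what it's worth, here is a minimal sketch (all names hypothetical, folders assumed to exist) of the temp-then-promote flow described above, storing files on the file system under App_Data:

        using System;
        using System.IO;
        using System.Web;
        using System.Web.Mvc;

        public class DocumentsController : Controller
        {
            // Step 1: stage the upload in a temp folder while the entry is being edited
            [HttpPost]
            public ActionResult Upload(HttpPostedFileBase file)
            {
                string tempName = Guid.NewGuid() + Path.GetExtension(file.FileName);
                file.SaveAs(Path.Combine(Server.MapPath("~/App_Data/Temp"), tempName));
                // the edit view keeps tempName in a hidden field until the user confirms
                return RedirectToAction("Edit", new { pendingFile = tempName });
            }

            // Step 2: promote the staged file to permanent storage once the entry is confirmed
            [HttpPost]
            public ActionResult Confirm(int documentId, string pendingFile)
            {
                string safeName = Path.GetFileName(pendingFile);   // strip any path traversal
                string tempPath = Path.Combine(Server.MapPath("~/App_Data/Temp"), safeName);
                string finalPath = Path.Combine(Server.MapPath("~/App_Data/Documents"),
                                                documentId + Path.GetExtension(safeName));
                System.IO.File.Move(tempPath, finalPath);
                return RedirectToAction("Details", new { id = documentId });
            }
        }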

    Read the article

  • Backing up locally modified and new source files

    - by eran
    I'm wondering how other programmers back up changes that are not under source control yet, be it new files or modified ones. I'm mostly referring to medium-sized jobs: hardly worth the effort of making a private branch, but taking more than a day to complete. This is not a vendor-specific question; I'd like to see if various products have different solutions to the problem. I'd appreciate answers referring to SVN and distributed SCMs, though. I'm mostly wondering about the latter (Mercurial, Git, etc.): it's great that you have your own local repo, but do you back it up on a regular basis along with your source files? Note: I'm not asking about a general backup strategy. For that, we have IT. I'm seeking the best way to keep locally modified work backed up before it is checked back into the main repo.
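
    As one illustration (my own example, not from the post), a distributed tool makes this almost free: add a second remote that lives on a backup drive or server and push work-in-progress branches there regularly:

        # one-time setup: a bare repository at the backup location
        git init --bare /mnt/backup/myproject.git
        git remote add backup /mnt/backup/myproject.git

        # run as often as you like (cron, post-commit hook, etc.)
        git push backup --all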

    Read the article

  • View plain text files with different background colors in Mac OS X, for different programming languages

    - by Werner
    Hi, I work with Mac OS X Leopard. I usually have 5 or 10 text files open at the same time for different programming languages: one for a bash script, another for a Python one, etc. When I use Exposé, all of them look the same, so it is difficult to select the right one. I wonder how I could work with plain text files in OS X so that when they are opened in an editor the background color changes, or something similar, so that when using Exposé it is clear to me which window belongs to which language. I thought about inserting some kind of info into the last line of each document, and then creating some AppleScript that converts it to RTF or some other format that includes a background color, so it is then opened with TextMate or some other app. Do you know a better approach for this? Thanks

    Read the article

  • Loading .sql files from within PHP

    - by Josh Smeaton
    I'm creating an installation script for an application that I'm developing and need to create databases dynamically from within PHP. I've got it to create the database, but now I need to load in several .sql files. I had planned to open each file and mysql_query() it a line at a time, until I looked at the schema files and realised they aren't just one query per line. So, please: how do I load an SQL file from within PHP (as phpMyAdmin does with its import command)?
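
    One option (this assumes mysqli is available and that the dump contains no DELIMITER/stored-routine blocks): hand the whole file to mysqli::multi_query and drain the result sets:

        <?php
        $mysqli = new mysqli('localhost', 'user', 'pass', 'mydb');
        $sql = file_get_contents('schema.sql');

        if ($mysqli->multi_query($sql)) {
            // flush every result set so later statements are not blocked
            do {
                if ($result = $mysqli->store_result()) {
                    $result->free();
                }
            } while ($mysqli->more_results() && $mysqli->next_result());
        }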

    Read the article

  • Configuring Hadoop logging to avoid too many log files

    - by Eric Wendelin
    I'm having a problem with Hadoop producing too many log files in $HADOOP_LOG_DIR/userlogs (the ext3 filesystem allows only 32,000 subdirectories), which looks like the same problem as in this question: http://stackoverflow.com/questions/2091287/error-in-hadoop-mapreduce My question is: does anyone know how to configure Hadoop to roll the log dir or otherwise prevent this? I'm trying to avoid just setting the "mapred.userlog.retain.hours" and/or "mapred.userlog.limit.kb" properties because I want to actually keep the log files. I was also hoping to configure this in log4j.properties, but looking at the Hadoop 0.20.2 source, it writes directly to log files instead of actually using log4j. Perhaps I don't fully understand how it uses log4j. Any suggestions or clarifications would be greatly appreciated.

    Read the article

  • Is it good practice to put private API in the .m files and public API in .h files in Cocoa?

    - by Paperflyer
    Many of my classes in my current project have several properties and methods that are only ever called from within the class itself. Also, they might mess with the workings of the class depending on its current state. Currently, all these interfaces are defined in the main interface declaration in the .h files. Is it considered good practice to put the "private" methods and properties at the top of the .m files? This won't ever affect anything since I am very likely the only person who will ever look at this source code, but of course it would be interesting to know for future projects.
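
    For illustration (the class and member names are made up), the usual pattern is a class extension at the top of the .m file, which keeps the private methods and properties out of the public header:

        // MyController.m
        #import "MyController.h"

        @interface MyController ()   // class extension: visible only inside this file
        @property (nonatomic, retain) NSTimer *pollTimer;
        - (void)resetInternalState;
        @end

        @implementation MyController
        @synthesize pollTimer;
        // ... public methods from the .h plus the private ones declared above ...
        @end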

    Read the article

  • Upload large files (1GB) - ASP.NET

    - by Ramya Raj
    I need to upload large files of at least 1GB in size. I am using ASP.NET, C# and IIS 5.1 as my development platform. I am using HIF.PostedFile.InputStream.Read(fileBytes, 0, HIF.PostedFile.ContentLength) and then File.WriteAllBytes(filePath, fileByteArray), but it never gets that far; it throws a System.OutOfMemoryException. Currently I have set httpRuntime to:

        executionTimeout="999999"
        maxRequestLength="2097151"   (that's 2GB!)
        useFullyQualifiedRedirectUrl="true"
        minFreeThreads="8"
        minLocalRequestFreeThreads="4"
        appRequestQueueLimit="5000"
        enableVersionHeader="true"
        requestLengthDiskThreshold="8192"

    I have also set maxAllowedContentLength="2097151" (I guess that's only for IIS 7), and I have changed the IIS connection timeout to 999,999 seconds too. I am unable to upload files of even 4578KB (Ajaz-Uploader.zip).
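
    One thing worth sketching (my suggestion, not from the post): materializing the whole upload in a byte array is what usually triggers the OutOfMemoryException, so copying the input stream to disk in small chunks avoids holding the file in memory:

        // needs using System.IO; HIF and filePath are the names from the original snippet
        using (Stream input = HIF.PostedFile.InputStream)
        using (FileStream output = File.Create(filePath))
        {
            byte[] buffer = new byte[64 * 1024];   // 64 KB chunks
            int read;
            while ((read = input.Read(buffer, 0, buffer.Length)) > 0)
            {
                output.Write(buffer, 0, read);
            }
        }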

    Read the article

  • Securing files on iPhone

    - by clearbrian
    Hi, is there a way to decompile the binary of an iPhone app? I jailbroke my iPhone and was surprised to find other apps' databases wide open to be copied. So I exported my most important table and hard-coded it into code: instead of loading the table into an array from a DB, I just generated code to fill the array, and kept only the most basic DB info so relationships still work. It took a while but now works fine. I was just wondering, am I safe, or could someone easily decompile the app's binary and extract the data? In Java it's easy to decompile *.class files, though that's bytecode, whereas I presume iPhone apps are more low-level. I know the iPhone SDK 4 can mark files as secure. Does anyone know whether this can be overridden by jailbreaks, or is it a Unix lock?
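
    For reference, my assumption is that "mark files as secure" refers to iOS 4 data protection, which encrypts a file with keys tied to the device passcode so it is only readable while the device is unlocked (data and path below are placeholders):

        NSError *error = nil;
        BOOL ok = [data writeToFile:path
                            options:NSDataWritingFileProtectionComplete
                              error:&error];
        // The file on flash is now encrypted and cannot be read while the
        // device is locked (provided the user has set a passcode).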

    Read the article

  • How to move files on the C drive using the MoveFileEx API

    - by rajivpradeep
    Hi, when I use MoveFileEx to move files on the C drive, I get an ACCESS DENIED error. Any solutions?

        int i;
        DWORD dw;
        String^ Source = "C:\\Folder\\Program\\test.exe";
        String^ Destination = "C:\\test.exe";   // move to Program Files folder
        pin_ptr<const wchar_t> WSource = PtrToStringChars(Source);
        pin_ptr<const wchar_t> WDestination = PtrToStringChars(Destination);
        i = MoveFileEx(WSource, WDestination, MOVEFILE_REPLACE_EXISTING | MOVEFILE_COPY_ALLOWED);
        dw = GetLastError();

    Read the article

  • Where are the snapshot files?

    - by KiD0M4N
    Hey guys, the documentation states that snapshots are persisted to S3... I wanted to leverage that and create an instance of my server in a different region (my original server is in APAC, and I wanted to create an instance in US East). I have logged into my account via CloudBerry S3 but cannot see any files in the S3 account (sorry, I am a beginner with AWS). Also, switching over to US East removes the snapshot from view... so how can I create another instance using the same EBS volume in a different region? Why can't I see the snapshot files in my S3 account? Regards, Karan Misra

    Read the article
