Search Results

Search found 49518 results on 1981 pages for 'configuration files'.

Page 282/1981 | < Previous Page | 278 279 280 281 282 283 284 285 286 287 288 289  | Next Page >

  • Excel and Tab Delimited Files Question

    - by OneNerd
    I am encountering what I believe to be a strange issue with Excel (in this case Excel 2007, but maybe also Excel 2003 - I don't have access to it as I write this). I can reliably convert some server data into a tab-delimited format (I've been doing this for years) and then open it using Excel - no issue. However, if I have an HTML <table> inside one of the fields, it looks like Excel 2007 thinks it should convert that table into rows and columns inside Excel (not what I want). As you might imagine, this throws off the entire spreadsheet. So the question is: is there any way to set up Excel NOT to do this (perhaps some setting in Excel that pertains to reading tab-delimited files), or am I missing something? Thanks.

    Read the article

  • Get timestamp from Authenticode Signed files in .NET

    - by SlavaGu
    We need to verify that binary files are properly signed with a digital signature (Authenticode). This can be achieved with signtool.exe pretty easily. However, we need an automated way that also verifies the signer name and the timestamp. This is doable in native C++ with the CryptQueryObject() API, as shown in this wonderful sample: How To Get Information from Authenticode Signed Executables. However, we live in a managed world :) and are therefore looking for a C# solution to the same problem. The straightforward approach would be to P/Invoke Crypt32.dll and be done with it. But there is a similar managed API in the System.Security.Cryptography.X509Certificates namespace: the X509Certificate2 class seems to provide some of the information, but no timestamp. So, back to the original question: how can we get the timestamp of a digital signature in C#?
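    For what it's worth, a minimal sketch of how far the managed API gets you: the signer name comes out easily, but the countersignature timestamp is not exposed here and still needs CryptQueryObject/CryptMsgGetParam via P/Invoke, as in the native sample.

        using System;
        using System.Security.Cryptography.X509Certificates;

        class SignatureInfo
        {
            static void Main(string[] args)
            {
                // Signer certificate from a signed binary; the path comes from the command line.
                var signer = new X509Certificate2(X509Certificate.CreateFromSignedFile(args[0]));
                Console.WriteLine("Signer: {0}", signer.Subject);
                Console.WriteLine("Valid:  {0:u} - {1:u}", signer.NotBefore, signer.NotAfter);
                // The signing (countersignature) timestamp is not available on this object.
            }
        }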

    Read the article

  • How do I export a large table into 50 smaller csv files of 100,000 records each

    - by Eddie
    I am trying to export one field from a very large table - containing 5,000,000 records, for example - into a CSV list, but not all together; rather, 100,000 records into each .csv file created, without duplication. How can I do this, please? I tried:

        SELECT field_name FROM table_name WHERE certain_conditions_are_met
        INTO OUTFILE '/tmp/name_of_export_file_for_first_100000_records.csv'
        LINES TERMINATED BY '\n'
        LIMIT 0, 100000

    That gives the first 100,000 records, but nothing I do gets the other 4,900,000 records exported into 49 other files - and how do I specify the other 49 filenames? For example, I tried the following, but the SQL syntax is wrong:

        SELECT field_name FROM table_name WHERE certain_conditions_are_met
        INTO OUTFILE '/home/user/Eddie/name_of_export_file_for_first_100000_records.csv'
        LINES TERMINATED BY '\n'
        LIMIT 0, 100000
        INTO OUTFILE '/home/user/Eddie/name_of_export_file_for_second_100000_records.csv'
        LINES TERMINATED BY '\n'
        LIMIT 100001, 200000

    and that did not create the second file. What am I doing wrong, please, and is there a better way to do this? Should the LIMIT 0, 100000 be put before the first INTO OUTFILE clause, and then the entire command repeated from SELECT onwards for the second 100,000 records, etc.? Thanks for any help. Eddie
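    A sketch of one way to do it (untested; table, column, condition, credentials and paths are placeholders): a single statement can only write one OUTFILE, so run one SELECT ... INTO OUTFILE per 100,000-row slice from the shell. Note that the second argument to LIMIT is a row count, not an end position, and offsets are zero-based, so the second slice is LIMIT 100000, 100000; an ORDER BY on a unique column keeps the slices consistent between statements.

        #!/bin/sh
        # Export 5,000,000 rows as 50 files of 100,000 rows each.
        for i in $(seq 0 49); do
            offset=$((i * 100000))
            mysql -u user -p mydb -e "
                SELECT field_name FROM table_name
                WHERE certain_conditions_are_met
                ORDER BY field_name
                LIMIT ${offset}, 100000
                INTO OUTFILE '/tmp/export_part_${i}.csv'
                LINES TERMINATED BY '\n'"
        done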

    Read the article

  • Recursively CVS add files/directories and ignore existing CVS files.

    - by meder
    There's a similar post @ http://stackoverflow.com/questions/5071/how-to-add-cvs-directories-recursively However, trying out some of the answers there, such as:

        find . -type f -print0 | xargs -0 cvs add

    gave:

        cvs add: cannot open CVS/Entries for reading: No such file or directory
        cvs [add aborted]: no repository

    and

        find . \! -name 'CVS' -and \! -name 'Entries' -and \! -name 'Repository' -and \! -name 'Root' -print0 | xargs -0 cvs add

    gave:

        cvs add: cannot add special file `.'; skipping

    Does anyone have a more thorough solution for recursively adding new files to a CVS module? It would be great if I could alias it in ~/.bashrc or something along those lines. And yes, I do know that CVS is a bit dated, but I'm forced to work with it for a certain project; otherwise I'd use git/hg.
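    A rough sketch of one approach (untested; GNU find/xargs assumed): add the directories first so each one gets its own CVS/ metadata, then add the files, pruning existing CVS directories and skipping '.' itself.

        #!/bin/sh
        # Directories first; complaints about already-added directories are harmless.
        find . -mindepth 1 -name CVS -prune -o -type d -print0 | xargs -0 -r cvs add 2>/dev/null
        # Then the files, again skipping anything under CVS/ metadata directories.
        find . -mindepth 1 -name CVS -prune -o -type f -print0 | xargs -0 -r cvs add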

    Read the article

  • Using URL Routing for Web Forms with .ashx files

    - by RandomBen
    I am developing a .NET 3.5 Web Forms based website that uses URL routing. So far I have created a few routes and have had no issues. I now have an .ashx file that is going to handle sending .pdf files from a table in SQL Server to the browser when someone clicks a link. Normally, when I create a handler for a page, it looks like this:

        return BuildManager.CreateInstanceFromVirtualPath("~/ViewItem.aspx", typeof(Page)) as Page;

    For my .ashx file I tried:

        return BuildManager.CreateInstanceFromVirtualPath("~/FileServer.ashx", typeof(Page)) as Page;

    This doesn't work, though, because FileServer.ashx is not a page, so casting it with typeof(Page) ... as Page is going to fail. What do I cast the virtual path result to instead of Page, or is there some other way I should be doing this?
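    A sketch of what I would try (not verified against this project): an .ashx handler implements IHttpHandler rather than deriving from Page, so ask BuildManager for that instead.

        using System.Web;
        using System.Web.Compilation;
        using System.Web.Routing;

        // Hypothetical route handler that serves the route with the .ashx handler.
        public class FileServerRouteHandler : IRouteHandler
        {
            public IHttpHandler GetHttpHandler(RequestContext requestContext)
            {
                return (IHttpHandler)BuildManager.CreateInstanceFromVirtualPath(
                    "~/FileServer.ashx", typeof(IHttpHandler));
            }
        }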

    Read the article

  • Apache/Passenger/REE doesn't interpret .rb files

    - by Sergey
    I'm trying to get Apache + Passenger + REE to work. I think I did everything described here (except setting up a Rails environment - for now I just want to run plain Ruby): http://rvm.beginrescueend.com/integration/passenger/ But when I browse to localhost/test.rb, Apache doesn't interpret the file; it just offers it for download. I don't know where I should look for mistakes, so here are a few files I think could be relevant:

    /var/log/apache2/error.log (these 2 lines keep repeating):

        [Mon May 31 23:12:47 2010] [notice] Graceful restart requested, doing restart
        [Mon May 31 23:12:48 2010] [notice] Apache/2.2.14 (Ubuntu) PHP/5.3.2-1ubuntu4.2 with Suhosin-Patch Phusion_Passenger/2.2.11 configured -- resuming normal operations

    /etc/apache2/httpd.conf:

        LoadModule passenger_module /home/sergey/.rvm/gems/ree-1.8.7-2010.01/gems/passenger-2.2.11/ext/apache2/mod_passenger.so
        PassengerRoot /home/sergey/.rvm/gems/ree-1.8.7-2010.01/gems/passenger-2.2.11
        PassengerRuby /home/sergey/.rvm/bin/passenger_ruby

    /var/www/test.rb:

        puts "test"
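    One thing worth noting: Passenger doesn't execute standalone .rb files the way mod_php executes .php files; it serves Rack (or Rails) applications rooted at a directory. A minimal sketch, assuming a hypothetical app directory /var/www/myapp with public/ and tmp/ subdirectories and a virtual host whose DocumentRoot points at /var/www/myapp/public:

        # /var/www/myapp/config.ru -- the smallest Rack app Passenger can serve
        run lambda { |env|
          [200, { "Content-Type" => "text/plain" }, ["test"]]
        }

    For one-off scripts that should run directly, CGI via mod_cgi (ExecCGI plus a shebang line) would be the route instead of Passenger.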

    Read the article

  • Silverlight DataGrid Refresh Between Xaml Files

    - by GB
    Hello, I have a Page.xaml file and an AddNewProject.xaml file. In the Page.xaml file I have a ProjectDetailsDataGrid and a button to add a new project. When I click on the Add New Project button, the AddNewProject.xaml control becomes visible so the user can enter the new project information. I am having a problem trying to refresh the ProjectDetailsDataGrid (on the Page.xaml page) to display the new information entered on the AddNewProject.xaml page. Is there any way to accomplish refreshing a DataGrid between two separate xaml files? Thank you for your help.
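    One common pattern, as a sketch only (the member names are made up): have AddNewProject raise an event once the new project is saved, and let Page.xaml.cs handle that event by re-querying and re-binding the grid.

        using System;
        using System.Windows;
        using System.Windows.Controls;

        // AddNewProject.xaml.cs
        public partial class AddNewProject : UserControl
        {
            public event EventHandler ProjectAdded;

            private void Save_Click(object sender, RoutedEventArgs e)
            {
                // ... persist the new project ...
                if (ProjectAdded != null)
                    ProjectAdded(this, EventArgs.Empty);
            }
        }

        // Page.xaml.cs, once both controls exist:
        //   addNewProjectControl.ProjectAdded += (s, args) =>
        //       ProjectDetailsDataGrid.ItemsSource = LoadProjects();  // re-query the data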

    Read the article

  • Cannot use Html.ActionLink in ASP.NET MVC Spark files

    - by midas06
    I'm using the Spark view engine with my ASP.NET MVC application. In my .aspx pages I can successfully use Html.ActionLink, but when I attempt it in Spark files it doesn't show up in IntelliSense, and when I try to run it anyway I get:

        Dynamic view compilation failed.
        c:\Users\midas\Documents\Visual Studio 2008\Projects\ChurchMVC\ChurchMVC\Views\Home\Index.spark(73,25): error CS1061: 'System.Web.Mvc.HtmlHelper' does not contain a definition for 'ActionLink' and no extension method 'ActionLink' accepting a first argument of type 'System.Web.Mvc.HtmlHelper' could be found (are you missing a using directive or an assembly reference?)

    I do have System.Web.Mvc referenced, and I have added it in _global.spark. None of that helps. Any ideas?
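    Worth checking: ActionLink is an extension method defined in the System.Web.Mvc.Html namespace, so the views need that namespace imported, not just the assembly referenced. A guess at what _global.spark (in the Views folder) might need:

        <use namespace="System.Web.Mvc"/>
        <use namespace="System.Web.Mvc.Html"/>
        <use namespace="System.Web.Routing"/>
        <use namespace="System.Linq"/>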

    Read the article

  • CUDA: How to reuse kernels in multiple files (for unit testing)

    - by zenna
    How can I reuse the same kernel without getting fatal linking errors due to the symbol being defined multiple times? In Visual Studio I get "fatal error LNK1169: one or more multiply defined symbols found". My current structure is as follows: Interface.h has an extern interface to a C function, myCfunction() (a la the C++ integration SDK example). Kernel.cu contains the actual __global__ kernels and is NOT included in the build: __global__ my_kernel(). Wrapper.cu includes Kernel.cu and Interface.h and calls my_kernel<<<...>>>. This all works fine. But if I add another C function in another file which also includes Kernel.cu and uses those kernels, I get the errors. So how can I reuse the kernels in Kernel.cu among many C functions in different files? The purpose of this, by the way, is unit testing and integrating my kernels with CppUnit. If there is no way to reuse kernels (there must be!), then other suggestions for unit testing kernels within my existing CppUnit framework would be appreciated. Thanks, Zenna
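    A sketch of the usual way out (not tested against this exact project): compile Kernel.cu once as part of the build so the kernel symbol is defined exactly once, and give the wrappers only a declaration via a header. This works for __global__ kernels launched from the host; sharing __device__ helpers across .cu files would additionally need the toolkit's separate-compilation support.

        // Kernels.cuh -- declarations only
        #ifndef KERNELS_CUH
        #define KERNELS_CUH
        __global__ void my_kernel(float *data, int n);
        #endif

        // Kernel.cu -- added to the build, so the definition exists exactly once
        #include "Kernels.cuh"
        __global__ void my_kernel(float *data, int n)
        {
            int i = blockIdx.x * blockDim.x + threadIdx.x;
            if (i < n) data[i] *= 2.0f;   // placeholder body
        }

        // Wrapper.cu, TestWrapper.cu, ... -- any number of these
        #include "Kernels.cuh"
        extern "C" void myCfunction(float *d_data, int n)
        {
            my_kernel<<<(n + 255) / 256, 256>>>(d_data, n);
        }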

    Read the article

  • WebSharingAppDemo-CEProviderEndToEnd queries peerProvider for NeedsScope before any files are batched

    - by Don
    I'm building an application based on the WebSharingAppDemo-CEProviderEndToEnd sample. When I deploy the server portion on a server, the code gives the error "The path is not valid. Check the directory for the database." during the call to NeedsScope() in the CeWebSyncService.cs file. Obviously the server can't access the client's .sdf, but what is supposed to happen to make this work? The app uses batching to send the data, and the batches have to be marshalled across to the temp directory, but this problem occurs before any files have been batched over. There is nothing for the server to look at to determine whether the peerProvider needs scope. What am I missing?

        public bool NeedsScope()
        {
            Log("NeedsSchema: {0}", this.peerProvider.Connection.ConnectionString);
            SqlCeSyncScopeProvisioning prov = new SqlCeSyncScopeProvisioning();
            return !prov.ScopeExists(this.peerProvider.ScopeName, (SqlCeConnection)this.peerProvider.Connection);
        }

    Read the article

  • Indy FTP, large files and NAT routers

    - by Lobuno
    Hello! I have been using Indy to transfer files via FTP for years now, but have not been able to find a satisfactory solution to the following problem. When a user is uploading a large file from behind a router, sometimes the following happens: the file is uploaded OK, but in the meantime the command channel gets disconnected because of a timeout. Normally this doesn't happen with a direct connection to the server, because the server "knows" that a transfer is taking place on the data channel. Some routers are not aware of this, though, and the command channel is closed. Many programs send a NOOP command periodically to keep the command channel alive, even though this is not part of the standard FTP specification. My question: how do I do that? Do I send the NOOP command in the OnWork event? Does this cause any collateral damage in some way - for example, do I need to process some response? How do I best solve this problem?

    Read the article

  • Eclipse C++ with MinGW compiler can't build Boost regex example, can't find .a library files

    - by Kim
    Hi, I'm trying to build the Boost regex example in Eclipse using MinGW on Vista. I built Boost OK with MinGW, as there are library files XXXX.a. I could build/compile the first Boost example, which doesn't require any of the compiled Boost libraries. When I compile the regex example I get a linker error saying it can't find the library file. I have tried various library file names, e.g. leaving off the .a extension, leaving off the lib prefix, etc. Now the interesting thing is that if I leave off the library extension and rename the library file to XXX.lib, it works and runs OK. So why can't it read the .a library file? It must be my setup somewhere, but I don't know where or what to set. From what I read, everyone is OK linking the .a file except me :( Thanks in advance, Kim
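    For reference, a sketch of how the link step normally looks with GNU ld (the path and the exact Boost library name below are guesses): pass -L for the search directory and -l with the bare name, and let the linker expand it to lib<name>.a itself. In Eclipse CDT that bare name goes into the linker's Libraries setting rather than the full file name with prefix and extension.

        # Hypothetical path and name; the -mgw/version suffix depends on how Boost was built.
        g++ regex_example.o \
            -L/c/boost_1_42_0/stage/lib \
            -lboost_regex-mgw34-mt-1_42 \
            -o regex_example.exe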

    Read the article

  • MongoDB, CarrierWave, GridFS and prevention of file duplication

    - by Arkan
    I am using Mongoid, CarrierWave and GridFS to store my uploads. For example, I have a model Article containing a file upload (a picture).

        class Article
          include Mongoid::Document
          field :title, :type => String
          field :content, :type => String
          mount_uploader :asset, AssetUploader
        end

    But I would like to store the file only once, for the case where I upload the same file many times for different articles. I saw that GridFS keeps an MD5 checksum. What would be the best way to prevent duplication of identical files? Thanks
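    One application-level approach, as a sketch only (the :asset_md5 field and the lookup helper are invented here): keep the checksum on the document and consult it before storing another copy in GridFS.

        require 'digest/md5'

        class Article
          include Mongoid::Document
          field :title,     :type => String
          field :content,   :type => String
          field :asset_md5, :type => String
          mount_uploader :asset, AssetUploader

          before_save :remember_asset_checksum

          # An existing article already carrying exactly this file, if any.
          def self.with_same_asset(io)
            where(:asset_md5 => Digest::MD5.hexdigest(io.read)).first
          end

          private

          def remember_asset_checksum
            self.asset_md5 = Digest::MD5.hexdigest(asset.read) if asset.file
          end
        end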

    Read the article

  • IntelliSense and Folding Editor Not Working in Visual Studio 2008 SP1 for Certain Files Only

    - by cplotts
    Ok, I have an issue that is driving me nuts. In certain XAML files only, neither IntelliSense nor the folding editor is working. I have noticed that if I delete the local namespace declaration and add it back, the folding editor starts working. If I delete the local namespace and don't add it back, IntelliSense starts working as well. Of course, I need to remember to add that namespace declaration back before I compile and/or check in... which is annoying. How can you fix this?

    Read the article

  • Collection of MVC CSS files available?

    - by Jaxidian
    Just curious - are there various customized Site.css files (and accompanying images) that work with the default ASP.NET MVC 2 templates? I'm a stereotypical developer who "doesn't do pretty", so I'd like to find a design that is good enough for me to use until I later have a designer come back and fix my design. Are there collections/libraries of various designs out there that work with the default templates? I did find this, but the two popular ones I tried seem to be for MVC 1; plus, they in no way use the default tags from the MVC 2 templates.

    Read the article

  • Problem copying files through xcopy using VBScript

    - by sushant
    I am using VBScript to copy files using xcopy. The problem is that the folder path has to be entered by the user. Assuming I put that path in a variable, say h, how do I use this variable in the xcopy command? Here is the code I tried:

        Dim WshShell, oExec, g, h
        h = "D:\newfolder"
        g = "xcopy $h D:\y\ /E"
        Set WshShell = CreateObject("WScript.Shell")
        Set oExec = WshShell.Exec(g)

    I also tried &h, but it did not work. Could anyone help me work out the correct syntax? Any help is appreciated.
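    A sketch of what should work: VBScript has no $-style interpolation inside string literals, so the command string has to be built with the & concatenation operator (quoting the path guards against spaces).

        Dim WshShell, oExec, g, h
        h = "D:\newfolder"
        ' Doubled quotes inside the literal become literal quotes around the path.
        g = "xcopy """ & h & """ D:\y\ /E"
        Set WshShell = CreateObject("WScript.Shell")
        Set oExec = WshShell.Exec(g)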

    Read the article

  • File Upload: multiple files in ASP.NET (similar to Gmail)

    - by superstar
    Hi guys, I need suggestions regarding multiple file upload using the FileUpload control in ASP.NET (along with C#). I have a FileUpload control; when I click the 'Browse' button and select a file from the file dialog, I want the file to be shown as a link below the FileUpload control (somewhat similar to Gmail). This file should be shown in such a way that it can be deleted if I want to. I should also be able to upload another file from the FileUpload control. All these files should then be uploaded to a location when I use a button click event at the end. I think I have made myself clear. Any suggestions are really helpful. Thanks.

    Read the article

  • RichFaces rich:insert takes a long time to output large files

    - by Mark Lewis
    Hello, I'm using a RichFaces <rich:insert> like this:

        <rich:panel header="my head">
          <a4j:outputPanel ajaxRendered="true">
            <rich:insert src="#{MyBacking.myPath}" highlight="groovy" />
          </a4j:outputPanel>
        </rich:panel>

    If I have a 60 KB file to output, it takes 23 seconds. I've got a requirement to output the contents of files larger than that, and obviously the larger the file, the longer the wait for content. The recommendation in the answer to another related question is to introduce paging. I will, but the question is: why does it take so long to output 60 KB of text using JSF/RichFaces? That is, reading off a local disk on a Windows XP SP2 PC - I can see from the log that the data has already been written to disk from the network. Other scripting languages appear to be faster than this - is it something to do with the JSF lifecycle having to handle the text, maybe? Thanks

    Read the article

  • Showing HTML comment strings (<!-- -->) in HTML files

    - by Andrei
    Hello all. I'm building a source code search engine, and I'm returning the results on an HTML page (.aspx to be exact, but the view logic is in HTML). When someone searches for a string, I also return the whole line of code where this string can be found in a file. However, some lines of code come from HTML/aspx files, and these lines contain HTML comments (<!-- -->). When I try to print such a line on the HTML page, the browser interprets it as a comment and does not show it on the screen... How should I go about solving this so that it actually shows up? Any help would be welcomed. Thanks. edit: err... I see now that Firebug could help me with this: <!-- -->
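    A sketch of the usual fix (the names below are hypothetical): HTML-encode each matched line before writing it into the page, so the browser displays "<!-- -->" as text (&lt;!-- --&gt;) instead of parsing it as a comment.

        using System.Web;

        public static class SearchResultFormatting
        {
            // Encode a matched source line for safe display in the results page.
            public static string ForDisplay(string matchedLine)
            {
                return HttpUtility.HtmlEncode(matchedLine);
            }
        }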

    Read the article

  • Memory-mapped files cause low physical memory

    - by harik
    I have 2 GB of RAM and am running a memory-intensive application; the machine goes into a low-available-physical-memory state and the system stops responding to user actions, like opening any application or invoking a menu. How do I trigger or tell the system to swap memory to the pagefile and free physical memory? I'm using Windows XP. If I run the same application on a 4 GB RAM machine this does not happen; system response is good. After running out of available physical memory, that system automatically swaps to the pagefile and frees physical memory - not as bad as the 2 GB system. To overcome this problem (on the 2 GB machine) I attempted to use memory-mapped files for the large datasets allocated by the application. In this case the virtual memory of the application (process) is fine, but the system cache is high, and the same problem as above appears: physical memory is low. Even though the memory-mapped file is not mapped into the process's virtual memory, the system cache is high. Why???!!! :( Any help is appreciated. Thanks.

    Read the article

  • Memory-efficient import of many data files into a pandas DataFrame in Python

    - by richardh
    I import a directory of |-delimited .dat files into a pandas DataFrame. The following code works, but I eventually run out of RAM with a MemoryError:

        import pandas as pd
        import glob

        temp = []
        dataDir = 'C:/users/richard/research/data/edgar/masterfiles'
        for dataFile in glob.glob(dataDir + '/master_*.dat'):
            print dataFile
            temp.append(pd.read_table(dataFile, delimiter='|', header=0))
        masterAll = pd.concat(temp)

    Is there a more memory-efficient approach? Or should I go whole hog to a database? (I will move to a database eventually, but I am baby-stepping my move to pandas.) Thanks! FWIW, here is the head of an example .dat file:

        cik|cname|ftype|date|fileloc
        1000032|BINCH JAMES G|4|2011-03-08|edgar/data/1000032/0001181431-11-016512.txt
        1000045|NICHOLAS FINANCIAL INC|10-Q|2011-02-11|edgar/data/1000045/0001193125-11-031933.txt
        1000045|NICHOLAS FINANCIAL INC|8-K|2011-01-11|edgar/data/1000045/0001193125-11-005531.txt
        1000045|NICHOLAS FINANCIAL INC|8-K|2011-01-27|edgar/data/1000045/0001193125-11-015631.txt
        1000045|NICHOLAS FINANCIAL INC|SC 13G/A|2011-02-14|edgar/data/1000045/0000929638-11-00151.txt
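    A sketch of the go-to-a-database route (untested; the SQLite file name and table name are made up, and the exact to_sql signature varies between pandas versions): stream each file into SQLite in chunks instead of holding every DataFrame in a list before concatenating, then pull back only the columns and rows actually needed.

        import glob
        import sqlite3

        import pandas as pd

        dataDir = 'C:/users/richard/research/data/edgar/masterfiles'
        conn = sqlite3.connect('masterfiles.db')

        for dataFile in glob.glob(dataDir + '/master_*.dat'):
            # chunksize makes read_table yield DataFrames of 100,000 rows at a time.
            for chunk in pd.read_table(dataFile, delimiter='|', header=0, chunksize=100000):
                chunk.to_sql('master', conn, if_exists='append', index=False)

        # Later: masterAll = pd.read_sql_query('SELECT cik, ftype, date FROM master', conn)
        conn.close()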

    Read the article

  • NHibernate Domain Model and Mapping Files in Separate Projects

    - by Blake Blackwell
    Is there a way to separate the domain objects and mapping files into two separate projects? I would like to create one project called MyCompany.MyProduct.Core that contains my domain model, and another project called MyCompany.MyProduct.Data.Oracle that contains my Oracle data mappings. However, when I try to unit test this I get the following error message: Named query 'GetClients' not found. Here is my mapping file:

        <hibernate-mapping xmlns="urn:nhibernate-mapping-2.2" assembly="MyCompany.MyProduct.Core" namespace="MyCompany.MyProduct.Core">
          <class name="MyCompany.MyProduct.Core.Client" table="MY_CLIENT" lazy="false">
            <id name="ClientId" column="ClientId"></id>
            <property name="ClientName" column="ClientName" />
            <loader query-ref="GetClients"/>
          </class>
          <sql-query name="GetClients" callable="true">
            <return class="Client" />
            call procedure MyPackage.GetClients(:int_SummitGroupId)
          </sql-query>
        </hibernate-mapping>

    Here is my unit test:

        try
        {
            var cfg = new Configuration();
            cfg.Configure();
            cfg.AddAssembly( typeof( Client ).Assembly );
            ISessionFactory sessionFactory = cfg.BuildSessionFactory();
            IStatelessSession session = sessionFactory.OpenStatelessSession();
            IQuery query = session.GetNamedQuery( "GetClients" );
            query.SetParameter( "int_SummitGroupId", 3173 );
            IList<Client> clients = query.List<Client>();
            Assert.AreNotEqual( 0, clients.Count );
        }
        catch( Exception ex )
        {
            throw ex;
        }

    I think I may be referencing the wrong assembly, because if I put the domain model object in the MyCompany.MyProduct.Data.Oracle project, it works. Only when I separate them out into two projects do I run into this problem.
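    A guess at the fix (sketch only; ClientMapping is a hypothetical type living in the Data.Oracle project): AddAssembly has to point at the assembly whose embedded resources are the .hbm.xml files - here MyCompany.MyProduct.Data.Oracle - rather than at the domain assembly, and each mapping file's Build Action has to be Embedded Resource.

        var cfg = new Configuration();
        cfg.Configure();
        // Scan the assembly that actually embeds the *.hbm.xml mapping files.
        cfg.AddAssembly(typeof(ClientMapping).Assembly);   // MyCompany.MyProduct.Data.Oracle
        ISessionFactory sessionFactory = cfg.BuildSessionFactory();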

    Read the article

  • Transforming TT files in MsBuild

    - by Phill Duffy
    I need to build a DSL solution using MSBuild and want to be able to transform the TT files. I have tried the guide at http://msdn.microsoft.com/en-us/library/ee847423(VS.100).aspx but I am getting the following errors: "Failed to resolve include text for file: {0}" and also "Loading the include file '{0}' returned a null or empty string." There is a page on MSDN which lists these issues and their resolutions: http://msdn.microsoft.com/en-us/library/bb126242(VS.100).aspx but it doesn't really give me enough information to resolve the issue. One thing to note: the error reports the following path: "Error 72 Failed to resolve include text for file: C:\source\XXXXXXXX\Dsl\GeneratedCode\Dsl\ToolboxHelper.tt. Line=-1, Column=-1 Dsl", but the location of the actual TT file is C:\source\XXXXXXXX\Dsl\GeneratedCode\ToolboxHelper.tt

    Read the article

  • How to avoid clobbering files when creating a tar archive

    - by Andrew Grimm
    This question notes that it is possible to overwrite files when creating a tar archive, and I'm trying to see how to avoid that situation. Normally I'd use File Roller, but the version installed is playing up a bit (using 1.1 GB of memory), and I'm not the system administrator. I looked at --confirmation and --interactive, but those only ask me whether I want to add file x to the archive, not whether I want to overwrite an existing file. For example:

        tar --interactive -czvf innocent_text_file.txt foo*

    will ask me about each file, but is perfectly happy to overwrite innocent_text_file.txt. Is there any switch that acts like -i for cp? Note: I am asking about creating an archive, not extracting one. Clarification: What I'm worried about is accidentally doing something like this:

        tar -czvf *   # Don't do this!

    which would overwrite the first file listed in the glob. To avoid it, I want tar to complain if the first file mentioned already exists, the way

        cp -i *   # Don't do this!

    would check whether it is about to overwrite an existing file.
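    As far as I know tar has no cp -i-style switch for this, so one workaround is a small wrapper that refuses to run when the archive name already exists (a sketch):

        #!/bin/sh
        archive="backup.tar.gz"
        if [ -e "$archive" ]; then
            echo "error: $archive already exists, not overwriting" >&2
            exit 1
        fi
        tar -czvf "$archive" foo*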

    Read the article

  • How to read/write files within a kernel module?

    - by Methos
    I know all the discussions about why one should not read/write files from the kernel, and how to use /proc or netlink to do that instead. I want to read/write anyway. I have also read http://www.linuxjournal.com/article/8110 However, the problem is that 2.6.30 does not export sys_read(). Rather, it is wrapped in SYSCALL_DEFINE3. So if I use that in my module, I get the following warnings:

        WARNING: "sys_read" [xxx.ko] undefined!
        WARNING: "sys_open" [xxx.ko] undefined!

    Obviously insmod cannot load the module because linking does not happen correctly. Questions: How do I read/write within the kernel after 2.6.22 (where sys_read()/sys_open() are not exported)? In general, how do I use system calls wrapped in the macro SYSCALL_DEFINEn() from within the kernel?
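    A sketch of the approach usually suggested instead of calling the syscall symbols (untested, aimed at ~2.6.30-era kernels): use the exported VFS helpers filp_open()/vfs_read() directly, widening the address-space check with set_fs(KERNEL_DS) for the duration of the call.

        #include <linux/fs.h>
        #include <linux/uaccess.h>

        static ssize_t read_file_from_kernel(const char *path, char *buf, size_t len)
        {
            struct file *filp;
            mm_segment_t old_fs;
            loff_t pos = 0;
            ssize_t ret;

            filp = filp_open(path, O_RDONLY, 0);
            if (IS_ERR(filp))
                return PTR_ERR(filp);

            old_fs = get_fs();
            set_fs(KERNEL_DS);              /* vfs_read() expects a user-space pointer */
            ret = vfs_read(filp, buf, len, &pos);
            set_fs(old_fs);

            filp_close(filp, NULL);
            return ret;
        }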

    Read the article
