Search Results

Search found 7183 results on 288 pages for 'export to excel'.


  • C# StreamReader.ReadLine() - Need to pick up line terminators

    - by Tony Trozzo
    I wrote a C# program to read an Excel .xls/.xlsx file and output to CSV and Unicode text. I wrote a separate program to remove blank records. This is accomplished by reading each line with StreamReader.ReadLine(), and then going character by character through the string and not writing the line to output if it contains all commas (for the CSV) or all tabs (for the Unicode text). The problem occurs when the Excel file contains embedded newlines (\x0A) inside the cells. I changed my XLS to CSV converter to find these new lines (since it goes cell by cell) and write them as \x0A, and normal lines just use StreamWriter.WriteLine(). The problem occurs in the separate program to remove blank records. When I read in with StreamReader.ReadLine(), by definition it only returns the string with the line, not the terminator. Since the embedded newlines show up as two separate lines, I can't tell which is a full record and which is an embedded newline for when I write them to the final file. I'm not even sure I can read in the \x0A because everything on the input registers as '\n'. I could go character by character, but this destroys my logic to remove blank lines. Any ideas would be greatly appreciated.

    Read the article

  • VSTO Development - Key Improvements In VS2010 / .NET 4.0?

    - by dferraro
    Hi all, I am trying to make a case to my bosses for using VS2010 for an upcoming Excel Workbook VSTO application. I haven't used VSTO before but have used VBA. With 2010 just around the corner, I wanted to read about the improvements to see if it was worth using 2010 to develop this application. So far I have read that the two major improvements are easier deployment and better debugging / COM interop. I was just wondering if there is anything else I'm not aware of, or if anyone here who develops in VSTO and has used both 2008 and 2010 could help make the case / arm me with information. My bosses' main concern is deploying the .NET 4.0 runtime on the Citrix servers here... however it seems that with 3.5 we would have to deploy the VSTO runtime and PIAs, etc. So wouldn't deployment actually be easier with 2010, since installing just the 4.0 runtime beats having to install the VSTO runtime as well as the PIAs? Or is there something I'm missing here? Has anyone here deployed a VSTO app in an enterprise and can speak to this? Also, I'm trying to make the case for C# over VB.NET for this app. Does anyone know any key reasons (aside from my bias toward its syntax) why C# would be better than VB here? Are any key features lacking in VB VSTO development? I've read about the VSTO Power Tools, and one of them describes LINQ enablement of the Excel object model classes - however it says 'a set of C# classes'... Does anyone know if they literally mean C# only, so this would not work with VB.NET, or do they just mean the code is written in C#? Has anyone used these Power Tools with VB? I am going to download and play with them now, but any help is greatly appreciated. Thanks very much for any information.

    Read the article

  • Creating a (ClickOnce) setup for VSTO Outlook Add-in

    - by Ward Werbrouck
    So I created an Outlook add-in and used the ClickOnce setup to deploy it. The setup runs fine when the user is an administrator, but otherwise: no go. Running the setup with "run as..." and logging in as admin works, but then the add-in is installed under the admin, not the current user, and the add-in doesn't show up in Outlook. I tried following this guide: http://blogs.msdn.com/mshneer/archive/2008/04/24/deploying-your-vsto-add-in-to-all-users-part-iii.aspx but I get stuck at part I: http://blogs.msdn.com/mshneer/archive/2007/09/04/deploying-your-vsto-add-in-to-all-users-part-i.aspx I follow the examples and start Excel as described: "Now start the Excel application. Examine the registry keys in the HKCU hive; you will find two interesting registry keys that appear under your HKCU hive: the HKCU\Software\Microsoft\Office\TestKey registry key containing the registry value TestValue, and the HKCU\Software\Microsoft\Office\12.0\User Settings\TestPropagation registry key with a Count value set to 1." But on my machine the keys are not created... What can I try next?

    Read the article

  • OLAP Web Visualization and Reporting Recommendations

    - by Gok Demir
    I am preparing an offer for a customer. They provide weekly data to different organizations. There is a huge amount of data well suited to OLAP that needs to be visualized on the web with charts and pivot tables, and custom reports will be built by non-IT people (an easy GUI). They will enter a date range, a location, and which data columns to include, then generate a report and optionally export the data to Excel. They currently prepare reports in MS Excel with pivot tables, but they now need a better online tool to show data to their customers. The tables are huge and need drill-down functionality. My current knowledge is Spring, Flex, MySQL, Linux; I have some knowledge of PostgreSQL, MSSQL and Windows. What is the easiest way of doing this project? Do you think SSRP (which I haven't tried yet) and ASP.NET suit this kind of job better? I actually prefer open source solutions. Flex has an OLAP data grid control which does aggregation on the client side. JasperServer seems promising, but it looks like I would need the enterprise version (multiple organizations and ad hoc queries). What about a Mondrian + Flex + PostgreSQL solution? Any previous experience will be appreciated. Yes, I am confused by the options.

    Read the article

  • ASP.net download page

    - by Russel
    Hi, I have a Reports.aspx ASP.NET page that allows users to download Excel report files by clicking on several hyperlinks. When a report hyperlink is clicked, I open a new window using the JavaScript window.open method and navigate to the download.aspx page. The code-behind for the download page creates an Excel file on the fly using OpenXML (in memory) and sends it back to the browser. Here is some code from the download.aspx page: byte[] outputFileBytes = CreateExcelReport().ToArray(); Response.Clear(); Response.BufferOutput = true; Response.ContentType = "application/vnd.openxmlformats-officedocument.spreadsheetml.sheet"; Response.AddHeader("Content-Disposition", string.Format("attachment; filename={0}", "tempReport.xlsx")); Response.BinaryWrite(outputFileBytes); Response.Flush(); Response.Close(); Response.End(); My problem: some of these reports take a while to generate. I would like to display a loading.gif on my Reports.aspx page while the download.aspx page is being requested; once the request is completed, the loading.gif should be made invisible. Is there a way to achieve this, perhaps some kind of event? I have MooTools at my disposal. Thanks. PS: I know that generating reports like this is not ideal, but that's a different story altogether...

    Read the article

  • Download file in coldfusion and read its content

    - by Deepak
    I am using cfhttp with a GET to download the files. Does anyone have an example of cfhttp working? Are there special settings that need to be set up on the server side to get this tag to work? When I try the following code: <CFHTTP METHOD = "get" URL="http://data.bls.gov/PDQ/servlet/SurveyOutputServlet?series_id=LNU04032231&years_option=specific_years&to_year=2010&from_year=2009&delimiter=comma&output_view&output_format=excelTable" path="/Users/Deepak" file="testfile.xls"> nothing comes back to my computer. How do you get it to pop up the "where do you want to save the file" dialogue box? I am submitting a form in ColdFusion by hitting this link: http://data.bls.gov/PDQ/servlet/SurveyOutputServlet?series_id=LNU04032231&years_option=specific_years&to_year=2010&from_year=2009&delimiter=comma&output_view&output_format=excelTable and I get an Excel file as a result. How can I save this file on my local box? Or is it possible to read the content of the file directly, without saving it to my local box, through ColdFusion using cfftp or cfhttp? cfhttp.mimeType is application/vnd.ms-excel in this case. Thanks!!

    Read the article

  • Strange error when filling a data adapter.

    - by Tim C
    I am receiving the following error in my code (C#, .NET 3.5, VS2008) when I try to connect to an Excel sheet and fill an OleDbDataAdapter with the results of a query. First the error: Attempted to read or write protected memory. This is often an indication that other memory is corrupt. And here is the code, which is honestly pretty simple: var excelFileName = string.Format("c:/Metadata_Tool.xlsm"); var connectionString = string.Format("Provider=Microsoft.ACE.OLEDB.12.0; Data Source={0}; Extended Properties=Excel 12.0;HDR=YES;", excelFileName); var adapter = new OleDbDataAdapter("Select * FROM [Video Tagging XML]", connectionString); var ds = new DataSet(); adapter.Fill(ds, "VTX"); DataTable data = ds.Tables["VTX"]; foreach (DataRow myRow in data.Rows) { foreach (DataColumn myColumn in data.Columns) { Console.Write("\t{0}", myRow[myColumn]); } Console.WriteLine(); } Console.ReadLine(); I get the error on the line adapter.Fill(ds,"VTX");. I did find a Microsoft forum post saying to turn on JIT optimization in VS2008 from the Tools/Options/Debug/General menu, but this did not seem to help. Any help would be greatly appreciated, thanks!

    Read the article

  • Architecture Guidance Needed?

    - by vijay
    We are about to automate a number of processes for our reporting team (daily reports, weekly reports, monthly reports, etc.). Mostly the process is pulling some data from Oracle and then filling it into particular Excel template files. Each report, and therefore each template, is different from the others. Apart from the Excel file manipulation, there is hardly any business logic behind these. The client wants an integrated tool with all the automated processes placed as menus/submenus. Right now there are roughly 30 processes waiting to be automated, and we are expecting more new reports in the next quarter. I am nowhere near having any practical experience when it comes to architecture. I have already been maintaining two or three systems (more than 4 years old) for this prestigious client, and it is very likely the above-mentioned tool will be maintained for another 3 years. From my past experience I have been through the pain of implementing change requests against a rigid and undocumented code base, resulting in the breakdown of the system and eventually of myself. So my main and topmost concern is maintainability. While searching around I came across this link, Smart Clients Using CAB and SCSF - is it appropriate for my requirement? Also, should I place each automated process in a separate form under a single project, or place them in separate projects under a single solution? Please correct me if I have missed any other important information. Thanks.

    Read the article

  • Field specific errors for ETL

    - by AaronLS
    I am creating an ETL process in MS SQL Server and I would like to have errors specific to a particular column of a particular row. For example, the data is initially loaded from Excel files into a table (we'll call it the Initial table) where all columns are varchar(2000), and then I stage the data to another table (the DataTyped table) that contains more specific data types (datetime, int, etc.) or more tightly constrained varchar lengths. I need to be able to create error messages for a specific field, such as: '"Jan. 13th" is not a valid date format for the submission date. Please use a format of MM/DD/YYYY.' These error messages would need to be stored in such a way that later in the process an automated job can create reports where each message references a specific row and field (someone will need to go back, correct the data in the source system, and resubmit the Excel file). So ideally it would be inserted into a Failures table of some sort and contain the primary key of the failed row, the column name, and the error message. Question: I am wondering if this can be accomplished with SSIS, or with an open source tool like Talend, and if so, what would be your general approach? Or what hand-coded approach would you take? A couple of approaches I've thought of using SQL (up until now I have done ETL by hand in SQL procs, but I want to consider other approaches, possibly even C#): (1) Use a cursor to read through the Initial table, and for each row insert a blank record with only the primary key into the DataTyped table, then use a single update statement per column, so that if that update fails I can insert a very specific error message for that column into the error messages table. (2) Insert all the data as-is into the DataTyped table, but with duplicate columns like SubmissionDate and SubmissionDateOld. After the initial insert the *Old columns have data and the rest are blank, and I have a single update per column that sets SubmissionDate based on SubmissionDateOld. In addition to suggesting an approach, I'd like to know if you are already using that approach, or something similar, in the work you do.

    Read the article

  • I write barely functional scripts that tend to not be reusable and make the baby Jesus cry. Please help

    - by maxxpower
    I received a request to add around 100 users to a Linux box. The users are already in LDAP, so I can't just use newusers and point it at a text file. Another admin is taking care of the LDAP piece, so all I have to do is create all the home directories and chown them to the correct user once he adds the users to the box. Creating the directories isn't a problem, but I'd like a more elegant script for chowning them to the correct user. What I have currently basically looks like: chown -R testuser1:testgroup1 /home/testuser1; chown -R testuser2:testgroup2 /home/testuser2; chown -R testuser3:testgroup1 /home/testuser3. Basically I took the request with the user names and group names, popped it into Excel, added a column of "chown -R" at the front, then a column of "/", copied and pasted the username column after it, then added a column of ";" and dragged it all down to the second-to-last row. Popped it into Notepad, ran some quick find-and-replaces, and in less than a minute I had a completed request and a sad, empty feeling. I know this was a really ghetto method, and I'm trying to get away from using Excel to avoid learning new scripting techniques, so here's my real question. tl;dr: I made 100 home directories and chowned them to the correct users, but it was ugly. Actual question below. You have a file named idlist that looks like this (only with, say, 1000 users and real usernames and groups), one username and group per line: testuser1 testgroup1; testuser2 testgroup2; testuser3 testgroup1. Write a script that creates home directories for all the users and chowns the created directories to the correct user and group. To make the directories I used the following (feel free to flame/correct me on this as well): var=`cut -f1 -d" " idlist` (backticks, not apostrophes, around the cut command), then mkdir $var.
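    Since the actual question asks for a script over the idlist file, here is a minimal sketch of that loop (my own illustration, not the poster's solution) written in Python; it assumes idlist holds space-separated "username groupname" pairs, one per line, and that home directories live under /home:

    ```python
    # Hypothetical sketch: create /home/<user> for each "user group" line in idlist
    # and chown it to that user and group. Error handling is deliberately minimal.
    import os
    import shutil

    with open("idlist") as f:
        for line in f:
            parts = line.split()
            if len(parts) < 2:
                continue                      # skip blank or malformed lines
            user, group = parts[0], parts[1]
            home = os.path.join("/home", user)
            os.makedirs(home, exist_ok=True)  # create the home directory
            shutil.chown(home, user=user, group=group)
    ```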

    Read the article

  • Python DictReader - Skipping rows with missing columns?

    - by victorhooi
    heya, I have an Excel .CSV file I'm attempting to read in with DictReader. All seems to be well, except it seems to omit rows, specifically those with missing columns. Our input looks like: mail,givenName,sn,lorem,ipsum,dolor,telephoneNumber [email protected],ian,bay,3424,8403,2535,+65(2)34523534545 [email protected],mike,gibson,3424,8403,2535,+65(2)34523534545 [email protected],ross,martin,,,,+65(2)34523534545 [email protected],david,connor,,,,+65(2)34523534545 [email protected],chris,call,3424,8403,2535,+65(2)34523534545 So some of the rows have missing lorem/ipsum/dolor columns, and it's just a string of commas for those. We're reading it in with: def read_gd_dump(input_file="blah 20100423.csv"): gd_extract = csv.DictReader(open('blah 20100423.csv'), restval='missing', dialect='excel') return dict([(row['something'], row) for row in gd_extract]) And I checked that "something" (the key for our dict) isn't one of the missing columns - I had originally suspected it might be, but it's one of the columns after that. However, DictReader seems to completely skip over those rows. I tried setting restval to something, but it didn't seem to make any difference. I can't seem to find anything in Python's CSV docs (http://docs.python.org/library/csv.html) that would explain this behaviour, but I may have misread something. Any ideas? Thanks, Victor
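    For reference, a small standalone sketch (with a made-up file name, not the poster's data) of how csv.DictReader is normally expected to behave with short or empty fields - on its own it does not drop such rows:

    ```python
    import csv

    # Assumed example, not the original file:
    # - a row with *fewer* fields than the header gets restval for the missing keys
    # - a row whose fields are merely empty (",,,") comes through with '' values
    # Neither case makes DictReader itself skip the row.
    with open("example.csv", newline="") as f:
        for row in csv.DictReader(f, restval="missing", dialect="excel"):
            print(row)
    ```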

    Read the article

  • ASP.NET - I am generating an .XLS file with a DLL, how do I grant permissions for writing to file?

    - by hamlin11
    I'm generating an .XLS file with a DLL (Excel Library http://code.google.com/p/excellibrary/) I've added this DLL as a reference to my project. The code to save the .XLS to disk is running, but it's encountering a permissions issue. I've attempted to set full access for IUSRS, Network Service, and Everyone just to see if I could get it working, and none of these seems to make a difference. Here's where I'm trying to write the file: c:/temp/test1.xls Here's the error: [SecurityException: Request for the permission of type 'System.Security.Permissions.FileIOPermission, mscorlib, Version=2.0.0.0, Culture=neutral, PublicKeyToken=b77a5c561934e089' failed.] System.Security.CodeAccessSecurityEngine.Check(Object demand, StackCrawlMark& stackMark, Boolean isPermSet) +0 System.Security.CodeAccessPermission.Demand() +54 System.IO.FileStream.Init(String path, FileMode mode, FileAccess access, Int32 rights, Boolean useRights, FileShare share, Int32 bufferSize, FileOptions options, SECURITY_ATTRIBUTES secAttrs, String msgPath, Boolean bFromProxy) +2103 System.IO.FileStream..ctor(String path, FileMode mode, FileAccess access, FileShare share, Int32 bufferSize, FileOptions options, String msgPath, Boolean bFromProxy) +138 System.IO.FileStream..ctor(String path, FileMode mode, FileAccess access, FileShare share) +89 System.IO.File.Open(String path, FileMode mode, FileAccess access, FileShare share) +58 ExcelLibrary.Office.CompoundDocumentFormat.CompoundDocument.Create(String file) +88 ExcelLibrary.Office.Excel.Workbook.Save(String file) +73 CHC_Reports.LitAnalysis.CreateSpreadSheet_Click(Object sender, EventArgs e) in C:\Users\brian\Desktop\Enterprise Manager\CHC_Reports\LitAnalysis.aspx.vb:19 System.Web.UI.WebControls.Button.OnClick(EventArgs e) +115 System.Web.UI.WebControls.Button.RaisePostBackEvent(String eventArgument) +140 System.Web.UI.Page.RaisePostBackEvent(IPostBackEventHandler sourceControl, String eventArgument) +29 System.Web.UI.Page.ProcessRequestMain(Boolean includeStagesBeforeAsyncPoint, Boolean includeStagesAfterAsyncPoint) +11041511 System.Web.UI.Page.ProcessRequest(Boolean includeStagesBeforeAsyncPoint, Boolean includeStagesAfterAsyncPoint) +11041050 System.Web.UI.Page.ProcessRequest() +91 System.Web.UI.Page.ProcessRequest(HttpContext context) +240 ASP.litanalysis_aspx.ProcessRequest(HttpContext context) +52 System.Web.CallHandlerExecutionStep.System.Web.HttpApplication.IExecutionStep.Execute() +599 System.Web.HttpApplication.ExecuteStep(IExecutionStep step, Boolean& completedSynchronously) +171 Any idea what I need to do to diagnose the permissions issue and allow the file creation? Thanks.

    Read the article

  • Can any Linux API or tool watch for any change in any folder below e.g. /SharedRoot, or do I have to set up inotify for each folder?

    - by Simon B.
    I have a folder with ~10 000 subfolders. Can any Linux API or tool watch for any change in any folder below e.g. /SharedRoot, or do I have to set up inotify for each folder? (i.e. I lose if I want to do this for 10k+ folders.) I guess yes, since I've already seen examples of this inefficient method, for instance http://twistedmatrix.com/trac/browser/trunk/twisted/internet/inotify.py?rev=28866#L345 My problem: I need to keep folders time-sorted with the most recently active "project" up top. When a file changes, each folder above that file should update its last-modified timestamp to match the file. Delays are OK. When a file (typically MS Excel) is opened and closed again, its file date can jump up and then back down. For this reason I need to wait until after a file is closed, then queue the folder of that file for checking, and only a while later go and look for the newest file in its folder, since the file date of the triggering file could already be back-dated to its original timestamp by Excel or similar programs. Also, in case several files from the same folder are used/created, it makes sense to buffer the timestamping of that folder's parents so that a bunch of updates collapse into one delayed update. I'm looking for a Linux solution. I have some code that can be run on a Windows server; most of the queueing functionality is here: http://github.com/sesam/FolderdateFollowsFiles/blob/master/FolderdateFollowsFiles/Follower.vb Available APIs: the relative of inotify on Windows, ReadDirectoryChangesW, can watch a folder and its whole subtree; see bWatchSubtree on http://msdn.microsoft.com/en-us/library/aa365465(VS.85).aspx Samba? Patching the Samba source is a possibility, but perhaps there are already hooks available? Other possibilities: client-side approaches (various Windows versions), or spying on file activities in order to update folders recursively?

    Read the article

  • Panel not displaying while downloading file

    - by James123
    I wrote code to download an Excel file. When I click the download button I need to show an ajax-loader image (the pnlPopup panel), but it is not displaying - I think because of some "Response" statements (see the code below). The download works fine, but I want to show the loader panel at the same time. <asp:Panel ID="pnlPopup" runat="server" visible="false"> <div align="center" style="margin-top: 13px;"> <asp:Image runat ="server" ID="imgDownload" src="Images/ajax-loader.gif" alt="" /> <br /> <span class="updateProgressMessage">downloading ...</span> </div> Protected Sub btnDownload_Click(ByVal sender As Object, ByVal e As EventArgs) 'Handles btnDownload.Click' Try pnlPopup.Visible = True Dim mSurvey As New Survey Dim mUser As New User Dim dtExcel As DataTable mUser = CType(Session("user"), User) dtExcel = mSurvey.CreateExcelWorkbook(mUser.UserID, mUser.Client.ID) Dim filename As String = "Download.xls" InitializeWorkbook() GenerateData(dtExcel) Response.ContentType = "application/vnd.ms-excel" Response.AddHeader("Content-Disposition", String.Format("attachment;filename={0}", filename)) Response.Clear() Response.BinaryWrite(WriteToStream.GetBuffer) Response.End() Catch ex As Exception Finally End Try End Sub

    Read the article

  • Creating multiple csv files from data within a csv file.

    - by S1syphus
    System: OS X or Linux. I'm trying to automate my workflow at work: each week I receive an Excel file, which I convert to a CSV. An example is: ,,L1,,,L2,,,L3,,,L4,,,L5,,,L6,,,L7,,,L8,,,L9,,,L10,,,L11, Title,r/t,needed,actual,Inst,needed,actual,Inst,needed,actual,Inst,needed,actual,Inst,needed,actual,Inst,needed,actual,Inst,needed,actual,Inst,needed,actual,Inst,needed,actual,Inst,needed,actual,Inst,needed,actual,Inst EXAMPLEfoo,60,6,6,6,0,0,0,0,0,0,6,6,6,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0 EXAMPLEbar,30,6,6,12,6,7,14,6,6,12,6,6,12,6,8,16,6,7,14,6,7.5,15,6,6,12,6,8,16,6,0,0,6,7,14 EXAMPLE1,60,3,3,3,3,5,5,3,4,4,3,3,3,3,6,6,3,4,4,3,3,3,3,4,4,3,8,8,3,0,0,3,4,4 EXAMPLE2,120,6,6,3,0,0,0,6,8,4,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0 EXAMPLE3,60,6,6,6,6,8,8,6,6,6,6,6,6,0,0,0,0,0,0,6,8,8,6,6,6,0,0,0,0,0,0,0,10,10 EXAMPLE4,30,6,6,12,6,7,14,6,6,12,6,6,12,3,5.5,11,6,7.5,15,6,6,12,6,0,0,6,9,18,6,0,0,6,6.5,13 What I need to do is create a separate CSV file for each instance in row 1, so L1, L2, L3, L4... and each of those CSV files needs to contain the title, r/t and needed columns. So for L1 an example output would look like: EXAMPLEfoo,60,6 EXAMPLEbar,30,6 EXAMPLE1,60,3 EXAMPLE2,120,6 EXAMPLE3,60,6 EXAMPLE4,30,6 And for L2: EXAMPLEfoo,60,0 EXAMPLEbar,30,6 EXAMPLE1,60,3 EXAMPLE2,120,0 EXAMPLE3,60,6 EXAMPLE4,30,6 And so on. I have tried playing around with sed and awk and hit Google, but I have found nothing that really solves the issue. I'd imagine Perl would be particularly suited to this, or maybe Python, so I would be more than happy to accept suggestions. So, any suggestions? Thanks in advance.
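    One way to express that split is sketched below (a rough illustration with made-up input/output file names, assuming exactly the layout shown above; it is not from the original post):

    ```python
    import csv

    # Sketch: split input.csv into L1.csv, L2.csv, ..., each holding Title, r/t, needed.
    # Assumes two header rows and a needed/actual/Inst triple per Lx group, as above.
    with open("input.csv", newline="") as f:
        rows = list(csv.reader(f))

    group_row = rows[0]   # ",,L1,,,L2,..." - each Lx label sits over its "needed" column
    data_rows = rows[2:]  # skip the two header rows

    for col, label in enumerate(group_row):
        if not label.strip():
            continue
        with open(f"{label.strip()}.csv", "w", newline="") as out:
            writer = csv.writer(out)
            for row in data_rows:
                writer.writerow([row[0], row[1], row[col]])
    ```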

    Read the article

  • Fast search of a 2-dimensional array

    - by Tim
    I need a method of quickly searching a large 2-dimensional array. I extract the array from Excel, so one dimension represents the rows and the second the columns. I want to obtain a list of the rows where the columns match certain criteria, and I need to know the row number (or index into the array). For example, if I extract a range from Excel, I may need to find all rows where column A = "dog" and column B = 7 and column J "a". I only know which columns and which values to find at run time, so I can't hard-code the column index. I could use a simple loop, but is this efficient? I need to run it several thousand times, searching for different criteria each time. For r As Integer = 0 To UBound(myArray, 0) - 1 match = True For c = 0 To UBound(myArray, 1) - 1 If Not doesValueMeetCriteria(myArray(r, c)) Then match = False Exit For End If Next If match Then addRowToMatchedRows(r) Next The doesValueMeetCriteria function simply checks the value of the array element against the query requirement, e.g. column A = dog, etc. Is it more efficient to create a DataTable from the array and use the .Select method? Can I use LINQ in some way? Perhaps some form of dictionary or hashtable? Or is the simple loop the most efficient? Your suggestions are most welcome.

    Read the article

  • Change Powerpoint chart data with .NET

    - by mc6688
    I have a PowerPoint template that contains one slide, and on that slide is a chart. I'd like to be able to manipulate that chart's data using .NET. So far I have code that: unzips the PowerPoint file; unzips the embedded Excel file (ppt\embeddings\Microsoft_Office_Excel_Worksheet1.xlsx); successfully manipulates the data in the Excel sheet and zips it back up; opens and manipulates ppt\charts\chart1.xml; and zips the PowerPoint back up and delivers it to the user. The result is a PowerPoint file that shows a blank chart, but when I click on the chart and go to edit data, it updates the data and shows the correct chart. I believe my problem is with the chart1.xml that I am generating. I have compared my generated version with a version created by PowerPoint and they are almost identical; the only differences are in the values for <c:crossAx> and <c:axId>. There are also some rounding differences in the data, but I don't feel that would result in a blank chart. Is there another file that I need to edit? Does anyone have any ideas as to what else I should try to get this working?

    Read the article

  • Can't send xml file from Web Application on Internet Explorer

    - by nCdy
    The error is something like this: "Can't load Items.aspx from 192.168.0.172", and the message text is "Can't open this web-site. It can't be found. Try later." The code: HttpContext.Current.Response.Clear(); HttpContext.Current.Response.Cache.SetCacheability(HttpCacheability.NoCache); HttpContext.Current.Response.Charset = System.Text.Encoding.Unicode.EncodingName; HttpContext.Current.Response.ContentEncoding = System.Text.Encoding.Unicode; HttpContext.Current.Response.BinaryWrite(System.Text.Encoding.Unicode.GetPreamble()); HttpContext.Current.Response.ContentType = "application/vnd.ms-excel"; HttpContext.Current.Response.AddHeader( "content-disposition", string.Format( "attachment; filename={0}",fileName)); .... table.RenderControl(htw); HttpContext.Current.Response.Write(sw.ToString()); HttpContext.Current.Response.End(); The trouble with this file happens only in Internet Explorer (it works in Opera / Firefox...), and it works as plain HTML when the line HttpContext.Current.Response.ContentType = "application/vnd.ms-excel"; is removed.

    Read the article

  • Read/Write/Find/Replace huge csv file

    - by notapipe
    I have a huge (4.5 GB) csv file. I need to perform basic cut-and-paste and replace operations on some columns. The data is pretty well organized; the only problem is that I cannot work with it in Excel because of the size (2000 rows, 550000 columns). Here is part of the data: ID,Affection,Sex,DRB1_1,DRB1_2,SENum,SEStatus,AntiCCP,RFUW,rs3094315,rs12562034,rs3934834,rs9442372,rs3737728 D0024949,0,F,0101,0401,SS,yes,?,?,A_A,A_A,G_G,G_G D0024302,0,F,0101,7,SN,yes,?,?,A_A,G_G,A_G,?_? D0023151,0,F,0101,11,SN,yes,?,?,A_A,G_G,G_G,G_G I need to: remove the 4th, 5th, 6th, 7th, 8th and 9th columns; find every _ character from column 10 onwards and replace it with a space ( ) character; replace every ? with zero (0); replace every comma with a tab; remove the first row (the one with the column names); replace every 0 with 1, every 1 with 2 and every ? with 0 in the 2nd column; and replace F with 2, M with 1 and ? with 0 in the 3rd column, so that in the resulting file the output reads: D0024949 1 2 A A A A G G G G D0024302 1 2 A A G G A G 0 0 D0023151 1 2 A A G G G G G G (both input and output should have one line per row, with no extra blank rows). Is there a memory-efficient way of doing this in Java (and I need code to do it), or a usable tool for working with data this large so that I can easily apply Excel-like functionality?
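    The question asks for Java, but the shape of a memory-efficient solution is the same in any language: stream the file one line at a time instead of loading it all. A rough sketch of that approach in Python (the column positions are my reading of the description above; this is not from the original post):

    ```python
    # Stream the 4.5 GB file line by line; never hold more than one row in memory.
    sex_map = {"F": "2", "M": "1", "?": "0"}        # 3rd column recoding
    affection_map = {"0": "1", "1": "2", "?": "0"}  # 2nd column recoding

    with open("input.csv") as src, open("output.txt", "w") as dst:
        next(src)                                   # drop the header row
        for line in src:
            fields = line.rstrip("\n").split(",")
            fields = fields[:3] + fields[9:]        # drop columns 4-9 (1-indexed)
            fields[1] = affection_map.get(fields[1], fields[1])
            fields[2] = sex_map.get(fields[2], fields[2])
            genotypes = [g.replace("_", " ").replace("?", "0") for g in fields[3:]]
            dst.write("\t".join(fields[:3] + genotypes) + "\n")
    ```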

    Read the article

  • SSIS String or binary data would be truncated. The statement has been terminated.

    - by Subbarao
    When I run SSIS package from BIDS it runs fine without any error / problem. When I try to call it through a ASP.NET website I get the following error - "String or binary data would be truncated. The statement has been terminated." I checked all the columns / data to see if anything is exceeding the limit, everything is fine. I can run the package through command line using dtexec C:dtexec /f "C:\temp\MyTempPackage.dtsx", it executes without any problem . The problem is when I try to run it through ASP.NET. The following is the code that I am trying to use - //DTS Runtime Application Application app = new Application(); //DTS Package Package package = app.LoadPackage(packagePath, null); //Execute and Get the result DTSExecResult result = package.Execute(); I am making a call to a webservice from asp.net which has the above code. Both the webservice and website have identity impersonation enabled. I have identity enabled in my web.config for this <identity impersonate="true" userName="MyUserName" password="MyPassword"/> This problem is only when I am trying to import a Excel file (.xlsx) when I import a .txt file everything is fine. Excel Import blew up in both 32bit and 64bit enviornments. Help on how to make this to work is greatly appreciated.

    Read the article

  • Javascript not registering/executing after button click

    - by rs
    I have a textbox which I convert to a TinyMCE textbox using the tinymce.js file. I'm trying to add a new script to the page, with database fields substituted in based on the selection made in a dropdownlist, using RegisterStartupScript. In page load: protected sub page_load(byVal sender as object, byval e as EventArgs) handles me.load AddScriptToPage() if not page.ispostback() then session("fname") = nothing end if end sub Script method: Sub AddScriptToPage() dim _script = " c.onRenderMenu.add(function(c, m) { " & _ add_columns() & _ "}); " ClientScript.RegisterStartupScript(Me.GetType(), keyword,_script) End Sub function add_columns() as string dim ret_string = String.empty select case ddlList.selectedvalue case 1 sqlcommand = "xyz" case 2 sqlcommand = "xyz" case 3 //Load from excel sqlcommand = nothing end select if sqlcommand isnot nothing then executereader() while reader.read() ret_string = ret_string & reader("column") end while elseif session("fname") isnot nothing dt = getdatatable(session("fname").tostring()) //method that opens oledb connection and returns columns for each dr in dt.rows ret_string = ret_string & dr("column") next end if add_columns = ret_string end function When the Excel file is uploaded to the server this event is called: protected sub btnclick(byVal sender as object, byval e as EventArgs) if hasfile then 'i have custom properties set after control validation 'Upload file 'set session value with filepath session("fname") = filepath 'call js register event AddScriptToPage() end if end sub When I click the upload button I'm calling AddScriptToPage, and when I debug it, it hits AddScriptToPage and executes it, but the script doesn't show up on the page. It should show column names in the TinyMCE editor but it doesn't. However, when I click another button on the same page it does show up (because the session has the filename). On upload button click the page flow is: page load - AddScriptToPage is executed (registers the script with no columns); then AddScriptToPage in the btnclick event is executed (registers the script with columns). Why is this script not getting registered even though it is executed in the VB code on the upload click event? And why does it show up after another button click but not the original button click that calls this event?

    Read the article

  • Capacity Allocation

    - by user1708730
    I am new to VB in Excel. I have a unique requirement for capacity allocation which I want to automate using Excel VB, and I am having a hard time doing so; I hope you can help. The objective is to maximize profit by allocating capacity first to those products which have the highest profit potential. Every month I get demand along with the backlog of the previous month. I need to allocate capacity to the previous month's backlog first, and only the remaining capacity to fresh demand. There are two primary constraints: 1. the number of working days in a month (variable); 2. not all products can be made on every production line, and the output of the same product may be different for each production line. Also, there will be losses whenever there is a changeover from one SKU to another, depending on the variant type and size of the next product: if there is a variant change then 8 hours of production loss needs to be accounted for, and 4 hours in case of a size change (8 hours in case of both). I have attached sample data https://rapidshare.com/files/1822719405/Sample%20Data.xlsx?bin=1 Thanks in advance for your help!

    Read the article

  • PHP: Check if 0?

    - by tarnfeld
    Hi, I am using a class which returns the value of a particular row and cell of an Excel spreadsheet. To build up an array of one column I count the rows, loop through that number with a for() loop, and use $array[] = $value to append each value to the array. This works great as long as none of the values in a cell are 0. The class itself returns the number 0, so it's nothing to do with the class; I think it's the way I am looping through the rows and then assigning them to the array... I want to carry the 0 values through because I am creating graphs with the data afterwards. Here is the code I have: // Get Rainfall $rainfall = array(); for($i=1;$i<=$count;$i++) { if($data->val($i,2) != 'Rainfall') // Check if not the column title { $rainfall[] = $data->val($i,2); } } For your info, $data is the Excel spreadsheet object and the method $data->val(row,col) is what returns the value. In this case I am getting data from column 2. Screenshot of spreadsheet: http://cl.ly/1Dmy Thanks! All help is very much appreciated!

    Read the article

  • What are your best practices for ensuring the correctness of the reports from SQL?

    - by snezmqd4
    Part of my work involves creating reports and data from SQL Server to be used as information for decision-making. The majority of the data is aggregated - inventory, sales and cost totals by department, and other dimensions. When I am creating the reports, and more specifically when I am developing the SELECTs to extract the aggregated data from the OLTP database, I worry about getting a JOIN or a GROUP BY wrong and, for example, returning incorrect results. I try to use some "best practices" to keep myself from generating wrong numbers: when creating an aggregated data set, always explode the data set without the aggregation and look for any obvious error; export the exploded data set to Excel and compare the SUM(), AVG(), etc. from SQL Server and Excel; involve the people who will use the information and ask for some validation (ask people to help identify mistakes in the numbers); never deploy these things in the afternoon - when possible, look at the T-SQL again the next morning with a refreshed mind. I have caught many bugs using these simple procedures, but even so, I always worry about the numbers. What are your best practices for ensuring the correctness of reports?

    Read the article

  • MATLAB, time match filter

    - by Paul
    OK, I am still getting the hang of MATLAB. I have two files in different formats. One is an Excel file, data1.xls, size 86400 x 62. It looks like: Date/Time par1 par2 par3 par4 par5 par6 par6 par7 par8 par9 08/02/09 00:06:45 0 3 27 9.9 -133.2 0 0 0 1 0 The other file, data2.csv, is size 144 x 27 (if nothing is missing). It looks like: date time P01 P02 P03 P04 P05 P06 P07 P08 P09 P10 P11 8/16/2009 0:00 51 45 46 54 53 52 524 5 399 89 78 I am using Data10minAvg = mean(reshape(Data,300,144,62)); to get the 10-minute averages of the first Excel file. Now I need to match up the file I am making above with the .csv file. The problem is that many timestamps are missing in the .csv file. How do I turn data2.csv into a file of size 144 x 27, replacing the missing timestamps with rows of zeros? That will make it much easier to compare the data1.xls file with newdata2.csv.

    Read the article
