Search Results

Search found 67192 results on 2688 pages for 'excel external data'.


  • How to keep format within HTML converted to Excel

    - by pojomx
    Hello, I'm working with an HTML table that contains formatted numbers, and when I export it to an .xls file (by just changing the extension) I lose some of the formatting. For example, in HTML I have "1,000.00 | 500.00 | 20.00", but in Excel it shows as "1,000.00 | 500 | 20". Is it possible to make Excel show exactly the same format as the HTML? Thank you.
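    One trick that often preserves the decimals (a sketch, not guaranteed across every Excel version) is to emit an mso-number-format style on each cell, which Excel honors when opening the HTML; the cell values and format string below are illustrative:

        // Minimal sketch: write table cells with the mso-number-format CSS
        // extension so Excel keeps two decimal places when it opens the file.
        using System.IO;

        class HtmlToXlsSketch
        {
            static void Main()
            {
                decimal[] values = { 1000.00m, 500.00m, 20.00m };
                using (var w = new StreamWriter("report.xls"))
                {
                    w.WriteLine("<table><tr>");
                    foreach (var v in values)
                        w.WriteLine($"<td style=\"mso-number-format:'#,##0.00'\">{v}</td>");
                    w.WriteLine("</tr></table>");
                }
            }
        }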

    Read the article

  • Excel Merge() vs MergeCells

    - by sleepp
    Hi, I'm using VSTO, C#, and Excel but VBA probably applies here as well. What's the difference between calling the Merge(missing) method on a range and setting the MergeCells property to true? Does Merge() fail more often? Thanks!
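    For reference, a minimal interop sketch showing the two calls side by side (assuming an Excel.Range from VSTO; for a simple merge they appear to do the same thing):

        using System;
        using Excel = Microsoft.Office.Interop.Excel;

        static class MergeSketch
        {
            // Merge() takes an optional "across" argument (true merges each
            // row of the range separately); MergeCells = true is the
            // property form of a plain Merge(false).
            public static void MergeBothWays(Excel.Range range)
            {
                range.Merge(Type.Missing);  // same as Range.Merge in VBA
                // ...or, equivalently for a simple merge:
                range.MergeCells = true;
            }
        }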

    Read the article

  • Reading Excel file with C# - Choose sheet

    - by Dänu
    Hey guys, I'm reading an Excel file with C# and OleDb (12.0), where I have to specify the name of the sheet I wish to read in the SELECT statement ([Sheet1$]): this.dataAdapter = new OleDbDataAdapter("SELECT * FROM [Sheet1$]", connectionString); Is it possible to select the first sheet, no matter what its name is? Thank you.
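    One way to do this (a sketch; note the provider lists sheets alphabetically rather than in tab order, and named ranges also appear in the list) is to ask the connection for its schema table and take the first TABLE_NAME:

        using System.Data;
        using System.Data.OleDb;

        static class FirstSheetSketch
        {
            // Asks the provider for the workbook's sheet list instead of
            // hard-coding "Sheet1$".
            public static DataTable ReadFirstSheet(string connectionString)
            {
                using (var conn = new OleDbConnection(connectionString))
                {
                    conn.Open();
                    DataTable schema = conn.GetOleDbSchemaTable(
                        OleDbSchemaGuid.Tables, null);
                    // TABLE_NAME already carries the trailing '$', e.g. "Sheet1$"
                    string firstSheet = schema.Rows[0]["TABLE_NAME"].ToString();

                    var table = new DataTable();
                    new OleDbDataAdapter(
                        "SELECT * FROM [" + firstSheet + "]", conn).Fill(table);
                    return table;
                }
            }
        }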

    Read the article

  • MS Excel connection with vb.net

    - by Noreen
    I have used the connection string below, but I am getting an error when trying to create a table: Dim ConnString As String = "Provider=Microsoft.ACE.OLEDB.12.0;Data Source=" & strFName & _ ";Extended Properties=""Excel 12.0 Xml;HDR=YES;IMEX=1""" The error is: Cannot modify the design of table 'tablename'. It is in a read-only database.
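    The usual culprit here is IMEX=1, which tells the driver to open the workbook in import mode, and import mode is read-only; dropping it is the first thing to try when writing. A C# sketch of a writable connection (the path and table layout are illustrative):

        using System.Data.OleDb;

        static class WritableExcelSketch
        {
            public static void CreateSheet(string path)
            {
                // Note: no IMEX=1 -- that setting forces read-only access.
                string connString =
                    "Provider=Microsoft.ACE.OLEDB.12.0;Data Source=" + path +
                    ";Extended Properties=\"Excel 12.0 Xml;HDR=YES\"";

                using (var conn = new OleDbConnection(connString))
                {
                    conn.Open();
                    // CREATE TABLE adds a new worksheet named "tablename"
                    new OleDbCommand(
                        "CREATE TABLE [tablename] ([Id] INT, [Name] VARCHAR(255))",
                        conn).ExecuteNonQuery();
                }
            }
        }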

    Read the article

  • List of objects to Excel Spreadsheet?

    - by Max Fraser
    Anyone know of code out there that does this already? I have a bunch of pages with data grids on them in an admin website, and the users want to export them to Excel. I was hoping someone had this written already; if not, I'll post mine when I am done.
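    Until a polished library turns up, a minimal reflection-based sketch gets most of the way there by writing the objects to CSV, which Excel opens directly (no escaping of commas or quotes in values, so purely illustrative):

        using System.Collections.Generic;
        using System.IO;
        using System.Linq;

        static class ObjectsToCsvSketch
        {
            // Writes public properties of each object as one CSV row,
            // with a header row of property names.
            public static void Write<T>(IEnumerable<T> items, string path)
            {
                var props = typeof(T).GetProperties();
                using (var w = new StreamWriter(path))
                {
                    w.WriteLine(string.Join(",", props.Select(p => p.Name)));
                    foreach (var item in items)
                        w.WriteLine(string.Join(",",
                            props.Select(p => p.GetValue(item, null))));
                }
            }
        }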

    Read the article

  • Updating data source on multiple pivot tables within Excel

    - by phrenetic
    Is there an easy way to update the data source for multiple pivot tables on a single Excel sheet at the same time? All of the pivot tables reference the same named range, but I need to create a second worksheet that has the same pivot tables, but accessing a different named range. Ideally I would like to be able to do some kind of search and replace operation (like you can do on formulae), rather than updating each individual pivot table by hand. Any suggestions?
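    There is no formula-style search and replace for pivot sources as far as I know, but a short loop comes close. A VSTO/interop sketch (all names are illustrative; handing every pivot the same fresh cache also means they share grouping state):

        using Excel = Microsoft.Office.Interop.Excel;

        static class RepointPivotsSketch
        {
            // Points every pivot table on a worksheet at a new named range
            // by handing them all a fresh pivot cache built over that range.
            public static void RepointAll(Excel.Worksheet sheet, string newRangeName)
            {
                var book = (Excel.Workbook)sheet.Parent;
                Excel.PivotCache cache = book.PivotCaches().Create(
                    Excel.XlPivotTableSourceType.xlDatabase, newRangeName);

                foreach (Excel.PivotTable pivot in
                         (Excel.PivotTables)sheet.PivotTables())
                {
                    pivot.ChangePivotCache(cache);
                    pivot.RefreshTable();
                }
            }
        }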

    Read the article

  • Import excel file into sql express 2008

    - by ck
    Hi, I've got some Excel files that were exported from tables in Access, and I want to import them into SQL Express 2005. I need a script that will convert nvarchar(255) columns to varchar(255) and preserve links when importing the data into SQL Express. Thanks
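    One pattern that handles the nvarchar-to-varchar conversion implicitly (a sketch; the connection strings, sheet, and table names are assumptions) is to create the target table with varchar(255) columns up front and bulk copy each sheet into it, letting SQL Server convert on insert:

        using System.Data;
        using System.Data.OleDb;
        using System.Data.SqlClient;

        static class ExcelToSqlSketch
        {
            // Reads one sheet and bulk-copies it into a target table whose
            // columns were declared varchar(255); SQL Server converts the
            // Unicode values on insert.
            public static void Import(string xlsPath)
            {
                string excelConn =
                    "Provider=Microsoft.ACE.OLEDB.12.0;Data Source=" + xlsPath +
                    ";Extended Properties=\"Excel 12.0;HDR=YES;IMEX=1\"";
                var table = new DataTable();
                new OleDbDataAdapter("SELECT * FROM [Sheet1$]", excelConn).Fill(table);

                string sqlConn = @"Server=.\SQLEXPRESS;Database=MyDb;Integrated Security=true";
                using (var bulk = new SqlBulkCopy(sqlConn))
                {
                    bulk.DestinationTableName = "dbo.ImportedTable";
                    bulk.WriteToServer(table);
                }
            }
        }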

    Read the article

  • Printer setup in Excel Visual Basic

    - by Gina
    I am trying to designate a cell in Excel where the user types the name of the printer the printout should go to, and then use that value in Application.ActivePrinter = (the cell value). Even though I have assigned a name to the cell and read its value into a variable, I am getting an error. I have declared my variable as String, Text, Object, and Variant already and it's not working. What code should I use to make this work?
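    A likely cause, independent of the variable's type: Application.ActivePrinter generally wants the full device string including the port suffix (for example "HP LaserJet 4 on Ne01:"), so assigning a bare printer name raises an error. An interop sketch of the cell-driven assignment (the cell address and printer string are illustrative):

        using Excel = Microsoft.Office.Interop.Excel;

        static class ActivePrinterSketch
        {
            public static void SetPrinterFromCell(Excel.Application app,
                                                  Excel.Worksheet sheet)
            {
                // The cell must hold the full device string, e.g.
                // "HP LaserJet 4 on Ne01:", not just the printer name.
                string printer = sheet.Range["A1"].Value2.ToString();
                app.ActivePrinter = printer;
            }
        }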

    Read the article

  • Excel hyperlink

    - by developer
    Hi all, I have a column in Excel in which every cell contains a website URL as plain text. There are about 200 entries, all with different URLs. Is there a way to turn all of them into active hyperlinks without writing a macro?
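    One macro-free approach (assuming the URLs sit in column A, starting at A1) is a helper column using Excel's HYPERLINK function, filled down the 200 rows:

        =HYPERLINK(A1)
        =HYPERLINK(A1, "open site")

    The first form displays the URL itself as a clickable link; the second shows friendly display text instead.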

    Read the article

  • Excel VBA SQL Data

    - by user307655
    Hi all, I have a small Excel program and I would like to be able to use it to update a SQL table. What would be the function to, say, update line 2 in SQL table Test in database ABC? Thanks
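    As far as I know there is no single built-in function for this; the usual route is an UPDATE statement through ADO.NET. A minimal C# sketch (the column names, key, and connection string are assumptions for illustration):

        using System.Data.SqlClient;

        static class UpdateRowSketch
        {
            // Updates one row of table Test in database ABC.
            public static void UpdateRow(int id, string newValue)
            {
                string connString =
                    @"Server=.\SQLEXPRESS;Database=ABC;Integrated Security=true";
                using (var conn = new SqlConnection(connString))
                using (var cmd = new SqlCommand(
                    "UPDATE Test SET SomeColumn = @value WHERE Id = @id", conn))
                {
                    cmd.Parameters.AddWithValue("@value", newValue);
                    cmd.Parameters.AddWithValue("@id", id);
                    conn.Open();
                    cmd.ExecuteNonQuery();
                }
            }
        }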

    Read the article

  • Best way to provide an Excel-like web page?

    - by pipelinecache
    Hi everyone, is there any way to show a grid, or something like Excel, in a web page? I have seen a lot of information, but how do you feel about this? Are there grids, or third-party grids, that provide these features? Thanks; I would really like to hear your opinions and ideas about this feature I want to provide in a web page.

    Read the article

  • External File Upload Optimizations for Windows Azure

    - by rgillen
    [Cross posted from here: http://rob.gillenfamily.net/post/External-File-Upload-Optimizations-for-Windows-Azure.aspx]

    I'm wrapping up a bit of the work we've been doing on data movement optimizations for cloud computing, and the latest set of data yielded some interesting points I thought I'd share. The work done here is not really rocket science but may, in some ways, be slightly counter-intuitive and therefore seemed worthy of posting.

    Summary: for those who don't like to read detailed posts or don't have time, the synopsis is that if you are uploading data to Azure, block your data (even down to 1MB) and upload in parallel. Set your block size based on your source file size, but if you must choose a fixed value, use 1MB. Following the above will result in significant performance gains: upwards of 10x-24x, and a reduction in overall file transfer time of upwards of 90% (e.g., uploading a 1GB file averaged 46.37 minutes prior to optimizations and 1.86 minutes afterwards).

    Detail: For those of you who want more detail, or think that the claims at the end of the preceding paragraph are over-reaching, what follows is information and code supporting those claims. As the title indicates, these tests were run from our research facility pointing to the Azure cloud (specifically US North Central, as it is physically closest to us) and do not represent intra-cloud results. (We have performed intra-cloud tests; the overall results are similar in notion, but the data rates and the tipping points for the various block sizes are significantly different. That will be detailed separately.)

    We started by building a very simple console application that would loop through a directory and upload each file to Azure storage. This application used the shipping storage client library from the 1.1 version of the Azure tools. The only real variation from the client library is that we added code to collect and record the duration (in ms) and size (in bytes) of each file transferred. The code is available here.

    We then created a directory holding a collection of files of the following sizes: 2KB, 32KB, 64KB, 128KB, 512KB, 1MB, 5MB, 10MB, 25MB, 50MB, 100MB, 250MB, 500MB, 750MB, and 1GB (50 files for each size listed). These files contained randomly-generated binary data and do not benefit from compression (a separate discussion topic). Our file generation tool is available here.

    The baseline was established by running the application described above against the directory containing all of the data files. The application uploads the files in random order so as to avoid transferring all of the files of a given size sequentially, thereby spreading the effects of periodic Internet delays across the collection of results. We then ran some scripts to split the resulting data and generate some reports. The raw data collected for our non-optimized tests is available via the links in the Related Resources section at the bottom of this post.

    For each file size, we calculated the average upload time (and standard deviation) and the average transfer rate (and standard deviation). As you are likely aware, transferring data across the Internet is susceptible to many transient delays which can cause anomalies in the resulting data. It is for this reason that we randomized the order of source file processing and executed the tests 50x for each file size. We expect that these steps yield a sufficiently balanced set of results.
    Once the baseline was collected and analyzed, we updated the test harness application with some methods to split the source file into user-defined block sizes and then upload those blocks in parallel (using the PutBlock() method of Azure storage). The parallelization was handled by simply relying on the Parallel Extensions to .NET to provide a Parallel.For loop (see the linked source for specific implementation details in Program.cs, line 173 and following; less than 100 lines total). Once all of the blocks were uploaded, we called PutBlockList() to assemble/commit the file in Azure storage. For each block transferred, the MD5 was calculated and sent, ensuring that the bits that arrived matched what was intended. The timer for the blocked/parallelized transfer method wraps the entire process (source file splitting, block transfer, MD5 validation, file committal). A simplified code sketch of this process appears just before the summary below.

    [Diagram: the blocked/parallelized upload process]

    We then tested the effects of blocking and parallelizing the transfers by running the updated application against the same source set, performing a parameter sweep on the block size across 256KB, 512KB, 1MB, 2MB, and 4MB (our assumption was that anything lower than 256KB wasn't worth the trouble, and 4MB is the maximum size of a block supported by Azure). The raw data for the parallel tests is available via the links in the Related Resources section at the bottom of this post. This data was processed and then compared against the single-threaded / non-optimized transfer numbers, and the results were encouraging. The Excel version of the results is available here.

    Two semi-obvious points need to be made prior to reviewing the data. The first is that if the block size is larger than the source file size, you will end up with a "negative optimization" due to the overhead of attempting to block and parallelize. The second is that as the files get smaller, the clock-time cost of blocking and parallelizing (overhead) is more apparent and can tend towards negative optimizations. For this reason (as supported by the raw data provided in the linked worksheet), the charts and discussion below ignore source file sizes less than 1MB.

    [Chart: transfer rate improvement by block size for each source file size]

    The chart above illustrates some interesting points about the results:

      • When the block size is smaller than the source file, performance increases; but as the block size approaches and then passes the source file size, you see decreasing benefit to the point of negative gains (see the values for the 1MB file size).
      • For some of the moderately-sized source files, small blocks (256KB) are best.
      • As the size of the source file gets larger (see values for 50MB and up), the smallest block size is not the most efficient (presumably due, at least in part, to the increased number of blocks, the increased number of individual transfer requests, and the reassembly/committal costs).
      • Once you pass the 250MB source file size, the difference in rate between 1MB and 4MB blocks is more or less constant.
      • The 1MB block size gives the best average improvement (~16x), but the optimal approach would be to vary the block size based on the size of the source file.

    [Chart: the same data with the axes swapped; the x-axis is file size and the series show improvement by block size]

    The above is another view of the same data as the prior chart, just with the axes changed. It again highlights that the 1MB block size is probably the best overall choice, while showing the benefits of some of the other block sizes at particular source file sizes.

    [Chart: change in total upload duration by block size for each source file size]

    This last chart shows the change in total duration of the file uploads for the different block sizes across the source file sizes. Nothing is really new here, other than that this view of the data highlights the negative effects of poorly choosing a block size for smaller files.
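    Before the summary, here is a minimal sketch of the blocked/parallelized upload described above. It is illustrative rather than a copy of our test harness (see the linked source for that), assumes the 1.x StorageClient-era API (CloudBlockBlob.PutBlock / PutBlockList), and uses an arbitrary block-naming scheme:

        using System;
        using System.IO;
        using System.Security.Cryptography;
        using System.Threading.Tasks;
        using Microsoft.WindowsAzure.StorageClient;

        static class ParallelBlockUploadSketch
        {
            // Splits a local file into fixed-size blocks, uploads the blocks
            // in parallel with PutBlock (sending each block's MD5), then
            // commits the ordered block list with PutBlockList.
            public static void Upload(CloudBlockBlob blob, string path, int blockSize)
            {
                long fileSize = new FileInfo(path).Length;
                int blockCount = (int)((fileSize + blockSize - 1) / blockSize);

                // Block IDs must be base64 strings of equal length within a blob.
                var blockIds = new string[blockCount];
                for (int i = 0; i < blockCount; i++)
                    blockIds[i] = Convert.ToBase64String(BitConverter.GetBytes(i));

                Parallel.For(0, blockCount, i =>
                {
                    // Each iteration opens its own stream and seeks to its block.
                    var buffer = new byte[blockSize];
                    int read;
                    using (var fs = File.OpenRead(path))
                    {
                        fs.Seek((long)i * blockSize, SeekOrigin.Begin);
                        read = fs.Read(buffer, 0, blockSize);
                    }
                    using (var md5 = MD5.Create())
                    using (var ms = new MemoryStream(buffer, 0, read))
                    {
                        string hash = Convert.ToBase64String(
                            md5.ComputeHash(buffer, 0, read));
                        blob.PutBlock(blockIds[i], ms, hash);
                    }
                });

                // Commit the blocks in file order to assemble the blob.
                blob.PutBlockList(blockIds);
            }
        }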
    Summary: What we have found so far is that blocking your file uploads and uploading the blocks in parallel results in significant performance improvements. Further, utilizing extension methods and the Task Parallel Library (.NET 4.0) makes short work of altering the shipping client library to provide this functionality while minimizing the amount of change to existing applications that might be using the client library for other interactions.

    Related Resources:

      • Source code for the upload test application
      • Source code for the random file generator
      • OData feed of raw data from the non-optimized transfer tests (experiment metadata, experiment datasets, and raw data for the 2KB, 32KB, 64KB, 128KB, 256KB, 512KB, 1MB, 5MB, 10MB, 25MB, 50MB, 100MB, 250MB, 500MB, 750MB, and 1GB uploads)
      • OData feeds of raw data from the blocked/parallelized transfer tests (experiment metadata, experiment datasets, and raw data for the 256KB, 512KB, 1MB, 2MB, and 4MB blocks)
      • Excel worksheet showing the summarizations and comparisons

    Read the article

  • Printing an external file in its own printing routine

    - by Yawn.
    I basically have an application that generates reports as .html files; I use HTML for the ease of making tables and formatting text. Now I would like to add a way to print the reports from my program. Because I use an .html file, the formatting would not be correct if I printed it directly from my application (as far as I know). For this reason, I would like to print it just as my web browser would, in order to keep the tabular data and text formatting intact. Does anyone know a way to do this? Thank you.
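    One way to get browser-quality printing from a WinForms application (a sketch; it assumes a message loop is running, since the WebBrowser control needs one, and the file path is illustrative) is to load the report into an off-screen WebBrowser control and print once it has rendered:

        using System.Windows.Forms;

        static class PrintHtmlSketch
        {
            // Lets the browser's HTML engine lay out the tables and text,
            // then sends the rendered page to the default printer.
            public static void PrintReport(string htmlPath)
            {
                var browser = new WebBrowser();
                browser.DocumentCompleted += (s, e) =>
                {
                    browser.Print();
                    browser.Dispose();
                };
                browser.Navigate(htmlPath);
            }
        }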

    Read the article

  • Excel import to SQL returning NULL for decimals when in VARCHAR data type

    - by Daniel
    Hi, I am working on a piece of software which has grown exponentially over the last few years, and the database needs to be regularly updated. Customers are now providing us with data on large spreadsheets, which we format and will start importing into the database. I am using the Import and Export Data (32-bit) wizard. One column in the database contains values like '1.1.1.2', and I am importing them as varchar, as that is the data type in the database. However, for values like '8.5', NULL is getting imported instead. It only occurs when there is one decimal point. Is this a formatting error with Excel, or is it the wrong data type?
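    What is likely happening (a guess the wizard makes hard to confirm): the Jet/ACE Excel driver samples the first several rows of each column (the TypeGuessRows registry setting, 8 by default) to guess a type, and values that do not match the guessed type are read as NULL. Forcing mixed columns to text with IMEX=1 usually avoids this; a minimal C# sketch of reading the sheet that way (the file path and sheet name are illustrative):

        using System.Data;
        using System.Data.OleDb;

        static class MixedTypesSketch
        {
            // With IMEX=1 (and the driver's ImportMixedTypes=Text default),
            // mixed columns come through as text, so '8.5' and '1.1.1.2'
            // both survive instead of one being read as NULL.
            public static DataTable ReadSheet(string xlsPath)
            {
                string conn =
                    "Provider=Microsoft.ACE.OLEDB.12.0;Data Source=" + xlsPath +
                    ";Extended Properties=\"Excel 12.0;HDR=YES;IMEX=1\"";
                var table = new DataTable();
                new OleDbDataAdapter("SELECT * FROM [Sheet1$]", conn).Fill(table);
                return table;
            }
        }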

    Read the article
