Search Results

Search found 31421 results on 1257 pages for 'software performance'.


  • Slow performance when utilizing Interop.DSOFile to search files by Custom Document Property

    - by Gradatc
    I am new to the world of VB.NET and have been tasked to put together a little program to search a directory of about 2000 Excel spreadsheets and put together a list to display, based on the value of a Custom Document Property within each spreadsheet file. Given that I am far from a computer programmer by education or trade, this has been an adventure. I've gotten it to work, and the results are fine. The problem is, it takes well over a minute to run. It is being run over a LAN connection. When I run it locally (using a 'test' directory of about 300 files) it executes in about 4 seconds. I'm not even sure what to expect as a reasonable execution speed, so I thought I would ask here. The code is below, in case anyone thinks changes there might be of use in speeding things up. Thank you in advance!

        Private Sub listByPt()
            Dim di As New IO.DirectoryInfo(dir_loc)
            Dim aryFiles As IO.FileInfo() = di.GetFiles("*" & ext_to_check)
            Dim fi As IO.FileInfo
            Dim dso As DSOFile.OleDocumentProperties
            Dim sfilename As String
            Dim sheetInfo As Object
            Dim sfileCount As String
            Dim ifilesDone As Integer
            Dim errorList As New ArrayList()
            Dim ErrorFile As Object
            Dim ErrorMessage As String

            'Initialize progress bar values
            ifilesDone = 0
            sfileCount = di.GetFiles("*" & ext_to_check).Length
            Me.lblHighProgress.Text = sfileCount
            Me.lblLowProgress.Text = 0
            With Me.progressMain
                .Maximum = di.GetFiles("*" & ext_to_check).Length
                .Minimum = 0
                .Value = 0
            End With

            'Loop through all files in the search directory
            For Each fi In aryFiles
                dso = New DSOFile.OleDocumentProperties
                sfilename = fi.FullName
                Try
                    'grab the PT Initials off of the logsheet
                    dso.Open(sfilename, True)
                Catch excep As Runtime.InteropServices.COMException
                    errorList.Add(sfilename)
                End Try
                Try
                    sheetInfo = dso.CustomProperties("PTNameChecker").Value
                Catch ex As Runtime.InteropServices.COMException
                    sheetInfo = "NONE"
                End Try
                'Check to see if the initials on the log sheet
                'match those we are searching for
                If sheetInfo = lstInitials.SelectedValue Then
                    Dim logsheet As New LogSheet
                    logsheet.PTInitials = sheetInfo
                    logsheet.FileName = sfilename
                    PTFiles.Add(logsheet)
                End If
                'update progress bar
                Me.progressMain.Increment(1)
                ifilesDone = ifilesDone + 1
                lblLowProgress.Text = ifilesDone
                dso.Close()
            Next

            lstResults.Items.Clear()
            'loop through results in the PTFiles list
            'add results to the listbox, removing the path info
            For Each showsheet As LogSheet In PTFiles
                lstResults.Items.Add(Path.GetFileNameWithoutExtension(showsheet.FileName))
            Next

            'build error message to display to user
            ErrorMessage = ""
            For Each ErrorFile In errorList
                ErrorMessage += ErrorFile & vbCrLf
            Next
            MsgBox("The following Log Sheets were unable to be checked" _
                   & vbCrLf & ErrorMessage)

            PTFiles.Clear() 'empty PTFiles for next use
        End Sub

  • Tasks carried out for a Software Project

    - by Sara
    Hi, we were asked to propose a web-based system for a shop, as an assignment. As I'm a newbie at project management, I find it quite difficult to come up with the tasks for our Gantt chart. Can someone please suggest a sample Gantt chart, or the main tasks typically followed in developing such a system? Thanks

  • Facial recognition/detection PHP or software for photo and video galleries

    - by Peter
    I have a very large photo gallery with thousands of similar people, objects, locations, things. The majority of the people in the photos have their own user accounts and avatar photos to match. There are also logical short lists of people potentially in each photo, based on additional data available for it. I allow users to tag photos with their friends and people they know, but an automated process would be better. I've used the photo tagger/finder from face.com integrating with Facebook photos, and the Google Picasa photo tagger for personal albums does the same thing and is exactly what I'm looking to do. Is there a PHP script, an API for Google Picasa, face.com or another recognition service, or any other open source project that provides server-side facial recognition and/or grouping photos by similarity? As you can see from the examples below, various photo sharing sites offer the feature, but are there any that provide an API for images stored on my own server, or something extensive enough to link into my own gallery and tagging system? Examples:

        - viewdle - face recognition/tagging for video
        - PHP - face detection in pure PHP (Xarg)
        - OpenCV
        - Face.com - app for finding and tagging photos in Facebook
        - Google Picasa - photo sharing
        - TeraSnaps - photo sharing site
        - Google Portrait - photo grouping from Google Image results
        - FaceOnIt - video face recognition
        - PittPatt - detection, recognition, video face mining
        - BetaFace
        - ChaosFace - real-time face detector
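
    None of the above, but if running a JVM process server-side is an option, OpenCV's bundled Java bindings handle the detection half (finding face regions to offer as tag boxes); matching faces to user accounts would still need a recognition service like those listed. A hypothetical sketch, assuming the OpenCV 3.x Java bindings are installed and the cascade file path is adjusted to your installation:

        import org.opencv.core.Core;
        import org.opencv.core.Mat;
        import org.opencv.core.MatOfRect;
        import org.opencv.core.Rect;
        import org.opencv.imgcodecs.Imgcodecs;
        import org.opencv.objdetect.CascadeClassifier;

        public class FaceDetect {
            public static void main(String[] args) {
                // Load the native OpenCV library (must be on java.library.path)
                System.loadLibrary(Core.NATIVE_LIBRARY_NAME);

                // Haar cascade shipped with OpenCV; the path is installation-specific
                CascadeClassifier detector =
                        new CascadeClassifier("haarcascade_frontalface_default.xml");

                Mat image = Imgcodecs.imread("photo.jpg");  // made-up file name
                MatOfRect faces = new MatOfRect();
                detector.detectMultiScale(image, faces);

                // Each Rect is a candidate face region to offer as a tag box
                for (Rect r : faces.toArray()) {
                    System.out.printf("face at (%d,%d) %dx%d%n", r.x, r.y, r.width, r.height);
                }
            }
        }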

  • A good book about software design

    - by Idan
    I'm looking for a book that talks about software design decisions, like:

    - when I should use a thread pool and when I shouldn't - and, in the first case, how
    - how I should access my DB, and how big my transactions should be
    - how to read XML: whether to use DOM or SAX, what library to choose, and the best ways to parse
    - how to handle a client-server app in the most efficient way

    and more stuff like that. Does a book like that exist? (Preferably in C++, but that's not that important.)

  • Cache memory performance

    - by Krewie
    Hello, I just have a general question about cache memory. How could a program perform badly on a cache-based system, given that the cache stores the addresses requested from main memory, as well as addresses in the range around the one copied from main memory?
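
    One classic way, as a made-up illustration (Java here, but the effect is language-independent): access memory in a pattern that never reuses the neighbouring addresses the cache pulled in. Traversing a large 2D array in column order instead of row order does exactly that - a sketch:

        public class CacheDemo {
            static final int N = 4096;

            public static void main(String[] args) {
                int[][] a = new int[N][N];   // each row is one contiguous block of ints
                long sum = 0, t;

                t = System.nanoTime();
                for (int i = 0; i < N; i++)       // row order: consecutive addresses,
                    for (int j = 0; j < N; j++)   // so every fetched cache line is
                        sum += a[i][j];           // fully used before moving on
                System.out.println("row order:    " + (System.nanoTime() - t) / 1_000_000 + " ms");

                t = System.nanoTime();
                for (int j = 0; j < N; j++)       // column order: each access jumps to
                    for (int i = 0; i < N; i++)   // a different row array, touching a
                        sum += a[i][j];           // new cache line almost every time
                System.out.println("column order: " + (System.nanoTime() - t) / 1_000_000 + " ms");

                System.out.println(sum);  // keep sum live so the JIT can't drop the loops
            }
        }

    Both loops do identical arithmetic; only the access order differs, and the column-order pass is typically several times slower purely because of cache misses.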

  • Spell checker software

    - by Naren
    Hello guys, I have been assigned a task to find a decent spell checker (UK English), preferably a free one, for a project that we are doing. I have looked at the Google AJAX API for this. The project contains some young persons' (kids less than 18 years old) data, which we shouldn't expose or store outside the application boundaries. Google logs the data for research purposes, which means Google owns whatever data we send over the wire through the Google API. Is this right? I fired an email to Google regarding the privacy of data and storage, but they haven't come back. If you have some knowledge regarding this, please share it with me. At this point our servers might not have access to external entities, which means we might not be able to use a web API for this over the wire. But that may change in the future. That means I have to find some spell checker alternatives that can sit in our environment and do the job, or an external API. Would you mind sharing your findings and knowledge in this regard? I would prefer free services, but you never know - if you have some cracking spell checker for a few quid, then I don't mind recommending it to the project board. Technology used: ASP.NET 3.5/4.0, MVC, jQuery, SQL Server 2008, etc. Cheers, Naren
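
    On the "sits in our environment" route, a dictionary-plus-edit-distance checker is simple enough to roll in-house. The sketch below is a hypothetical illustration in Java (the approach ports directly to .NET), assuming you supply a UK English word list - e.g. one extracted from aspell/hunspell - at a made-up path:

        import java.io.IOException;
        import java.nio.file.Files;
        import java.nio.file.Paths;
        import java.util.*;

        public class SimpleSpellChecker {
            private final Set<String> dictionary = new HashSet<>();

            // Word list file: one UK English word per line (path is hypothetical)
            public SimpleSpellChecker(String wordListPath) throws IOException {
                for (String w : Files.readAllLines(Paths.get(wordListPath)))
                    dictionary.add(w.trim().toLowerCase());
            }

            public boolean isCorrect(String word) {
                return dictionary.contains(word.toLowerCase());
            }

            // Suggest dictionary words within one edit (delete/replace/insert/transpose)
            public List<String> suggest(String word) {
                List<String> out = new ArrayList<>();
                for (String candidate : edits1(word.toLowerCase()))
                    if (dictionary.contains(candidate)) out.add(candidate);
                return out;
            }

            private Set<String> edits1(String w) {
                Set<String> edits = new HashSet<>();
                for (int i = 0; i <= w.length(); i++) {
                    if (i < w.length())
                        edits.add(w.substring(0, i) + w.substring(i + 1));            // deletion
                    if (i < w.length() - 1)
                        edits.add(w.substring(0, i) + w.charAt(i + 1)
                                + w.charAt(i) + w.substring(i + 2));                  // transposition
                    for (char c = 'a'; c <= 'z'; c++) {
                        if (i < w.length())
                            edits.add(w.substring(0, i) + c + w.substring(i + 1));    // replacement
                        edits.add(w.substring(0, i) + c + w.substring(i));            // insertion
                    }
                }
                return edits;
            }

            public static void main(String[] args) throws IOException {
                SimpleSpellChecker checker = new SimpleSpellChecker("british-english.txt");
                System.out.println(checker.isCorrect("colour"));  // true, given a UK list
                System.out.println(checker.suggest("colur"));     // e.g. [colour, color, ...]
            }
        }

    Nothing leaves your network, which sidesteps the data-privacy question entirely; the trade-off is that suggestion quality depends on the word list you feed it.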

  • Setting up a "cookieless domain" to improve site performance

    - by Django Reinhardt
    I was reading in Google's documentation about improving site speed. One of their recommendations is serving static content (images, CSS, JS, etc.) from a "cookieless domain":

        Static content, such as images, JS and CSS files, don't need to be accompanied by cookies, as there is no user interaction with these resources. You can decrease request latency by serving static resources from a domain that doesn't serve cookies.

    Google then says that the best way to do this is to buy a new domain and set it to point to your current one:

        To reserve a cookieless domain for serving static content, register a new domain name and configure your DNS database with a CNAME record that points the new domain to your existing domain A record. Configure your web server to serve static resources from the new domain, and do not allow any cookies to be set anywhere on this domain. In your web pages, reference the domain name in the URLs for the static resources.

    This is pretty straightforward stuff, except for the bit that says to "configure your web server to serve static resources from the new domain, and do not allow any cookies to be set anywhere on this domain". From what I've read, there's no setting in IIS that lets you say "serve static resources only", so how do I prevent ASP.NET from setting cookies on this new domain? At present, even if I'm just requesting a .jpg from the new domain, it sets a cookie on my browser, even though our application's cookies are set on our old domain. For example, ASP.NET sets an ".ASPXANONYMOUS" cookie that (as far as I'm aware) we're not telling it to set. Apologies if this is a real newb question, I'm new at this! Thanks.
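
    As an aside on what "serve static resources and never set cookies" means mechanically, here is a deliberately minimal, hypothetical sketch (Java, using the JDK's built-in HttpServer; the port and directory are made up) of a host that can only serve files and has no session machinery to emit Set-Cookie at all - the end state Google is describing, however you achieve it in IIS:

        import com.sun.net.httpserver.HttpServer;
        import java.io.OutputStream;
        import java.net.InetSocketAddress;
        import java.nio.file.Files;
        import java.nio.file.Path;
        import java.nio.file.Paths;

        public class CookielessStaticServer {
            public static void main(String[] args) throws Exception {
                Path root = Paths.get("static");  // directory of images/css/js
                HttpServer server = HttpServer.create(new InetSocketAddress(8080), 0);

                server.createContext("/", exchange -> {
                    Path file = root.resolve(exchange.getRequestURI().getPath().substring(1)).normalize();
                    if (!file.startsWith(root) || !Files.isRegularFile(file)) {
                        exchange.sendResponseHeaders(404, -1);  // -1 = no body
                        return;
                    }
                    byte[] body = Files.readAllBytes(file);
                    // Far-future caching, and crucially: no Set-Cookie header, ever
                    exchange.getResponseHeaders().add("Cache-Control", "public, max-age=31536000");
                    exchange.sendResponseHeaders(200, body.length);
                    try (OutputStream os = exchange.getResponseBody()) {
                        os.write(body);
                    }
                });
                server.start();
            }
        }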

  • Can I redistribute Phing with non-free software?

    - by Matt1776
    I am having trouble understanding the terms of the LGPL in light of a program that is not written in C or C++. They speak of libraries being linked and of 'derivative' works. If I were to package a PHP program and sell it, but within the program the deployment mechanism used the Phing package (with its entire contents, as is and unmodified), would I be violating the terms of the LGPL? For example, if this were a C program that was compiled by linking the Phing 'library', then the answer would be easier: it is a derivative work, and therefore, unless released under the GPL, it will not be considered free and will also be a violation. But this situation is different. I am not linking and not producing a derivative; I am simply using Phing as a deployment tool to move files around and set up the environment. Can someone shed some light? Thank you!

  • Setting data source for reports in Crystal Reports 2008 Java (performance)

    - by Daniel
    Hi, we have Crystal Reports 2008 on the server, and use the Java SDK to display reports and convert them to PDF. Since the server has its own database, we have to set the data source on the DatabaseController to make CR connect to this database. We do it as specified in the docs, and have tried the functions in CRJavaHelper, but for a reason unknown to me, setting the connection string takes 300 ms to 1500 ms. What is the fastest way to tell Crystal which data source to use in its reports? I already saw a JNDI name somewhere, but I don't believe CR actually does a JNDI lookup to find an existing data source, does it?

  • Visual Studio Performance when editing XAML/Silverlight files

    - by driAn
    When I work on Silverlight projects within Visual Studio 2008, I regularly notice that the XAML editor hangs for up to 10 seconds. This is because Visual Studio consumes 100% CPU during that timeframe. Any ideas how I could fix that? I assume this is some kind of background compiling for IntelliSense or something similar. It happens during editing, multiple times an hour, without me doing any special actions. System: Server 2008 Std, Visual Studio 2008 SP1, latest updates... I wonder if anyone else has experienced this issue. Any help would be appreciated.

  • Manual testing vs. automated testing

    - by mgj
    Respected all, as many know, testing can be mainly classified into manual and automated testing. With regard to this, certain questions come to mind. Hope you can help... They include:

    - What is the basic difference between the two types of testing?
    - What are the challenges involved in both manual and automated testing?
    - What are the different skill sets required by a software tester for manual and automated testing respectively?
    - What are the different job prospects and growth opportunities among software testers who do manual testing and automated testing respectively?
    - Is manual testing underrated compared to automated testing in any way? If yes, kindly specify how.
    - How differently are manual testers treated in comparison to automated testers in the corporate world? (If they truly are differentiated in any terms as such.)

    I hope you can share your knowledge in answering these questions. Thank you for your time.. :)
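
    On the first question, the contrast is easiest to see in code: a manual test is a person following scripted steps and judging the outcome; an automated test is code that performs the steps and asserts the outcome, so it runs unattended on every build. A minimal, hypothetical JUnit 4 example (PriceCalculator is a made-up stand-in for real code under test):

        import org.junit.Test;
        import static org.junit.Assert.assertEquals;

        public class PriceCalculatorTest {

            // Trivial made-up class standing in for the code under test
            static class PriceCalculator {
                private final double vatRate;
                PriceCalculator(double vatRate) { this.vatRate = vatRate; }
                double gross(double net) { return net * (1 + vatRate); }
            }

            // A manual tester would perform and check these steps by hand;
            // here the check re-runs automatically on every build
            @Test
            public void vatIsAddedToNetPrice() {
                PriceCalculator calc = new PriceCalculator(0.20); // 20% VAT
                assertEquals(120.0, calc.gross(100.0), 0.001);
            }
        }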

  • Question about SQL Server HierarchyID depth-first performance

    - by AndalusianCat
    I am trying to implement hierarchyid in a table (dbo.[Message]) containing roughly 50,000 rows (it will grow substantially in the future). However, it takes 30-40 seconds to retrieve about 25 results. The root node is a filler in order to provide uniqueness; therefore every subsequent row is a child of that dummy row. I need to be able to traverse the table depth-first, and have made the hierarchyid column (dbo.[Message].MessageID) the clustered primary key; I have also added a computed smallint (dbo.[Message].Hierarchy) which stores the level of the node. Usage: a .NET application passes a hierarchyid value into the database, and I want to be able to retrieve all (if any) children AND parents of that node (besides the root, as it is filler). A simplified version of the query I am using:

        @MessageID hierarchyid  /* passed in from application */

        SELECT m.MessageID, m.MessageComment
        FROM   dbo.[Message] AS m
        WHERE  m.MessageID.IsDescendantOf(@MessageID.GetAncestor(@MessageID.GetLevel() - 1)) = 1
        ORDER BY m.MessageID

    From what I understand, the index should be detected automatically without a hint. From searching forums I have seen people utilizing index hints, at least in the case of breadth-first indexes, as apparently CLR calls may be opaque to the query optimizer. I have spent the past few days trying to find a solution for this issue, but to no avail. I would greatly appreciate any assistance, and as this is my first post, I apologize in advance if this would be considered a 'noobish' question. I have read the MS documentation and searched countless forums, but have not come across a succinct description of this specific issue.

  • Improving Performance of Crystal Reports using Stored Procedures

    - by mjh41
    Recently I updated a Crystal Report that was doing all of its work on the client side (selects, formulas, etc.) and changed all of the logic to be done on the server side through stored procedures, using an Oracle 11g database. Now the report is only being used to display the output of the stored procedures and nothing else. Everything I have read on this subject says that utilizing stored procedures should greatly reduce the running time of the report, but it still takes roughly the same amount of time to retrieve the data from the server. Is there something wrong with the stored procedure I have written, or is the issue in the Crystal Report itself? Here is the stored procedure code, along with the package that defines the necessary REF CURSOR.

        CREATE OR REPLACE PROCEDURE SP90_INVENTORYDATA_ALL
        (
            invdata_cur       IN OUT sftnecm.inv_data_all_pkg.inv_data_all_type,
            dCurrentEndDate   IN     vw_METADATA.CASEENTRCVDDATE%type,
            dCurrentStartDate IN     vw_METADATA.CASEENTRCVDDATE%type
        )
        AS
        BEGIN
            OPEN invdata_cur FOR
                SELECT vw_METADATA.CREATIONTIME,
                       vw_METADATA.RESRESOLUTIONDATE,
                       vw_METADATA.CASEENTRCVDDATE,
                       vw_METADATA.CASESTATUS,
                       vw_METADATA.CASENUMBER,
                       (CASE
                            WHEN vw_METADATA.CASEENTRCVDDATE < dCurrentStartDate
                                 AND ((vw_METADATA.CASESTATUS IS NULL OR vw_METADATA.CASESTATUS != 'Closed')
                                      OR TO_DATE(vw_METADATA.RESRESOLUTIONDATE, 'MM/DD/YYYY') >= dCurrentStartDate)
                            THEN 1 ELSE 0
                        END) InventoryBegin,
                       (CASE
                            WHEN (TO_DATE(vw_METADATA.RESRESOLUTIONDATE, 'MM/DD/YYYY')
                                      BETWEEN dCurrentStartDate AND dCurrentEndDate)
                                 AND vw_METADATA.RESRESOLUTIONDATE IS NOT NULL
                                 AND vw_METADATA.CASESTATUS IS NOT NULL
                            THEN 1 ELSE 0
                        END) CaseClosed,
                       (CASE
                            WHEN vw_METADATA.CASEENTRCVDDATE BETWEEN dCurrentStartDate AND dCurrentEndDate
                            THEN 1 ELSE 0
                        END) CaseCreated
                FROM vw_METADATA
                WHERE vw_METADATA.CASEENTRCVDDATE <= dCurrentEndDate
                ORDER BY vw_METADATA.CREATIONTIME, vw_METADATA.CASESTATUS;
        END SP90_INVENTORYDATA_ALL;

    And the package:

        CREATE OR REPLACE PACKAGE inv_data_all_pkg AS
            TYPE inv_data_all_type IS REF CURSOR RETURN inv_data_all_temp%ROWTYPE;
        END inv_data_all_pkg;

  • Hadoop: Iterative MapReduce Performance

    - by S.N
    Is it correct to say that parallel computation with iterative MapReduce can be justified only when the training data size is too large for non-parallel computation of the same logic? I am aware that there is overhead in starting MapReduce jobs. This can be critical for overall execution time when a large number of iterations is required. I can imagine that sequential computation is faster than parallel computation with iterative MapReduce in many cases, as long as the data set fits in memory. Is that the only benefit of using iterative MapReduce? If not, what other benefits could there be?
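
    For concreteness, the per-iteration overhead comes from the driver pattern below: each iteration is a whole new job submission (scheduling, task JVM startup, and an HDFS round trip for intermediate results). A hypothetical sketch of such a driver using the Hadoop 2.x-style API - the paths and the convergence counter are made up:

        import org.apache.hadoop.conf.Configuration;
        import org.apache.hadoop.fs.Path;
        import org.apache.hadoop.mapreduce.Job;
        import org.apache.hadoop.mapreduce.lib.input.FileInputFormat;
        import org.apache.hadoop.mapreduce.lib.output.FileOutputFormat;

        public class IterativeDriver {
            public static void main(String[] args) throws Exception {
                Configuration conf = new Configuration();
                Path input = new Path("data/iter0");  // made-up HDFS path

                for (int i = 0; i < 30; i++) {
                    Path output = new Path("data/iter" + (i + 1));

                    // Identity map/reduce for brevity; a real job would set
                    // mapper/reducer classes that do the per-iteration work
                    Job job = Job.getInstance(conf, "iteration-" + i);
                    job.setJarByClass(IterativeDriver.class);
                    FileInputFormat.addInputPath(job, input);
                    FileOutputFormat.setOutputPath(job, output);

                    // Every iteration pays the full job-submission cost here -
                    // the overhead that dominates when iterations are short
                    if (!job.waitForCompletion(true)) System.exit(1);

                    // Convergence check via a counter that a (hypothetical)
                    // reducer would increment whenever a value still changed
                    long updated = job.getCounters().findCounter("app", "UPDATED").getValue();
                    if (updated == 0) break;

                    input = output;  // this iteration's output feeds the next one
                }
            }
        }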

  • Performance problem looping through table rows

    - by Sridhar
    Hi, I am using jQuery to loop through table rows and save the data. If the table has 200 rows, it performs slowly, and I get the "Stop running this script" message in IE when I call this method. Following is the code I am using to loop through the table rows. Can you please let me know if there is a better way to do this?

        function SaveData() {
            var $table = $('#' + gridid);
            var rows = $table.find('tbody > tr').get();
            var transactions = [];
            var $row, empno, newTransaction, $rowChildren;

            $.each(rows, function(index, row) {
                $row = $(row);
                $rowChildren = $row.children("td");
                if ($rowChildren.find("input[id*=hRV]").val() === '1') {
                    empno = $rowChildren.find("input[id*=tEmpno]").val();
                    newTransaction = new Array();
                    newTransaction[0] = company;
                    newTransaction[1] = $rowChildren.find("input[id*=tEmpno]").val();
                    newTransaction[2] = $rowChildren.find("input[id*=tPC]").val();
                    newTransaction[3] = $rowChildren.find("input[id*=hQty]").val();
                    newTransaction[4] = $rowChildren.find("input[id*=hPR]").val();
                    newTransaction[5] = $rowChildren.find("input[id*=tJC]").val();
                    newTransaction[6] = $rowChildren.find("input[id*=tL1]").val();
                    newTransaction[7] = $rowChildren.find("input[id*=tL2]").val();
                    newTransaction[8] = $rowChildren.find("input[id*=tL3]").val();
                    newTransaction[9] = $rowChildren.find("input[id*=tL4]").val();
                    newTransaction[10] = $rowChildren.find("input[id*=tL5]").val();
                    newTransaction[11] = $rowChildren.find("input[id*=tL6]").val();
                    newTransaction[12] = $rowChildren.find("input[id*=tL7]").val();
                    newTransaction[13] = $rowChildren.find("input[id*=tL8]").val();
                    newTransaction[14] = $rowChildren.find("input[id*=tL9]").val();
                    newTransaction[15] = $rowChildren.find("input[id*=tL10]").val();
                    newTransaction[16] = $rowChildren.find("input[id*=tSF]").val();
                    newTransaction[17] = $rowChildren.find("input[id*=tCG]").val();
                    newTransaction[18] = $rowChildren.find("input[id*=tTF]").val();
                    newTransaction[19] = $rowChildren.find("input[id*=tWK]").val();
                    newTransaction[20] = $rowChildren.find("input[id*=tAI]").val();
                    newTransaction[21] = $rowChildren.find("input[id*=tWC]").val();
                    newTransaction[22] = $rowChildren.find("input[id*=tPI]").val();
                    newTransaction[23] = "E";
                    var record = newTransaction.join(';');
                    transactions.push(record);
                }
            });

            if (transactions.length > 0) {
                var strTransactions = transactions.join('|');
                //send data to server
                //here ajax function is called to save data.
            }
        }

  • Funniest code names for software projects

    - by furtelwart
    Developers are creative - not because they create wonderful GUIs or prove their sense for art with good colour combinations, but with code names. Every project has a code name, sometimes official, sometimes private (with good reason!). Here are my favourites:

    - Android: 1.6 = Donut, 2.0 = Eclair (picture of Google's eclair)
    - grml (a live distribution based on Debian GNU/Linux; it comes from Austria, hence the German release names):
      - Hustenstopper (cough stopper)
      - Eierspass (egg fun)
      - Meilenschwein (mile pig - a pun on "Meilenstein", milestone)
      - Lackdose-Allergie (lacquer-can allergy - a pun on "Laktose-Allergie", lactose allergy)
      - Hello-Wien (a pun on "Halloween", Wien being German for Vienna)

    I would really like to see the funniest code names you have ever heard of. Aren't there any more funny project names?

  • Which Secure Software Development Practices do you Employ?

    - by Michael Howard-MSFT
    I work on a project known as the Security Development Lifecycle (SDL) project at Microsoft (http://microsoft.com/sdl) - in short, it's a set of practices that must be used by product groups before they ship products, to help improve security. Over the last couple of years we have published a great deal of SDL documentation, as customers ask for more information about what we're doing. But what I'd like to know is:

    1) What are you doing within your organization to help improve the security of your product?
    2) What works? What doesn't work?
    3) How did you get management to agree to this work?

    Thanks.

  • Hierarchical data in Linq - options and performance

    - by Anthony
    I have some hierarchical data - each entry has an id and a (nullable) parent entry id. I want to retrieve all entries in the tree under a given entry. This is in a SQL Server 2005 database, and I am querying it with LINQ to SQL in C# 3.5. LINQ to SQL does not support Common Table Expressions directly, so my choices are to assemble the data in code with several LINQ queries, or to create a view on the database that surfaces a CTE. Which option (or another option) do you think will perform better when data volumes get large? Also, is SQL Server 2008's HierarchyId type supported in LINQ to SQL?

  • Performance issue in a select query from a single table

    - by daedlus
    Hi, I have a table as below:

        dbo.UserLogs
        --------------------------------------
        Id | UserId | Date | Name | P1 | Dirty
        --------------------------------------

    There can be several records per UserId (even in the millions). I have a clustered index on the Date column and query this table very frequently over time ranges. The column 'Dirty' is non-nullable and can take only 0 or 1, so I have no indexes on 'Dirty'. I have several million records in this table, and in one particular case in my application I need to query it to get all UserIds that have at least one record marked dirty. I tried this query:

        select distinct(UserId) from UserLogs where Dirty=1

    I have 10 million records in total, and this takes around 10 minutes to run; I want it to run much faster than that. (I am able to query this table on the Date column in less than a minute.) Any comments/suggestions are welcome. My environment: 64-bit, Sybase 15.0.3, Linux.

  • Lucene.NET performance

    - by Paul Knopf
    I have a website that runs off a third-party search provider that is expensive, so I am going to roll my own. Is Lucene.NET capable of handling ~25,000 products (or documents), each with maybe ten attributes used for filtering? I am looking to do a "narrow/drill-down" or "faceted" search. Does that sound like too much to ask of Lucene.NET?
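
    For scale, 25,000 documents is tiny by Lucene standards; indexes routinely hold millions. Drill-down can be modelled as one keyword field per attribute plus filter clauses on a boolean query. A hypothetical sketch, written against the Java version of Lucene (Lucene.NET mirrors this API closely, though exact class names shift between versions, and per-facet value counts need the separate facets module):

        import java.nio.file.Paths;
        import org.apache.lucene.analysis.standard.StandardAnalyzer;
        import org.apache.lucene.document.Document;
        import org.apache.lucene.document.Field;
        import org.apache.lucene.document.StringField;
        import org.apache.lucene.document.TextField;
        import org.apache.lucene.index.*;
        import org.apache.lucene.search.*;
        import org.apache.lucene.store.Directory;
        import org.apache.lucene.store.FSDirectory;

        public class ProductSearch {
            public static void main(String[] args) throws Exception {
                Directory dir = FSDirectory.open(Paths.get("product-index"));

                // Index: one document per product, one keyword field per attribute
                try (IndexWriter writer = new IndexWriter(dir, new IndexWriterConfig(new StandardAnalyzer()))) {
                    Document doc = new Document();
                    doc.add(new TextField("name", "blue widget deluxe", Field.Store.YES));
                    doc.add(new StringField("colour", "blue", Field.Store.YES));  // attribute
                    doc.add(new StringField("brand", "acme", Field.Store.YES));   // attribute
                    writer.addDocument(doc);
                }

                // Drill-down: the user has clicked colour=blue, then brand=acme
                try (IndexReader reader = DirectoryReader.open(dir)) {
                    IndexSearcher searcher = new IndexSearcher(reader);
                    BooleanQuery query = new BooleanQuery.Builder()
                            .add(new TermQuery(new Term("colour", "blue")), BooleanClause.Occur.FILTER)
                            .add(new TermQuery(new Term("brand", "acme")), BooleanClause.Occur.FILTER)
                            .build();
                    TopDocs hits = searcher.search(query, 10);
                    System.out.println(hits.totalHits + " matching products");
                }
            }
        }

    Each click in the drill-down UI just adds another FILTER clause, which also skips scoring work for the attribute constraints.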
