Search Results

Search found 13249 results on 530 pages for 'performance tuning'.

Page 400/530 | < Previous Page | 396 397 398 399 400 401 402 403 404 405 406 407  | Next Page >

  • Returning IQueryable or Enumerated Object

    - by Tarik
    Hello everyone, I was wondering about the performance difference between these two scenarios and what the disadvantages of each might be.
    First scenario (returns IQueryable):
      public class Helper
      {
          public IQueryable<Customer> CurrentCustomer
          {
              get { return new DataContext().Where(t => t.CustomerId == 1); }
          }
      }
      public class SomeClass
      {
          public void Main()
          {
              Console.WriteLine(new Helper().CurrentCustomer.First().Name);
          }
      }
    The second scenario (returns an enumerated result):
      public class Helper
      {
          public Customer CurrentCustomer
          {
              get { return new DataContext().First(t => t.CustomerId == 1); }
          }
      }
      public class SomeClass
      {
          public void Main()
          {
              Console.WriteLine(new Helper().CurrentCustomer.Name);
          }
      }
    Thanks in advance.
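
    A minimal, self-contained sketch of the behavioural difference being asked about (the in-memory Source below is an illustrative stand-in for the database; the names are not from the post). With a real LINQ to SQL DataContext the same rule applies: the IQueryable version defers execution, so the caller's First() or extra Where() becomes part of the generated SQL, while the second version runs the query inside the property getter and hands back an already-materialised Customer.
      using System;
      using System.Collections.Generic;
      using System.Linq;

      class Customer
      {
          public int CustomerId;
          public string Name;
      }

      class Demo
      {
          // Stand-in for the database table; prints when it is actually enumerated.
          static IEnumerable<Customer> Source()
          {
              Console.WriteLine("query executed");
              yield return new Customer { CustomerId = 1, Name = "Alice" };
              yield return new Customer { CustomerId = 2, Name = "Bob" };
          }

          // Scenario 1: deferred - nothing runs until the caller enumerates.
          static IQueryable<Customer> CurrentCustomerQuery()
          {
              return Source().AsQueryable().Where(c => c.CustomerId == 1);
          }

          // Scenario 2: immediate - the query runs inside this method.
          static Customer CurrentCustomer()
          {
              return Source().AsQueryable().First(c => c.CustomerId == 1);
          }

          static void Main()
          {
              IQueryable<Customer> q = CurrentCustomerQuery(); // nothing printed yet
              Console.WriteLine(q.First().Name);               // "query executed", then "Alice"
              Console.WriteLine(CurrentCustomer().Name);       // "query executed" again, then "Alice"
          }
      }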

    Read the article

  • Design for Vacation Tracking System

    - by Aaronaught
    I have been tasked with developing a system for tracking our company's paid time-off (vacation, sick days, etc.). At the moment we are using an Excel spreadsheet on a shared network drive, and it works pretty well, but we are concerned that we won't be able to "trust" employees forever and sometimes we run into locking issues when two people try to open the spreadsheet at once. So we are trying to build something a little more robust. I would like some input on this design in terms of maintainability, scalability, extensibility, etc. It's a pretty simple workflow we need to represent right now. I started with a basic MS Access schema like this:
      Employees (EmpID int, EmpName varchar(50), AllowedDays int)
      Vacations (VacationID int, EmpID int, BeginDate datetime, EndDate datetime)
    But we don't want to spend a lot of time building a schema and database like this and have to change it later, so I think I am going to go with something that will be easier to expand through configuration. Right now the vacation table has this schema:
      Vacations (VacationID int, PropName varchar(50), PropValue varchar(50))
    And the table will be populated with data like this:
      VacationID | PropName     | PropValue
      -----------+--------------+------------------
      1          | EmpID        | 4
      1          | EmpName      | James Jones
      1          | Reason       | Vacation
      1          | BeginDate    | 2/24/2010
      1          | EndDate      | 2/30/2010
      1          | Destination  | Spectate Swamp
      2          | ...          | ...
    I think this is a pretty good, extensible design; we can easily add new properties to the vacation like the destination or maybe approval status, etc. I wasn't too sure how to go about managing the database of valid properties. I thought of putting them in a separate PropNames table, but it gets complicated to manage all the different data types, and people say that you shouldn't put CLR type names into a SQL database, so I decided to use XML instead. Here is the schema:
      <VacationProperties>
        <PropertyNames>EmpID,EmpName,Reason,BeginDate,EndDate,Destination</PropertyNames>
        <PropertyTypes>System.Int32,System.String,System.String,System.DateTime,System.DateTime,System.String</PropertyTypes>
        <PropertiesRequired>true,true,false,true,true,false</PropertiesRequired>
      </VacationProperties>
    I might need more fields than that, I'm not completely sure.
    I'm parsing the XML like this (would like some feedback on the parsing code):
      string xml = File.ReadAllText("properties.xml");
      Match m = Regex.Match(xml, "<(PropertyNames)>(.*?)</PropertyNames>");
      string[] pn = m.Value.Split(',');
      // do the same for PropertyTypes, PropertiesRequired
    Then I use the following code to persist configuration changes to the database:
      string sql = "DROP TABLE VacationProperties";
      sql = sql + " CREATE TABLE VacationProperties ";
      sql = sql + "(PropertyName varchar(100), PropertyType varchar(100) ";
      sql = sql + "IsRequired varchar(100))";
      for (int i = 0; i < pn.Length; i++)
      {
          sql = sql + " INSERT VacationProperties VALUES (" + pn[i] + "," + pt[i] + "," + pv[i] + ")";
      }
      // GlobalConnection is a singleton
      new SqlCommand(sql, GlobalConnection.Instance).ExecuteReader();
    So far so good, but after a few days of this I realized that a lot of this was just a more specific kind of generic workflow which could be further abstracted, and instead of writing all of this boilerplate plumbing code I could just come up with a workflow, plug it into a workflow engine like Windows Workflow Foundation, and have the users configure it. In order to support routing these configurations through the workflow system, it seemed natural to implement generic XML Web Services for this instead of just using an XML file as above. I've used this code to implement the Web Services:
      public class VacationConfigurationService : WebService
      {
          [WebMethod]
          public void UpdateConfiguration(string xml)
          {
              // Above code goes here
          }
      }
    Which was pretty easy, although I'm still working on a way to validate that XML against some kind of schema, as there's no error-checking yet. I also created a few different services for other operations like VacationSubmissionService, VacationReportService, VacationDataService, VacationAuthenticationService, etc. The whole Service Oriented Architecture looks like this (SOA diagram omitted). And because the workflow itself might change, I have been working on a way to integrate the WF workflow system with MS Visio, which everybody at the office already knows how to use, so they could make changes pretty easily. We have a diagram that looks like the following (it's kind of hard to read, but the main items are Activities, Authenticators, Validators, Transformers, Processors, and Data Connections; they're all analogous to the services in the SOA diagram above).
    The requirements for this system are (note - I don't control these, they were given to me by management):
      Main workflow must interface with the Excel spreadsheet, probably through VBA macros (to ease the transition to the new system).
      Alerts should integrate with MS Outlook, Lotus Notes, and SMS (text messages). We also want to interface it with the company Voice Mail system, but that is not a "hard" requirement.
      Performance requirements: must handle 250,000 Transactions Per Second; should be able to handle up to 20,000 employees (right now we have 3); 99.99% uptime ("four nines") expected.
      Must be secure against outside hacking, but users cannot be required to enter a username/password.
      Platforms: must support Windows XP/Vista/7, Linux, iPhone, Blackberry, DOS 2.0, VAX, IRIX, PDP-11, Apple IIc.
      Time to complete: 6 to 8 weeks.
    My questions are:
      Is this a good design for the system so far? Am I using all of the recommended best practices for these technologies?
      How do I integrate the Visio diagram above with the Windows Workflow Foundation to call the ConfigurationService and persist workflow changes?
      Am I missing any important components?
      Will this be extensible enough to support any scenario via end-user configuration?
      Will the system scale to the above performance requirements? Will we need any expensive hardware to run it?
      Are there any "gotchas" I should know about with respect to cross-platform compatibility? For example, would it be difficult to convert this to an iPhone app?
      How long would you expect this to take? (We've dedicated 1 week for testing, so I'm thinking maybe 5 weeks?)
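
    For contrast with the regex parsing and string-concatenated SQL above, a minimal, hedged sketch of the same configuration step done with a real XML parser and parameterized commands. The connection string and the exact VacationProperties layout are assumptions, and this is an illustration rather than anything from the original post:
      using System;
      using System.Data.SqlClient;
      using System.Xml;

      // Sketch only: reads the <VacationProperties> document shown above with an XML
      // parser and inserts each property with a parameterized command instead of
      // string-concatenated SQL.
      class VacationPropertiesLoader
      {
          static void Main()
          {
              XmlDocument doc = new XmlDocument();
              doc.Load("properties.xml");

              string[] names = doc.SelectSingleNode("//PropertyNames").InnerText.Split(',');
              string[] types = doc.SelectSingleNode("//PropertyTypes").InnerText.Split(',');
              string[] required = doc.SelectSingleNode("//PropertiesRequired").InnerText.Split(',');

              using (SqlConnection conn = new SqlConnection("connection string here (assumption)"))
              {
                  conn.Open();
                  for (int i = 0; i < names.Length; i++)
                  {
                      SqlCommand cmd = new SqlCommand(
                          "INSERT INTO VacationProperties (PropertyName, PropertyType, IsRequired) " +
                          "VALUES (@name, @type, @required)", conn);
                      cmd.Parameters.AddWithValue("@name", names[i]);
                      cmd.Parameters.AddWithValue("@type", types[i]);
                      cmd.Parameters.AddWithValue("@required", required[i]);
                      cmd.ExecuteNonQuery();
                  }
              }
          }
      }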

    Read the article

  • Choosing between Berkeley DB Core and Berkeley DB JE

    - by zokier
    I'm designing a Java-based web app and I need a key-value store. Berkeley DB seems fitting enough for me, but there appear to be TWO Berkeley DBs to choose from: Berkeley DB Core, which is implemented in C, and Berkeley DB Java Edition, which is implemented in pure Java. The question is, how do I choose which one to use? With web apps, scalability and performance are quite important (who knows, maybe my idea will become the next YouTube), and I couldn't easily find any meaningful benchmarks between the two. I have yet to familiarize myself with Core's Java API, but I find it hard to believe that it could be much worse than Java Edition's, which seems quite nice. If some other key-value store would be much better, feel free to recommend that too. I'm storing smallish binary blobs, and keys will probably be hashes of the data, or some other unique id.

    Read the article

  • Silverlight Threading and its usage

    - by Harryboy
    Hello experts. Scenario: I am working on a LOB application. In Silverlight every call to a service is async, so the UI is automatically not blocked while the request is processed on the server side. Silverlight also supports threading, and as per my understanding threads are most useful in a LOB application when you need to do some IO operation; but since I am not building an OOB (out-of-browser) application it is not possible to access client resources, and all server requests are async by default anyway. In the above scenario is there any use for threading, or can anyone provide a good example where using threads improves performance? I have searched a lot on this topic, but everywhere I find only simple threading examples from which it is very difficult to understand the real benefit. Thanks for the help.
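
    One place where threading can still pay off even though all service calls are async is CPU-bound work on the results, such as parsing or aggregating a large payload, which would otherwise run in the completed callback on the UI thread. A minimal sketch under that assumption; MainPage, statusText and ProcessLargeResult are hypothetical names, not from the post:
      using System;
      using System.Threading;
      using System.Windows;
      using System.Windows.Controls;

      // Sketch only: the async service call already keeps the UI responsive during
      // network I/O; the win from threading is moving CPU-bound work out of the
      // completed callback, which otherwise runs on the UI thread.
      public class MainPage : UserControl
      {
          private readonly TextBlock statusText = new TextBlock(); // stands in for a control defined in XAML

          // Imagine this is wired to the Completed event of an async service proxy call.
          private void OnServiceCallCompleted(object sender, EventArgs e)
          {
              ThreadPool.QueueUserWorkItem(delegate
              {
                  string summary = ProcessLargeResult(); // heavy parsing/aggregation off the UI thread

                  // Controls may only be touched on the UI thread, so marshal back.
                  Deployment.Current.Dispatcher.BeginInvoke(delegate
                  {
                      statusText.Text = summary;
                  });
              });
          }

          private string ProcessLargeResult()
          {
              return "done"; // placeholder for the CPU-bound work
          }
      }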

    Read the article

  • web2py or grok (Zope) on a big portal

    - by Robert
    Hi, I am planning a big project (1,000,000 users, approximately 500 requests per second at peak times). For performance I'm going to use a non-relational DBMS (each request could cost a lot of instructions in a relational DBMS like MySQL), so I can't use the DAL. My questions are: how does web2py cope with heavy traffic, and does it handle requests concurrently? I'm considering web2py or Grok (Zope). How does ZODB (Zope Object Database) perform with a lot of data? Is there any comparison with object-relational PostgreSQL? Could you advise me, please?

    Read the article

  • How Postfix anti-spam configuration works with DNS-based Blackhole List providers

    - by Ashish
    Hello, I have set up a Postfix mail server for incoming mail that is required to never reply to the external environment, i.e. it will accept all incoming mail and never reply with anything that could be used as a trace to locate and verify its existence. I have implemented the Postfix anti-UCE configuration for countering spam-generating mail servers by using the following settings in Postfix's main.cf:
      smtpd_recipient_restrictions = reject_rbl_client zen.spamhaus.org, reject_rbl_client bl.spamcop.net
    Now I have certain doubts/questions: How is Postfix able to communicate with the blackhole list providers (here zen.spamhaus.org and bl.spamcop.net), i.e. how does the whole process work, so that I can test its performance? Can a header be added to the received mail with the result of the above verification? Since my incoming Postfix server will never send anything back, I need this feature. Please post relevant links for reference. Thanks in advance! Ashish
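
    On the "how does the whole process work" part: for reject_rbl_client, Postfix takes the connecting client's IPv4 address, reverses its octets, appends the configured zone (e.g. zen.spamhaus.org), and performs an ordinary DNS A-record lookup; an answer in 127.0.0.0/8 means "listed" and triggers the reject. A minimal C# sketch of that same lookup, useful for testing a list by hand (127.0.0.2 is the conventional always-listed test address); this illustrates the mechanism and is not Postfix code:
      using System;
      using System.Linq;
      using System.Net;
      using System.Net.Sockets;

      // Sketch only: performs the DNSBL query Postfix makes for reject_rbl_client.
      class RblCheck
      {
          static bool IsListed(string clientIp, string zone)
          {
              // Reverse the octets and append the zone: 192.0.2.99 -> 99.2.0.192.zen.spamhaus.org
              string query = string.Join(".", clientIp.Split('.').Reverse().ToArray()) + "." + zone;
              try
              {
                  // If the name resolves at all, the client is listed (answers come back in 127.0.0.0/8).
                  return Dns.GetHostAddresses(query).Length > 0;
              }
              catch (SocketException)
              {
                  return false; // NXDOMAIN: not listed
              }
          }

          static void Main()
          {
              Console.WriteLine(IsListed("127.0.0.2", "zen.spamhaus.org")); // should print True
              Console.WriteLine(IsListed("127.0.0.1", "zen.spamhaus.org")); // should print False
          }
      }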

    Read the article

  • NHibernate equivalent of LinqToEntitiesDomainService in RIA

    - by VexXtreme
    Hi, When using Entity Framework with RIA domain services, domain services inherit from LinqToEntitiesDomainService, which, I suppose, allows you to make LINQ queries at a low level (client-side) that propagate into the ORM; meaning that all queries are performed on the database and only relevant results are retrieved to the server and thus the client. Example:
      var query = context.GetCustomersQuery().Where(x => x.Age > 50);
    Right now we have a domain service which inherits from DomainService and retrieves data through an NHibernate session, as in:
      public virtual IQueryable<Customer> GetCustomers()
      {
          return sessionManager.Session.Linq<Customer>();
      }
    The problem with this approach is that it's impossible to make specific queries without retrieving entire tables to the server (or client) and filtering them there. Is there a way to make LINQ querying work with NHibernate over RIA like it works with EF? If not, we're willing to switch to EF because of this, because the performance impact would be just too severe. Thanks
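
    If client-side LINQ composition over the NHibernate-backed service turns out not to be supported in your RIA Services version, one hedged workaround sketch is to keep the filtering on the server by exposing parameterised query methods, so the Where still runs inside the NHibernate-generated SQL. ISessionManager here stands in for the sessionManager dependency in the question, and the RIA Services usings are omitted:
      using System.Linq;

      // Sketch only: the parameterised method keeps the age filter in the database
      // query instead of materialising the whole Customer table.
      public class CustomerDomainService : DomainService
      {
          private readonly ISessionManager sessionManager;

          public CustomerDomainService(ISessionManager sessionManager)
          {
              this.sessionManager = sessionManager;
          }

          public IQueryable<Customer> GetCustomersOlderThan(int age)
          {
              return sessionManager.Session.Linq<Customer>().Where(c => c.Age > age);
          }
      }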

    Read the article

  • Fastest way to convert datatable to generic list

    - by Joel Coehoorn
    I have a data tier select method that returns a DataTable. It's called from a business tier method that should then return a strongly typed generic List. What I want to do is very similar to (but not the same as) this question: http://stackoverflow.com/questions/208532/how-do-you-convert-a-datatable-into-a-generic-list What's different is that I want the list to contain strongly typed objects rather than DataRows (also, I don't have LINQ available here yet). I'm concerned about performance. The business tier method will in turn be called from the presentation tier, and the results will be iterated for display to the user. It seems very wasteful to add an extra iteration at the business tier, only to do it again right away for the presentation, so I want this to be as quick as possible. This is a common task, so I'm really looking for a good pattern that can be repeated over and over.
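
    A minimal pre-LINQ sketch of the usual pattern: a single pass over the DataTable in the business tier, with a Converter delegate doing the row-to-object mapping. Customer and its column names are hypothetical stand-ins for the real entity:
      using System;
      using System.Collections.Generic;
      using System.Data;

      // Sketch only: one pass over the DataTable produces the strongly typed list;
      // the presentation tier then iterates that list directly.
      public static class DataTableMapper
      {
          public static List<T> ToList<T>(DataTable table, Converter<DataRow, T> convert)
          {
              List<T> result = new List<T>(table.Rows.Count); // pre-size to avoid re-allocations
              foreach (DataRow row in table.Rows)
              {
                  result.Add(convert(row));
              }
              return result;
          }
      }

      // Usage in the business tier (works on .NET 2.0, no LINQ needed):
      // List<Customer> customers = DataTableMapper.ToList<Customer>(table, delegate(DataRow r)
      // {
      //     return new Customer((int)r["CustomerId"], (string)r["Name"]);
      // });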

    Read the article

  • Cassandra and asp.net (C#)

    - by Sergey Osypchuk
    I am interested in building a portal on Cassandra services, since I have faced some performance and scaling issues starting from about 1 million records. They could definitely be solved, but I am interested in other options. My main issue is the cost of updating all the necessary indexes to make reading fast. First, is Cassandra a good way to go for ASP.NET programmers? I mean, maybe there are other projects worth taking a look at. And second, can you provide any documentation or samples on how to start programming against Cassandra from C#?

    Read the article

  • Ext GWT (GXT) tooltip over a grid row

    - by Eduardo Palma
    I'm developing a custom tooltip using Ext GWT (GXT) for a project of mine, and this tooltip has to appear over Grid rows when they're selected. I can't use the default GXT tooltip or quicktip because I need to be able to add Components (like buttons) to this tooltip. The problem is that the GXT Grid component doesn't expose an event related to mousing over a row (although there are RowClick and RowMouseDown). I tried adding a listener to the Grid with the OnMouseOver and OnMouseOut events anyway, but it doesn't work as expected: it fires these events whenever you mouse over any of the divs and spans that compose a row. The only way I see to solve this is to subclass the GridView component and make each row become a Component itself, but that would be a lot of work and would probably impact performance as well. I can't help but think there's a better way to do this. Could someone more experienced with GXT shed some light on this?

    Read the article

  • Why is WPFToolkit DataGrid so slow when binding?

    - by Schneider
    I have a very simple test application where I have two objects, each with a small collection of items. When I select an object, I display its collection in a WPFToolkit DataGrid. The problem is there is a noticeable delay, such that if you press the up/down keys to toggle selection between objects you can see it can't keep up. Why is the performance so bad?
      <Window x:Class="SlowGridBinding.MainWindow"
              xmlns="http://schemas.microsoft.com/winfx/2006/xaml/presentation"
              xmlns:x="http://schemas.microsoft.com/winfx/2006/xaml"
              xmlns:Controls="clr-namespace:Microsoft.Windows.Controls;assembly=WPFToolkit"
              Title="MainWindow" Height="350" Width="525">
          <StackPanel>
              <ListBox ItemsSource="{Binding Shops}" DisplayMemberPath="Name" IsSynchronizedWithCurrentItem="True"/>
              <Controls:DataGrid ItemsSource="{Binding Shops/Vegetables}" AutoGenerateColumns="True"/>
          </StackPanel>
      </Window>
    The DataContext is populated with some test classes filled with 50 items of random test data.
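
    One frequent hypothesis for this symptom (offered as an assumption, not a diagnosis) is that AutoGenerateColumns="True" makes the toolkit DataGrid rebuild its columns every time the selection swaps the bound ItemsSource. A code-behind sketch that defines the columns once and turns auto-generation off, assuming the grid is given x:Name="vegetableGrid" in the XAML and that Vegetable exposes Name and Price properties:
      using System.Windows;
      using System.Windows.Data;
      using Microsoft.Windows.Controls; // WPFToolkit DataGrid namespace

      namespace SlowGridBinding
      {
          // Sketch only: columns are built once in the constructor, so selection
          // changes only swap the rows instead of regenerating the columns.
          public partial class MainWindow : Window
          {
              public MainWindow()
              {
                  InitializeComponent();

                  vegetableGrid.AutoGenerateColumns = false;
                  vegetableGrid.Columns.Add(new DataGridTextColumn
                  {
                      Header = "Name",
                      Binding = new Binding("Name")
                  });
                  vegetableGrid.Columns.Add(new DataGridTextColumn
                  {
                      Header = "Price",
                      Binding = new Binding("Price")
                  });
              }
          }
      }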

    Read the article

  • How to read time phased data from Project Server 2007 directly from Project Server Database ?

    - by Nikhil Vaghela
    I am working on a custom web part for Project Web Access, for Project Server 2007. So far we are using only the PSI web services to read and write data from and to the Project Server 2007 databases. But there is a significant performance issue when you retrieve time-phased data through the Statusing web service; it is basically an expensive call for querying time-phased data for each task. I want to access the time-phased data entered by users for each task by hitting the Project Server database directly. [I do not want the solution suggested at this link: http://blogs.msdn.com/project_programmability/archive/2007/05/24/getting-at-the-task-time-phased-data.aspx as it reads data from the reporting database, which only gets entries after the project is published.] I want to get time-phased data as soon as the user enters it. Any idea? Thanks.

    Read the article

  • Serialize struct with pointers to NSData

    - by leolobato
    Hey guys, I need to add some kind of archiving functionality to an Objective-C trie implementation (NDTrie on GitHub), but I have very little experience with C and its data structures.
      struct trieNode
      {
          NSUInteger key;
          NSUInteger count, size;
          id object;
          __strong struct trieNode ** children;
          __strong struct trieNode * parent;
      };

      @interface NDTrie (Private)
      - (struct trieNode*)root;
      @end
    What I need is to create an NSData with the tree structure from that root - or serialize/deserialize the whole tree some other way (conforming to NSCoding?), but I have no clue how to work with NSData and a C struct containing pointers. Performance when deserializing the resulting object is crucial, as this is an iPhone project and I will need to load it in the background every time the app starts. What would be the best way to achieve this? Thanks!

    Read the article

  • Memory leak in xbap application

    - by Arvind
    Hi, We are using many custom controls, created by inheriting from the WPF controls as a base and customizing them for our needs. However, the memory used by these controls is not released, even after the pages using the controls are closed, until the whole application is closed. As the application has to run for a whole day, performance decreases as more and more memory gets held up. When we profiled our pages we found that the controls were not getting collected because some binding references, borders, brushes, etc. were not getting cleared from the controls. We tried using the Unload event of the controls to remove the event handlers and some references. This reduced the leak to some extent, but it slowed down closing of the page, and the Unload event was also getting triggered when the control was merely collapsed. Is there any other way to overcome the leak? Are there any best practices to prevent memory leaks? Thanks Arvind
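
    A minimal sketch of the kind of explicit cleanup often suggested for leaking custom controls, assuming a hypothetical ChartControl that subscribes to an event on a long-lived DataService (both names are illustrative): the handler is removed and bindings cleared in a Dispose method that the page calls when it is genuinely closing, rather than relying on Unloaded, which as noted also fires on collapse:
      using System;
      using System.Windows;
      using System.Windows.Controls;
      using System.Windows.Data;

      // Sketch only: a static/long-lived event publisher keeps the control alive
      // unless the handler is removed, and leftover binding expressions can be
      // cleared explicitly when the control is really done.
      public class ChartControl : UserControl, IDisposable
      {
          private readonly DataService service;

          public ChartControl(DataService service)
          {
              this.service = service;
              service.DataChanged += OnDataChanged; // long-lived publisher -> potential GC root
          }

          private void OnDataChanged(object sender, EventArgs e)
          {
              // refresh visuals here
          }

          public void Dispose()
          {
              service.DataChanged -= OnDataChanged;     // break the reference held by the publisher
              BindingOperations.ClearAllBindings(this); // drop binding expressions held on this control
          }
      }

      public class DataService
      {
          public event EventHandler DataChanged;

          public void NotifyChanged()
          {
              EventHandler handler = DataChanged;
              if (handler != null) handler(this, EventArgs.Empty);
          }
      }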

    Read the article

  • Difference between performSelectorInBackground and NSOperation Subclass

    - by AmitSri
    I have created a test app for running a deep counter loop. I run the loop function in a background thread using performSelectorInBackground, and also in an NSOperation subclass, separately. I use performSelectorOnMainThread to notify the main thread from within the background-thread method, and [NSNotificationCenter defaultCenter] postNotificationName within the NSOperation subclass, to update the UI. Initially both implementations give me the same result and I am able to update the UI without any problem. The only difference I found is the thread count between the two implementations. The performSelectorInBackground implementation created one thread, which got terminated after the loop finished, and my app's thread count went back to 1. The NSOperation subclass implementation created two new threads, which remain in the application, and I can see 3 threads after the loop has finished in the main() function. So, my question is: why were two threads created by NSOperation, and why didn't they get terminated just like the first background-thread implementation? I am a little bit confused and unable to decide which implementation is best in terms of performance and memory management. Thanks

    Read the article

  • What are modern and old compilers written in?

    - by ulum
    Since a compiler, unlike an interpreter, only needs to translate the input and not run it, its own performance should not be as problematic as an interpreter's. Therefore, you wouldn't write an interpreter in, let's say, Ruby or PHP because it would be far too slow. However, what about compilers? If you wrote a compiler in a scripting language, perhaps even one suited to rapid development, you could possibly cut the source code and initial development time in half, at least I think so. To be clear: by scripting language I mean interpreted languages having typical features that make programming faster, easier and more enjoyable for the programmer, usually at least. Examples: PHP, Ruby, Python, maybe JavaScript, though that may be an odd choice for a compiler. What are compilers normally written in? As I suppose you will respond with something low-level like C, C++ or even assembler: why? Are there compilers written in scripting languages? What are the (dis)advantages of using low- or high-level programming languages for compiler writing?

    Read the article

  • Windows Server AppFabric

    - by yuben
    I am considering using Windows Server AppFabric for its caching functionality. I have an existing classic ASP application that I want to rewrite in ASP.NET MVC. However, I want to be able to do this "piecemeal", i.e. a few pages at a time. The problem is sharing session state between the classic ASP and ASP.NET MVC applications. I could use a database, but I would like to use AppFabric since it has good scalability, admin tools, etc. My question is: does the AppFabric caching service/functionality have an API that I could wrap in .NET and expose to my classic ASP application as a COM object? I could then change all the Session and Application caching in the classic application to use the COM object, i.e. AppFabric. In this way I can share session state between ASP.NET MVC and classic ASP. I will have to test the performance penalty associated with interop as well.
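
    A minimal sketch of the COM wrapper being described, assuming the AppFabric caching client (DataCacheFactory/DataCache from Microsoft.ApplicationServer.Caching) and a cache named "session"; the ProgId and cache name are assumptions. After registering the assembly with regasm /codebase, classic ASP would call it via Server.CreateObject("MyCompany.SessionCache"):
      using System;
      using System.Runtime.InteropServices;
      using Microsoft.ApplicationServer.Caching;

      // Sketch only: exposes simple Get/Put over the AppFabric cache to COM so that
      // classic ASP pages and ASP.NET MVC can share the same cached session values.
      [ComVisible(true)]
      [ProgId("MyCompany.SessionCache")]
      [ClassInterface(ClassInterfaceType.AutoDual)]
      public class SessionCache
      {
          // DataCacheFactory is expensive; keep one per process (it reads the cache
          // client configuration from web.config/app.config).
          private static readonly DataCacheFactory factory = new DataCacheFactory();

          private readonly DataCache cache = factory.GetCache("session");

          public void Put(string key, string value)
          {
              cache.Put(key, value);
          }

          public string Get(string key)
          {
              return (string)cache.Get(key);
          }
      }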

    Read the article

  • Is tcerl for Mnesia production ready? Is there any alternatives?

    - by Sanoj
    I would like to create a scalable web service using Mnesia as the database. However, Mnesia by default isn't scalable for persistent storage since it uses Dets (which has a 2GB limit) as its backend. I have seen discussions about extending Mnesia with MnesiaEx and using tcerl as the backend. It sounds good and has shown good performance. However, I have seen in a talk about Tokyo Cabinet and CouchDB with Mnesia that there are some issues: issues with durability, issues with memory leaks, and issues with crashes. Is tcerl + Mnesia really production ready? And are there any other alternatives? How do companies overcome these issues if they use Mnesia in bigger systems? Is there a working solution with Mnesia and Tokyo Tyrant that works better?

    Read the article

  • Android: Filtering a SimpleCursorAdapter ListView

    - by Diego Tori
    Right now, I'm running into issues trying to implement a FilterQueryProvider in my custom SimpleCursorAdapter, since I'm unsure of what to do in the FilterQueryProvider's runQuery function. In other words, since the query that populates my ListView basically gets the rowID, name, and a third column from my database's table, I want to be able to filter the cursor based on the partial value of the name column. However, I am uncertain whether I can do this directly from runQuery without expanding my DB class, since I want to filter the existing cursor; or will I have to create a new query function in my DB class that partially searches my name column, and if so, how would I go about building the query statement using the CharSequence constraint argument passed to runQuery? I am also concerned about the performance implications of running multiple queries based on partial text, since the DB table in question has about 1300-1400 rows. In other words, would I run into a bottleneck trying to filter the cursor?

    Read the article

  • What is the leading LINQ for JavaScript library?

    - by Tom Tresansky
    I'm looking for a JavaScript library that will allow me to query complex JSON objects using a LINQ-like syntax. A quick search found a couple of promising options that look like they might offer what I need: LINQ to JavaScript and jLinq. Does anyone have any experience using them? What are some pros and cons? Is the performance comparable? Does the function-passing syntax of LINQ to JavaScript offer any hidden benefits (I personally find the syntax of jLinq more appealing at first glance)? What have you found lacking in either project? Did you ever try contacting the authors? How responsive were they? Which project is more widely used?

    Read the article

  • jquery error() calls showing up in firebug profile

    - by Aros
    I am working on an ASP.NET application that makes a lot of jQuery and JavaScript calls, and I'm trying to optimize the client-side code as much as possible. (This web application is only designed to run on special hardware that has very low memory and processing power.) The profiler in Firebug is great for figuring out which calls are taking up the most time. I have already optimized a lot of my selectors and it is much faster. However, the profile shows a lot of jQuery error() calls. In the attached image of the Firebug profile window you can see it was called 52 times, accounting for 15.4 of the processing time. Is it normal for jQuery to call its error() like that? My code works flawlessly, and there are no error messages in the Firefox error console. It seems like that is a significant performance hit. Is there any way to get more info on what the errors are? Thanks.

    Read the article

  • Why are listener lists Lists?

    - by Joonas Pulakka
    Why are listener lists (e.g. in Java, those that use addXxxListener() and removeXxxListener() to register and unregister listeners) called lists, and usually implemented as Lists? Wouldn't a Set be a better fit, since in the case of listeners:
      it doesn't matter in which order they get called (although there may well be such needs, but they're special cases; ordinary listener mechanisms make no such guarantees), and
      there's no need to register the same listener more than once (whether doing that should result in calling the same listener 1 time or N times, or be an error, is another question).
    Is it just a matter of tradition? Sets are some kind of lists under the hood anyway. Are there performance differences? Is iterating through a List faster or slower than iterating through a Set? Does either take more or less memory? The differences are certainly almost negligible.

    Read the article

  • Alternative to distributed caching

    - by Chen
    Hi, There is a technical requirement to scale a new system easily. This new system consists of three tiers of applications (acting as batch processors). Each tier will contain at least 2 servers, with the same application residing on each server. So, when one of the tiers reaches peak load, we can extend scalability easily by adding a new server with the same application to off-load some of the processing. The problem is that one or two of the three tiers require heavy caching (about 3 million records and increasing). I'm thinking of using a distributed caching system to overcome this, but the new distributed caching system would mean an additional point of failure, as applications now need to interact with additional caching systems for processing. I'm currently looking at NCache, but am wondering if there are alternatives to this problem, or any other comparable distributed caching system that may be similar to or better than NCache and provides enterprise support too? Thanks, Chen

    Read the article

  • Rendering formatted text in a direct3d application

    - by Fire Lancer
    I need to render some formatted text (colours, different font sizes, underlines, bold, etc.), however I'm not sure how to go about doing it. D3DXFont only allows text of a single font/size/weight/colour to be rendered at once, and I can't see a practical way to "combine" multiple calls to ID3DXFont::DrawText to do such things. I looked around and there don't seem to be any existing libraries that do this, but I also have no idea how to implement such a text renderer, and I couldn't find any documentation on how one would work - only on rendering simple fixed-width ASCII bitmap fonts, which looks like an entirely different approach, only suitable for simple blocks of text where Unicode is not important. If there are no Direct3D font renderers capable of doing this, are there any other renderers (e.g. for rendering rich text in a normal window), and would rendering with those to a texture in RAM, then uploading it to the video card to render onto the back buffer, yield reasonable performance?

    Read the article

  • jQuery Tips and Tricks

    - by roosteronacid
    Miscellaneous:
      Creating an HTML Element and keeping a reference, Checking if an element exists, Writing your own selectors by Andreas Grech
      The data function - bind data to elements by TenebrousX
      The noConflict function - Freeing up the $ variable by Oli
      Check the index of an element in a collection by redsquare
      The jQuery metadata plug-in by kRON
      Live event handlers by TM
      Isolate the $ variable in noConflict mode by nickf
      Replace anonymous functions with named functions by ken
      Microsoft AJAX framework and jQuery bridge by Slace
      jQuery tutorials by egyamado
      Remove elements from a collection and preserve chainability by roosteronacid
      Declare $this at the beginning of anonymous functions by Ben
      FireBug lite, Hotbox plug-in, tell when an image has been loaded and Google CDN by Colour Blend
      Judicious use of third-party jQuery scripts by harriyott
      The each function by Jan Zich
      Form Extensions plug-in by Chris S
    Syntax:
      No-conflict mode by roosteronacid
      Shorthand for the ready-event by roosteronacid
      Line breaks and chainability by roosteronacid
      Nesting filters by Nathan Long
      Cache a collection and execute commands on the same line by roosteronacid
      Contains selector by roosteronacid
      Defining properties at element creation by roosteronacid
    Optimization:
      Optimize performance of complex selectors by roosteronacid
      The context parameter by lupefiasco
      Save and reuse searches by Nathan Long

    Read the article
