Search Results

Search found 30858 results on 1235 pages for 'database tuning'.

  • Advanced All In One .NET Framework (should I go for a software factory?)

    - by alfredo dobrekk
    Hi, I'm starting a new project that would basically take input from users across about 30 screens and save it to a database, and I would like to find a framework that provides the maximum number of these features out of the box:

    - .NET, C#, Windows Forms
    - unit testing, continuous integration, logging
    - screens with lists, combo boxes, text boxes, and add/delete/save/cancel actions that are easy to update when you add a property to your classes or a field to your database
    - auto-completion on controls to help the user find his way
    - use of an ORM like NHibernate
    - easy multithreading and display of wait screens for the user
    - easy undo/redo
    - tabbed child windows
    - search forms
    - the ability to grant access to some functionality according to user profiles
    - MVP/MVVM or whatever design patterns
    - either code generation from the database to C# classes, or generation of the database schema from C# classes
    - some kind of database versioning/upgrade to easily update the database when I release patches to the application once in production
    - automatic control resizing
    - code metrics analysis
    - a code generator I can run against my entities that would generate rough forms I can rearrange afterwards
    - a code documentation generator
    - ...

    At this point I have 3 options:

    1. Build from scratch on top of the CLR :(
    2. Find the functionality among several open source frameworks and use them as an infrastructure stack
    3. Find a "software factory"

    I know it's a lot, but I really would like to build on existing code so I can focus on business rules. What open source tools would you use to achieve this?

  • PDO closeCursor Error

    - by Metropolis
    Hey everyone, I currently have a database layer that I wrote myself, and I have been using it for over a year without any problems. The database class uses PDO, and there are two different databases that I regularly connect to (MySQL and MS SQL). The MS SQL database is used for Accpac accounting storage, and the MySQL database is used for everything else. One of the MySQL databases lists all of the DSNs, which I use to build the string I need to connect to the MS SQL databases. I have a new program I am trying to write in which I take employee data from one of the MySQL databases and use the employee ID to get the employee's information from the MS SQL database. For some reason, whenever I run the program it gets through about 1200 records (out of 11k) and then crashes with an error like the following:

        Fatal error: Call to a member function closeCursor() on a non-object

    I have tried moving the loops around in many different ways, and I have tried manually closing the connections by setting the database handle to null. Nothing I do seems to work. Thanks for any help! Metropolis

  • Drools - Doing Complex Stuff inside a Rule Condition or Consequence

    - by mfcabrera
    Hello, in my company we are planning to use Drools as a BRE for a couple of projects, and we are now trying to define some best practices. My question is what should and shouldn't be done inside a rule condition/consequence, given that we can write Java directly or call methods (for example, on a global object in the working memory).

    Example: a rule evaluates whether a generic object (e.g. Person) has a property set to true, and that specific property can only be determined by going to the database and fetching the information. So we have two ways of implementing it.

    Alternative A:

    - Go to the database and fetch the object's property (true/false, a code)
    - Insert the object into the working memory
    - Evaluate the rule

    Alternative B:

    - Insert a global object that has a method which connects to the database and checks the property for a given object
    - Insert the object to evaluate into the working memory
    - In the rule, call the global object and perform the access to the database

    Which of these is considered better? I really like A, but sometimes B is more straightforward. However, what would happen if something like an exception were raised from the database? I have seen alternative B implemented in the Drools 5.0 book from Packt Publishing, but they mock the database access and don't talk about the actual implications of going to the database at all. Thank you,
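
    A minimal sketch of alternative A, with the database access pulled out in front of the engine so the rule itself stays free of I/O. PersonDao and the Person accessors are hypothetical stand-ins for whatever data access and fact classes the project already has; only the session calls are plain Drools 5 API:

        import org.drools.runtime.StatefulKnowledgeSession;

        // Hypothetical DAO standing in for the real data access layer.
        interface PersonDao {
            boolean isFlagged(long personId); // the property only the database knows
        }

        class RuleRunner {
            private final StatefulKnowledgeSession ksession;
            private final PersonDao dao;

            RuleRunner(StatefulKnowledgeSession ksession, PersonDao dao) {
                this.ksession = ksession;
                this.dao = dao;
            }

            void evaluate(Person person) {
                // Alternative A: resolve the database-backed property up front...
                person.setFlagged(dao.isFlagged(person.getId()));
                // ...so the working memory only ever sees fully populated facts.
                ksession.insert(person);
                ksession.fireAllRules();
            }
        }

    One practical consequence of this shape: a database exception now surfaces in plain application code before any rule fires, instead of in the middle of rule evaluation inside the engine.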

  • Where do you put your dependencies?

    - by The All Foo
    If I use the dependency injection pattern to remove dependencies, they end up somewhere else. For example, see Snippet 1, or what I call an "object maker". I mean, you have to instantiate your objects somewhere... so when you move a dependency out of one object, you end up putting it in another one. I see that this consolidates all my dependencies into one object. Is that the point, to reduce your dependencies so that they all reside in a single (or as close to a single as possible) location?

    Snippet 1 - Object Maker

        <?php
        class ObjectMaker {
            public function makeSignUp() {
                $DatabaseObject = new Database();
                $TextObject = new Text();
                $MessageObject = new Message();
                $SignUpObject = new ControlSignUp();
                $SignUpObject->setObjects($DatabaseObject, $TextObject, $MessageObject);
                return $SignUpObject;
            }

            public function makeSignIn() {
                $DatabaseObject = new Database();
                $TextObject = new Text();
                $MessageObject = new Message();
                $SignInObject = new ControlSignIn();
                $SignInObject->setObjects($DatabaseObject, $TextObject, $MessageObject);
                return $SignInObject;
            }

            public function makeTweet( $DatabaseObject = NULL, $TextObject = NULL, $MessageObject = NULL ) {
                if( $DatabaseObject == 'small' ) {
                    $DatabaseObject = new Database();
                } else if( $DatabaseObject == NULL ) {
                    $DatabaseObject = new Database();
                    $TextObject = new Text();
                    $MessageObject = new Message();
                }
                $TweetObject = new ControlTweet();
                $TweetObject->setObjects($DatabaseObject, $TextObject, $MessageObject);
                return $TweetObject;
            }

            public function makeBookmark( $DatabaseObject = NULL, $TextObject = NULL, $MessageObject = NULL ) {
                if( $DatabaseObject == 'small' ) {
                    $DatabaseObject = new Database();
                } else if( $DatabaseObject == NULL ) {
                    $DatabaseObject = new Database();
                    $TextObject = new Text();
                    $MessageObject = new Message();
                }
                $BookmarkObject = new ControlBookmark();
                $BookmarkObject->setObjects($DatabaseObject, $TextObject, $MessageObject);
                return $BookmarkObject;
            }
        }

  • Looking for an email template engine for end-users ...

    - by RizwanK
    We have a number of customers that we have to send monthly invoices to. Right now, I'm maintaining a codebase that runs SQL queries against our customer database and billing database, places that data into emails, and sends them. I grow weary of maintaining this every time we want to include a new promotion or change our customer service phone numbers. So I'm looking for a replacement that moves more of this into the hands of those requesting the changes. In my ideal world, I need:

    - A WYSIWYG (man, does anyone even say that anymore?) email editor that generates templates based upon the output from a database query.
    - The ability to drag and drop various fields from the database query into the email template.
    - Display of sample email results with the database query.
    - A web application, preferably not requiring IIS.
    - As little code as possible for the end user, but basic functionality allowed (i.e. arrays/for loops).
    - Either comes with its own email delivery engine, or writes output in a way that I can easily write a Python script to deliver the email.
    - Support for generic database connectors. (I need MSSQL and MySQL.)
    - F/OSS

    So... can anyone suggest a project like this, or some tools that'd be useful for rolling my own? (My current alternative idea is using something like ERB or Tenjin, having them write the code, but not having live preview for the editor would suck...)

  • Quick and easy flood protection?

    - by James P
    I have a site where a user submits a message using AJAX to a file called like.php. In this file the user's message is inserted into a database, and the file then sends a link back to the user. In my JavaScript code I disable the text box the user types into when they submit the AJAX request. The only problem is, a malicious user can just constantly send POST requests to like.php and flood my database. So I would like to implement simple flood protection. I don't really want the hassle of another database table logging users' IPs and such, because if they are flooding my site there will be a lot of database reads/writes slowing it down. I thought about using sessions instead: keep a session variable holding a timestamp, and check it every time they send data to like.php. If the current time is past the timestamp, let them add data to the database and update their session with a new timestamp; otherwise send out an error and block them. What do you think? Would this be the best way to go about it, or are there easier alternatives? Thanks for any help. :)
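
    For what the question proposes, the whole throttle fits in a handful of lines and touches no extra table. Here is a sketch of the same idea written as a Java servlet purely for illustration (the PHP version maps one-to-one onto $_SESSION); the 30-second window and the attribute name are arbitrary choices:

        import java.io.IOException;
        import javax.servlet.ServletException;
        import javax.servlet.http.HttpServlet;
        import javax.servlet.http.HttpServletRequest;
        import javax.servlet.http.HttpServletResponse;
        import javax.servlet.http.HttpSession;

        public class LikeServlet extends HttpServlet {
            private static final long WINDOW_MS = 30000L; // arbitrary cooldown

            @Override
            protected void doPost(HttpServletRequest req, HttpServletResponse resp)
                    throws ServletException, IOException {
                HttpSession session = req.getSession(true);
                Long nextAllowed = (Long) session.getAttribute("nextAllowed");
                long now = System.currentTimeMillis();

                if (nextAllowed != null && now < nextAllowed) {
                    // Still inside the cooldown window: reject without touching the database.
                    resp.sendError(429, "Too many requests");
                    return;
                }
                session.setAttribute("nextAllowed", now + WINDOW_MS);
                // ... insert the message into the database and send the link back ...
            }
        }

    One caveat worth keeping in mind: sessions ride on a cookie, so a client that simply refuses the cookie gets a fresh session on every request. This stops accidental double-posts and naive scripts more than a determined attacker; for the latter, some per-IP state is hard to avoid.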

  • Who needs singletons?

    - by sexyprout
    Imagine you access your MySQL database via PDO. You have some functions, and in these functions you need to access the database. The first thing I thought of was a global, like this:

        $db = new PDO('mysql:host=127.0.0.1;dbname=toto', 'root', 'pwd');

        function some_function() {
            global $db;
            $db->query('...');
        }

    But that's considered bad practice. So, after a little searching, I ended up with the singleton pattern, which "applies to situations in which there needs to be a single instance of a class." Following the manual's example, we should do this:

        class Database {
            private static $instance, $db;

            private function __construct() {}

            static function singleton() {
                if (!isset(self::$instance))
                    self::$instance = new self();
                return self::$instance;
            }

            function get() {
                if (!isset(self::$db))
                    self::$db = new PDO('mysql:host=127.0.0.1;dbname=toto', 'user', 'pwd');
                return self::$db;
            }
        }

        function some_function() {
            $db = Database::singleton();
            $db->get()->query('...');
        }

        some_function();

    But I just can't understand why you need that big class when you can do it merely with:

        class Database {
            private static $db;

            private function __construct() {}

            static function get() {
                if (!isset(self::$db))
                    self::$db = new PDO('mysql:host=127.0.0.1;dbname=toto', 'user', 'pwd');
                return self::$db;
            }
        }

        function some_function() {
            Database::get()->query('...');
        }

        some_function();

    This last one works perfectly and I don't need to worry about $db anymore. But maybe I'm forgetting something. So, who's wrong and who's right?

  • How to extend an existing Ruby on Rails CMS to host multiple sites?

    - by Andrew
    I am trying to build a CMS I can use to host multiple sites. I know I'm going to end up reinventing the wheel a million times with this project, so I'm thinking about extending an existing open source Ruby on Rails CMS to meet my needs. One of those needs is to be able to run multiple sites while using only one codebase. That way, when there's an update I want to make, I can update it in one place and the change is reflected on all of the sites. I think that this will be able to scale by running multiple instances of the application, and that I can use the domain/subdomain to determine which data to display. For example, someone goes to subdomain1.mysite.com and the application looks in the database for the content for subdomain1. The problem I see is that most pre-built CMS solutions, including the one I want to use, are designed to host only one site, so the database is structured to work with one site. However, I had the idea that I could overcome this by "creating a new database" for each site, then specifying which database to connect to based on the domain/subdomain as I mentioned above. I'm thinking of hosting this on Heroku, so I'm wondering what my options for this might be. I'm not very familiar with Amazon S3 or Amazon SimpleDB, but I feel like there's some sort of "cloud database" that would make this solution a lot more realistic than creating a new MySQL database for each site. What do you think? Am I thinking about this the wrong way? What advice do you have to offer in this area?
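
    Whatever the storage ends up being, the routing piece itself is small. Here is a sketch of the host-header lookup, written in Java purely for illustration (in Rails the same idea would live in a before_filter that switches the connection); the tenant registry is a hypothetical stand-in for however site configurations get stored:

        import java.util.Map;
        import javax.sql.DataSource;

        // Hypothetical registry mapping a subdomain to that site's own DataSource.
        class TenantResolver {
            private final Map<String, DataSource> tenants;

            TenantResolver(Map<String, DataSource> tenants) {
                this.tenants = tenants;
            }

            DataSource resolve(String hostHeader) {
                // "subdomain1.mysite.com" -> "subdomain1"
                String sub = hostHeader.split("\\.")[0];
                DataSource ds = tenants.get(sub);
                if (ds == null) {
                    throw new IllegalArgumentException("Unknown site: " + sub);
                }
                return ds;
            }
        }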

  • servlet connection to DB

    - by underW
    Initially, after reading books on the subject, I firmly believed that the algorithm for working with a database from a servlet was as follows: create a connection, connect to the database, form a request, send the request to the database, get the query results, process them, close the connection. Now, with a better understanding of the practical side, I realize that nobody does it that way; everything happens through a connection pool, according to the following algorithm: initialize the servlet and create a connection pool; when a request comes from a user, take a free connection from the pool, form a request, send the request to the database, get the query results, process them, and return the connection to the pool. Now I have this problem: we have 100 users, divided into 10 groups, and each group has its own username and password to connect to the database. Moreover, each group may have different rights to the database. How am I supposed to use a connection pool in this situation? If I understand correctly, a pool is nothing more than a group of similar connections with a single login and password, and here I have 10 username/password pairs. It looks like I cannot use a single pool in this situation. What should I do?
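
    One common answer is simply one small pool per credential set, looked up by group at request time. A minimal sketch, using HikariCP as an example pooling library (the URL, group names, and pool size are placeholders):

        import java.util.HashMap;
        import java.util.Map;
        import javax.sql.DataSource;
        import com.zaxxer.hikari.HikariConfig;
        import com.zaxxer.hikari.HikariDataSource;

        class GroupPools {
            private final Map<String, DataSource> pools = new HashMap<>();

            // Called once at servlet init, with each group's credentials.
            void register(String group, String user, String password) {
                HikariConfig cfg = new HikariConfig();
                cfg.setJdbcUrl("jdbc:mysql://localhost/app"); // placeholder URL
                cfg.setUsername(user);
                cfg.setPassword(password);
                cfg.setMaximumPoolSize(5); // 10 groups x 5 = at most 50 connections
                pools.put(group, new HikariDataSource(cfg));
            }

            // Per request: resolve the caller's group, then borrow from its pool.
            DataSource forGroup(String group) {
                return pools.get(group);
            }
        }

    The alternative is a single pool under one technical account, with the per-group rights enforced in application code; that trades database-level security for fewer idle connections, and which trade is acceptable depends on how much those rights really differ.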

  • How to synchronize two (or n) replication processes for SQL Server databases?

    - by Yauheni Sivukha
    There are two master databases and two read-only copies updated by standard transactional replication. I need to map an entity from both read-only databases; say database A contains orders and database B contains lines. The problem is that replication to one database can lag behind replication to the other, so at the moment of mapping the read-only databases may hold mutually inconsistent data. For example: we stored two orders with lines at 19:00 and 19:03. The mapping process started at 19:05, but by that moment replication for database A had processed all changes up to 19:03, while replication for database B had only processed changes up to 19:00. After mapping we will have an order entity with the order as of 19:03 and lines as of 19:00. Trouble is guaranteed. :) In my particular case both databases have a temporal model, so it is possible to fetch data for any time slice; the problem is to identify the time of the latest replication. Question: how do I synchronize the replication processes for several databases to avoid the situation described above? Or, in other words, how do I compare the last replication time in each database? UPD: The only way I see to synchronize is to continuously write timestamps into service tables in each database and check these timestamps on the replicated servers. Is that an acceptable solution?
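
    The heartbeat idea from the update is a workable pattern: each master keeps touching a timestamp row that replicates along with everything else, and the mapping process reads the heartbeat on both replicas, takes the older of the two, and queries both temporal models as of that instant. A sketch in Java/JDBC, with the heartbeat table and column names assumed:

        import java.sql.Connection;
        import java.sql.ResultSet;
        import java.sql.Statement;
        import java.sql.Timestamp;

        class ReplicationWatermark {
            // Assumes each master periodically runs something like
            //   UPDATE heartbeat SET ts = CURRENT_TIMESTAMP;
            // the row then replicates to the read-only copy like any other change.
            static Timestamp lastReplicated(Connection replica) throws Exception {
                try (Statement st = replica.createStatement();
                     ResultSet rs = st.executeQuery("SELECT ts FROM heartbeat")) {
                    rs.next();
                    return rs.getTimestamp("ts");
                }
            }

            // The safe mapping instant is the older of the two watermarks:
            // both replicas are guaranteed complete up to that point.
            static Timestamp safeSlice(Connection a, Connection b) throws Exception {
                Timestamp ta = lastReplicated(a);
                Timestamp tb = lastReplicated(b);
                return ta.before(tb) ? ta : tb;
            }
        }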

  • Null reference but it's not?

    - by Clint Davis
    This is related to my previous question, but it is a different problem. I have two classes, Server and Database:

        Public Class Server
            Private _name As String
            Public Property Name() As String
                Get
                    Return _name
                End Get
                Set(ByVal value As String)
                    _name = value
                End Set
            End Property

            Private _databases As List(Of Database)
            Public Property Databases() As List(Of Database)
                Get
                    Return _databases
                End Get
                Set(ByVal value As List(Of Database))
                    _databases = value
                End Set
            End Property

            Public Sub LoadTables()
                Dim db As New Database(Me)
                db.Name = "test"
                Databases.Add(db)
            End Sub
        End Class

        Public Class Database
            Private _server As Server
            Private _name As String

            Public Property Name() As String
                Get
                    Return _name
                End Get
                Set(ByVal value As String)
                    _name = value
                End Set
            End Property

            Public Property Server() As Server
                Get
                    Return _server
                End Get
                Set(ByVal value As Server)
                    _server = value
                End Set
            End Property

            Public Sub New(ByVal ser As Server)
                Server = ser
            End Sub
        End Class

    Fairly simple. I use it like this:

        Dim s As New Server
        s.Name = "Test"
        s.LoadTables()

    The problem is in LoadTables in the Server class. When it hits Databases.Add(db) it gives me a NullReference error ("Object reference not set"). I don't understand how it is getting that; all the objects are set. Any ideas? Thanks.

  • Memory Leaks in pickerView using sqlite

    - by Danamo
    I'm adding the self.notes array to a pickerView. This is how I'm setting the array:

        NSMutableArray *notesArray = [[NSMutableArray alloc] init];
        [notesArray addObject:@"-"];
        [notesArray addObjectsFromArray:[dbManager getTableValues:@"Notes"]];
        self.notes = notesArray;
        [notesArray release];

    The info for the pickerView is taken from the database in this method:

        -(NSMutableArray *)getTableValues:(NSString *)table {
            NSMutableArray *valuesArray = [[NSMutableArray alloc] init];
            if (sqlite3_open([self.databasePath UTF8String], &database) != SQLITE_OK) {
                sqlite3_close(database);
                NSAssert(0, @"Failed to open database");
            } else {
                NSString *query = [[NSString alloc] initWithFormat:@"SELECT value FROM %@", table];
                sqlite3_stmt *statement;
                if (sqlite3_prepare_v2(database, [query UTF8String], -1, &statement, nil) == SQLITE_OK) {
                    while (sqlite3_step(statement) == SQLITE_ROW) {
                        NSString *value = [NSString stringWithUTF8String:(char *)sqlite3_column_text(statement, 0)];
                        [valuesArray addObject:value];
                        [value release];
                    }
                    sqlite3_reset(statement);
                }
                [query release];
                sqlite3_finalize(statement);
                sqlite3_close(database);
            }
            return valuesArray;
        }

    But I keep getting memory leaks in Instruments on these lines:

        NSMutableArray *valuesArray = [[NSMutableArray alloc] init];

    and

        [valuesArray addObject:value];

    What am I doing wrong here? Thanks for your help!

  • Migrate from Oracle to MySQL

    - by Cassy
    Hi all. We ran into serious performance problems with our Oracle database, and we would like to try to migrate to a MySQL-based database (either MySQL directly or, preferably, Infobright). The thing is, we need to let the old and the new system overlap for at least some weeks, if not months, before we actually know whether all the features of the new database match our needs. Here is our situation:

    - The Oracle database consists of multiple tables, each with millions of rows. During the day there are literally thousands of statements, which we cannot stop for a migration.
    - Every morning, new data is imported into the Oracle database, replacing some thousands of rows. Copying this process is not a problem, so we could, in theory, import into both databases in parallel.
    - But, and here lies the challenge, for this to work we need an export from the Oracle database with a consistent state from one day. (We cannot export some tables on Monday and some others on Tuesday, etc.) This means that at least the export should finish in less than one day.

    Our first thought was to dump the schema, but I wasn't able to find a tool to import an Oracle dump file into MySQL. Exporting tables as CSV files might work, but I'm afraid it could take too long. So my question is: what should I do? Is there any tool to import Oracle dump files into MySQL? Does anybody have experience with such a large-scale migration? Thanks in advance, Cassy. PS: Please don't suggest performance optimization techniques for Oracle, we already tried a lot :-)

  • SharePoint 2010 – SQL Server has an unsupported version 10.0.2531.0

    - by Jeff Widmer
    I am trying to perform a database attach upgrade to SharePoint Foundation 2010. At this point I am trying to attach the content database to a web application by using Windows PowerShell:

        Mount-SPContentDatabase -Name <DatabaseName> -DatabaseServer <ServerName> -WebApplication <URL> [-Updateuserexperience]

    I am following the directions from this TechNet article: Attach databases and upgrade to SharePoint Foundation 2010. When I go to mount the content database, I receive this error:

        Mount-SPContentDatabase : Could not connect to [DATABASE_SERVER] using integrated security: SQL server at [DATABASE_SERVER] has an unsupported version 10.0.2531.0. Please refer to "http://go.microsoft.com/fwlink/?LinkId=165761" for information on the minimum required SQL Server versions and how to download them.

    At first this did not make sense, because the default SharePoint Foundation 2010 website was running just fine. But then I realized that the default SharePoint Foundation site runs on SQL Server Express, and that I had just installed SQL Server Web Edition (since the database is greater than 4 GB) and restored the database to this version of SQL Server. Checking the documentation link above, I see that SharePoint 2010 requires a 64-bit edition of SQL Server, with the minimum required SQL Server versions as follows:

    - SQL Server 2008 Express Edition Service Pack 1, version number 10.0.2531
    - SQL Server 2005 Service Pack 3 cumulative update package 3, version number 9.00.4220.00
    - SQL Server 2008 Service Pack 1 cumulative update package 2, version number 10.00.2714.00

    The version of SQL Server 2008 Web Edition with Service Pack 1 (the version I installed on this machine) is 10.0.2531.0.

        SELECT @@VERSION:
        Microsoft SQL Server 2008 (SP1) - 10.0.2531.0 (X64) Mar 29 2009 10:11:52 Copyright (c) 1988-2008 Microsoft Corporation Web Edition (64-bit) on Windows NT 6.1 <X64> (Build 7600: ) (VM)

    I had to read the article several times, since the minimum version number for SQL Server Express is 10.0.2531.0. At first I thought I was good with the version of SQL Server 2008 Web that I had installed, also 10.0.2531.0. But reading further, there is a cumulative update (hotfix) for SQL Server 2008 SP1 (NOT the Express edition) that is required for SharePoint 2010 and that bumps the version number to 10.0.2714.0.

    So the solution was to install Cumulative Update Package 2 for SQL Server 2008 Service Pack 1 on my SQL Server 2008 Web Edition, to allow SharePoint 2010 to work with SQL Server 2008 (other than the SQL Server 2008 Express version).

        SELECT @@VERSION (after installing Cumulative Update Package 2):
        Microsoft SQL Server 2008 (SP1) - 10.0.2714.0 (X64) May 14 2009 16:08:52 Copyright (c) 1988-2008 Microsoft Corporation Web Edition (64-bit) on Windows NT 6.1 <X64> (Build 7600: ) (VM)

  • SQLAuthority News – SafePeak’s SQL Server Performance Contest – Winners

    - by pinaldave
    SafePeak, the vendor of unique automated SQL performance acceleration and performance tuning software, has announced the winners of its SQL Performance Contest 2011. The contest was quite unique: the writer of the best, most interesting, and most community-liked "performance story" would win an expensive gadget. The judges were the community DBAs, who participated by "liking" stories and could also win expensive prizes. Robert Pearl, SQL MVP, was the contest supervisor. I liked most of the stories, so I contacted SafePeak, suggested participating in the giveaway, and they gladly accepted. The winner of the best story is Jason Brimhall (USA), with a story about a proc with a fair amount of business logic. Congratulations, Jason! The 3 participants who won the second prize of a $100 amazon.com gift card are Michael Corey (USA), Hakim Ali (USA) and Alex Bernal (USA). And 5 participants won a printed copy of a book of mine (SQL Wait Stats Joes 2 Pros: SQL Performance Tuning Techniques Using Wait Statistics, Types & Queues): Patrick Kansa (USA), Wagner Bianchi (USA), Riyas.V.K (India), Farzana Patwa (USA) and Wagner Crivelini (Brazil). The winners are welcome to send SafePeak their mailing address to receive the prizes (to "info 'at' safepeak.com"). The SafePeak team also asked me to welcome you all to keep sending stories, simply because they (and we all) like to read interesting stuff, and to send them ideas for future contests. You can do it from here: www.safepeak.com/SQL-Performance-Contest-2011/Submit-Story. Congratulations to everybody! I found this very funny video about SafePeak: it looks like someone (maybe the vendor) played with videos once and created this non-commercial-like video. SafePeak dynamic caching is an immediate plug-and-play performance acceleration and scalability solution for cloud, hosted, and business SQL Server applications. By caching in memory the result sets of queries and stored procedures, while keeping all those caches correct and up to date using unique patent-pending technology, SafePeak can fix the SQL performance problems and bottlenecks of most applications, most importantly without actual code changes. By the way, I checked their website prior to this contest announcement and noticed that these days they are running a special end-of-year promotion, giving discounts of 30% to 45%. Since the installation is quick and full testing can be done within a couple of days, those who have the need (performance problems) and budget leftovers should hurry. A free fully functional trial is here: www.safepeak.com/download, while those who want to start with a quote should ping here: www.safepeak.com/quote. Good luck! Reference: Pinal Dave (http://blog.SQLAuthority.com) Filed under: PostADay, SQL, SQL Authority, SQL Performance, SQL Puzzle, SQL Query, SQL Server, SQL Tips and Tricks, T SQL, Technology

  • Special Activities in the OTN Lounge

    - by Bob Rhubart
    What is the OTN Lounge? It's the place for Oracle OpenWorld and JavaOne attendees to hang out, get off your feet, rest up between sessions, recharge your laptop, tablet, or phone, connect with other community members, pick the brains of subject matter experts and community leaders, enjoy some refreshments (coffee and soft drinks in the morning, beer in the afternoon), and avoid the crowds by watching keynote presentations on a plasma screen. But in addition to general chillaxin', the OTN Lounge also hosts several special activities throughout the week...

    OTN Lounge Special Activities

    Sunday: Oracle Social Network Developer Challenge Kick-off (7:00pm - 8:30pm). Want to learn more about Oracle Social Network? Love working with APIs? Enter the Oracle Social Network Developer Challenge and build your dream integration with Oracle's secure, purposeful social network for business. Demonstrate your skills, work with the latest and greatest, and compete for $500 in Amazon gift cards.

    - Go to theappslab.com/osnregisterr
    - Read and agree to the terms and rules.
    - Register yourself with your name, corporate email address, and company.
    - Watch your inbox for a confirmation email from Oracle Social Network.
    - Start coding (individuals or teams welcome).
    - Show off your work to the judges in the OTN Lounge, Wednesday, 4:00pm - 6:00pm.

    Monday (Lounge hours: 8:00am - 7:00pm)

    - RAC Attack (9:00am - 1:00pm): Learn about Oracle Real Application Clusters (RAC) in this collaborative event. You'll work with experts from the IOUG RAC SIG to get an Oracle Database 11gR2 RAC cluster running inside a virtual machine. For more information: RAC Attack at Oracle Open World (Pythian Blog); RAC Attack - Oracle Cluster Database at Home/Events (WikiBooks).
    - Oracle Social Network Developer Challenge Office Hours (4:00pm - 8:00pm): Meet the people behind Oracle Social Network.

    Tuesday (Lounge hours: 8:00am - 7:00pm)

    - RAC Attack (9:00am - 1:00pm)
    - Oracle Social Network Developer Challenge Office Hours (4:30pm - 8:00pm)
    - Oracle Database / Oracle Fusion Middleware Tweet Meet (4:30pm - 6:00pm): Free as in beer! Oracle Database and Oracle Fusion Middleware tweeters, gather in the OTN Lounge for refreshments and conversation with fellow tweeters and Oracle Database and Middleware experts.

    Wednesday (Lounge hours: 8:00am - 6:00pm)

    - RAC Attack (9:00am - 1:00pm)
    - Oracle Social Network Developer Challenge Judging (4:00pm - 6:00pm)
    - Oracle ADF / Oracle Fusion Middleware Meet-up (4:30pm - 5:30pm): Join other Oracle ADF and Oracle Fusion Middleware developers and meet the product managers and engineers behind Oracle ADF, ADF Mobile, and ADF Essentials. Did we mention free beer?

    Thursday (Lounge hours: 8:00am - 2:00pm)

    - RAC Attack (9:00am - 1:00pm)

    The OTN Lounge is located in the Howard St. tent, located by no small coincidence on Howard St. between 3rd and 4th, directly between Moscone North and Moscone South. An Oracle OpenWorld or JavaOne conference badge is required for access to the OTN Lounge.

  • Entity Framework Code-First, OData & Windows Phone Client

    - by Jon Galloway
    Entity Framework Code-First is the coolest thing since sliced bread, Windows Phone is the hottest thing since Tickle-Me-Elmo, and OData is just too great to ignore. As part of the Full Stack project, we wanted to put them together, which turns out to be pretty easy… once you know how.

    EF Code-First CTP5 is available now, and there should be very few breaking changes in the release edition, which is due early in 2011. Note: EF Code-First evolved rapidly, and many of the existing documents and blog posts that were written with earlier versions may now be obsolete or at least misleading.

    Code-First?

    With traditional Entity Framework you start with a database, and from that you generate "entities" - classes that bridge between the relational database and your object-oriented program. With Code-First (Magic-Unicorn) (see Hanselman's write-up and this later write-up by Scott Guthrie), the Entity Framework looks at the classes you created and says "if I had created these classes, the database would have to have looked like this…" and creates the database for you! By deriving your entity collections from DbSet and exposing them via a class that derives from DbContext, you "turn on" database backing for your POCOs with a minimum of code and no hidden designer or configuration files.

    POCO == Plain Old CLR Objects. Your entity objects can be used throughout your applications - in web applications, console applications, Silverlight and Windows Phone applications, etc. In our case, we'll want to read and update data from a Windows Phone client application, so we'll expose the entities through a DataService and hook the Windows Phone client application to that data via proxies. Piece of pie. Easy as cake.

    The Demo Architecture

    To see this at work, we'll create an ASP.NET MVC application which will act as the host for our Data Service. We'll create an incredibly simple data layer using EF Code-First on top of SQL CE 4, and we'll expose the data in a WCF Data Service using the OData protocol. Our Windows Phone 7 client will instantiate the data context via a URI and load the data asynchronously.

    Setting up the Server project with MVC 3, EF Code First, and SQL CE 4

    Create a new application of type ASP.NET MVC 3 and name it DeadSimpleServer. We need to add the latest SQL CE 4 and Entity Framework Code First CTPs to our project. Fortunately, NuGet makes that really easy. Open the Package Manager Console (View / Other Windows / Package Manager Console) and type "Install-Package EFCodeFirst.SqlServerCompact" at the PM> command prompt. Since NuGet handles dependencies for you, you'll see that it installs everything you need to use Entity Framework Code First in your project:

        PM> install-package EFCodeFirst.SqlServerCompact
        'SQLCE (= 4.0.8435.1)' not installed. Attempting to retrieve dependency from source... Done
        'EFCodeFirst (= 0.8)' not installed. Attempting to retrieve dependency from source... Done
        'WebActivator (= 1.0.0.0)' not installed. Attempting to retrieve dependency from source... Done
        You are downloading SQLCE from Microsoft, the license agreement to which is available at http://173.203.67.148/licenses/SQLCE/EULA_ENU.rtf. Check the package for additional dependencies, which may come with their own license agreement(s). Your use of the package and dependencies constitutes your acceptance of their license agreements. If you do not accept the license agreement(s), then delete the relevant components from your device.
        Successfully installed 'SQLCE 4.0.8435.1'
        You are downloading EFCodeFirst from Microsoft, the license agreement to which is available at http://go.microsoft.com/fwlink/?LinkID=206497. Check the package for additional dependencies, which may come with their own license agreement(s). Your use of the package and dependencies constitutes your acceptance of their license agreements. If you do not accept the license agreement(s), then delete the relevant components from your device.
        Successfully installed 'EFCodeFirst 0.8'
        Successfully installed 'WebActivator 1.0.0.0'
        You are downloading EFCodeFirst.SqlServerCompact from Microsoft, the license agreement to which is available at http://173.203.67.148/licenses/SQLCE/EULA_ENU.rtf. Check the package for additional dependencies, which may come with their own license agreement(s). Your use of the package and dependencies constitutes your acceptance of their license agreements. If you do not accept the license agreement(s), then delete the relevant components from your device.
        Successfully installed 'EFCodeFirst.SqlServerCompact 0.8'
        Successfully added 'SQLCE 4.0.8435.1' to EfCodeFirst-CTP5
        Successfully added 'EFCodeFirst 0.8' to EfCodeFirst-CTP5
        Successfully added 'WebActivator 1.0.0.0' to EfCodeFirst-CTP5
        Successfully added 'EFCodeFirst.SqlServerCompact 0.8' to EfCodeFirst-CTP5

    Note: We're using SQL CE 4 with Entity Framework here because they work really well together in a development scenario, but you can of course use Entity Framework Code First with other databases supported by Entity Framework.

    Creating the Model using EF Code First

    Now we can create our model class. Right-click the Models folder and select Add / Class. Name the class Person.cs and add the following code:

        using System.Data.Entity;

        namespace DeadSimpleServer.Models
        {
            public class Person
            {
                public int ID { get; set; }
                public string Name { get; set; }
            }

            public class PersonContext : DbContext
            {
                public DbSet<Person> People { get; set; }
            }
        }

    Notice that the entity class Person has no special interfaces or base class. There's nothing special needed to make it work - it's just a POCO. The context we'll use to access the entities in the application is called PersonContext, but you could name it anything you wanted. The important thing is that it inherits DbContext and contains one or more DbSet properties which hold our entity collections.

    Adding Seed Data

    We need some testing data to expose from our service. The simplest way to get that into our database is to modify the CreateCeDatabaseIfNotExists class in AppStart_SQLCEEntityFramework.cs by adding some seed data to the Seed method:

        protected virtual void Seed( TContext context )
        {
            var personContext = context as PersonContext;
            personContext.People.Add( new Person { ID = 1, Name = "George Washington" } );
            personContext.People.Add( new Person { ID = 2, Name = "John Adams" } );
            personContext.People.Add( new Person { ID = 3, Name = "Thomas Jefferson" } );
            personContext.SaveChanges();
        }

    The CreateCeDatabaseIfNotExists class name is pretty self-explanatory - when our DbContext is accessed and the database isn't found, a new one will be created and populated with the data in the Seed method.

    There's one more step to make that work - we need to uncomment a line in the Start method at the top of the AppStart_SQLCEEntityFramework class and set the context name, as shown here:

        public static class AppStart_SQLCEEntityFramework
        {
            public static void Start()
            {
                DbDatabase.DefaultConnectionFactory = new SqlCeConnectionFactory("System.Data.SqlServerCe.4.0");

                // Sets the default database initialization code for working with Sql Server Compact databases
                // Uncomment this line and replace CONTEXT_NAME with the name of your DbContext if you are
                // using your DbContext to create and manage your database
                DbDatabase.SetInitializer(new CreateCeDatabaseIfNotExists<PersonContext>());
            }
        }

    Now our database and Entity Framework are set up, so we can expose data via WCF Data Services. Note: This is a bare-bones implementation with no administration screens. If you'd like to see how those are added, check out The Full Stack screencast series.

    Creating the OData Service using WCF Data Services

    Add a new WCF Data Service to the project (right-click the project / Add New Item / Web / WCF Data Service). We'll be exposing all the data as read/write. Remember to reconfigure to control and minimize access as appropriate for your own application. Open the code-behind for your service. In our case, the service was called PersonTestDataService.svc, so the code-behind class file is PersonTestDataService.svc.cs:

        using System.Data.Services;
        using System.Data.Services.Common;
        using System.ServiceModel;
        using DeadSimpleServer.Models;

        namespace DeadSimpleServer
        {
            [ServiceBehavior( IncludeExceptionDetailInFaults = true )]
            public class PersonTestDataService : DataService<PersonContext>
            {
                // This method is called only once to initialize service-wide policies.
                public static void InitializeService( DataServiceConfiguration config )
                {
                    config.SetEntitySetAccessRule( "*", EntitySetRights.All );
                    config.DataServiceBehavior.MaxProtocolVersion = DataServiceProtocolVersion.V2;
                    config.UseVerboseErrors = true;
                }
            }
        }

    We're enabling a few additional settings to make it easier to debug if you run into trouble. The ServiceBehavior attribute is set to include exception details in faults, and we're using verbose errors. You can remove both of these when your service is working, as your public production service shouldn't be revealing exception information.

    You can view the output of the service by running the application and browsing to http://localhost:[portnumber]/PersonTestDataService.svc/:

        <service xml:base="http://localhost:49786/PersonTestDataService.svc/"
                 xmlns:atom="http://www.w3.org/2005/Atom"
                 xmlns:app="http://www.w3.org/2007/app"
                 xmlns="http://www.w3.org/2007/app">
          <workspace>
            <atom:title>Default</atom:title>
            <collection href="People">
              <atom:title>People</atom:title>
            </collection>
          </workspace>
        </service>

    This indicates that the service exposes one collection, which is accessible by browsing to http://localhost:[portnumber]/PersonTestDataService.svc/People:

        <?xml version="1.0" encoding="iso-8859-1" standalone="yes"?>
        <feed xml:base="http://localhost:49786/PersonTestDataService.svc/"
              xmlns:d="http://schemas.microsoft.com/ado/2007/08/dataservices"
              xmlns:m="http://schemas.microsoft.com/ado/2007/08/dataservices/metadata"
              xmlns="http://www.w3.org/2005/Atom">
          <title type="text">People</title>
          <id>http://localhost:49786/PersonTestDataService.svc/People</id>
          <updated>2010-12-29T01:01:50Z</updated>
          <link rel="self" title="People" href="People" />
          <entry>
            <id>http://localhost:49786/PersonTestDataService.svc/People(1)</id>
            <title type="text"></title>
            <updated>2010-12-29T01:01:50Z</updated>
            <author>
              <name />
            </author>
            <link rel="edit" title="Person" href="People(1)" />
            <category term="DeadSimpleServer.Models.Person" scheme="http://schemas.microsoft.com/ado/2007/08/dataservices/scheme" />
            <content type="application/xml">
              <m:properties>
                <d:ID m:type="Edm.Int32">1</d:ID>
                <d:Name>George Washington</d:Name>
              </m:properties>
            </content>
          </entry>
          <entry>
            ...
          </entry>
        </feed>

    Let's recap what we've done so far: we defined a POCO model, let EF Code First create and seed the database, and exposed the whole thing over OData with a WCF Data Service. But enough with services and XML - let's get this into our Windows Phone client application.

    Creating the DataServiceContext for the Client

    Use the latest DataSvcUtil.exe from http://odata.codeplex.com. As of today, that's in this download: http://odata.codeplex.com/releases/view/54698. You need to run it with a few options:

    - /uri - This will point to the service URI. In this case, it's http://localhost:59342/PersonTestDataService.svc. Pick up the port number from your running server (e.g., the server formerly known as Cassini).
    - /out - This is the DataServiceContext class that will be generated. You can name it whatever you'd like.
    - /Version - should be set to 2.0.
    - /DataServiceCollection - Include this flag to generate collections derived from the DataServiceCollection base, which brings in all the ObservableCollection goodness that handles your INotifyPropertyChanged events for you.

    Once the generated proxy classes are added to the Windows Phone project, bind the main ListBox to the collection:

        <ListBox x:Name="MainListBox" Margin="0,0,-12,0" ItemsSource="{Binding}" SelectionChanged="MainListBox_SelectionChanged">

    Next, to keep things simple, change the Binding on the two TextBlocks within the DataTemplate to Name and ID:

        <ListBox x:Name="MainListBox" Margin="0,0,-12,0" ItemsSource="{Binding}" SelectionChanged="MainListBox_SelectionChanged">
            <ListBox.ItemTemplate>
                <DataTemplate>
                    <StackPanel Margin="0,0,0,17" Width="432">
                        <TextBlock Text="{Binding Name}" TextWrapping="Wrap" Style="{StaticResource PhoneTextExtraLargeStyle}" />
                        <TextBlock Text="{Binding ID}" TextWrapping="Wrap" Margin="12,-6,12,0" Style="{StaticResource PhoneTextSubtleStyle}" />
                    </StackPanel>
                </DataTemplate>
            </ListBox.ItemTemplate>
        </ListBox>

    Getting the Context

    In the code-behind you'll first declare a member variable to hold the context from the Entity Framework. This is named using convention over configuration: the entity type is Person, so the context is of type PersonContext. You initialize it by providing the URI, in this case using the URL obtained from the Cassini web server:

        PersonContext context = new PersonContext( new Uri( "http://localhost:49786/PersonTestDataService.svc/" ) );

    Create a second member variable of type DataServiceCollection<Person>, but do not initialize it:

        DataServiceCollection<Person> people;

    In the constructor you'll initialize the DataServiceCollection using the PersonContext, then load the people collection using the LoadAsync method, passing in the fully specified URI for the People collection in the web service:

        public MainPage()
        {
            InitializeComponent();
            people = new DataServiceCollection<Person>( context );
            people.LoadAsync( new Uri( "http://localhost:49786/PersonTestDataService.svc/People" ) );
        }

    Note that this method runs asynchronously, and when it is finished the people collection is already populated. Since we didn't need or want to override any of the behavior, we don't implement LoadCompleted. You can use the LoadCompleted event if you need to do any other UI updates, but you don't need to. The final code is as shown below:

        using System;
        using System.Data.Services.Client;
        using System.Windows;
        using System.Windows.Controls;
        using DeadSimpleServer.Models;
        using Microsoft.Phone.Controls;

        namespace WindowsPhoneODataTest
        {
            public partial class MainPage : PhoneApplicationPage
            {
                PersonContext context = new PersonContext( new Uri( "http://localhost:49786/PersonTestDataService.svc/" ) );
                DataServiceCollection<Person> people;

                // Constructor
                public MainPage()
                {
                    InitializeComponent();

                    // Set the data context of the listbox control to the sample data
                    // DataContext = App.ViewModel;
                    people = new DataServiceCollection<Person>( context );
                    people.LoadAsync( new Uri( "http://localhost:49786/PersonTestDataService.svc/People" ) );
                    DataContext = people;
                    this.Loaded += new RoutedEventHandler( MainPage_Loaded );
                }

                // Handle selection changed on ListBox
                private void MainListBox_SelectionChanged( object sender, SelectionChangedEventArgs e )
                {
                    // If selected index is -1 (no selection) do nothing
                    if ( MainListBox.SelectedIndex == -1 )
                        return;

                    // Navigate to the new page
                    NavigationService.Navigate( new Uri( "/DetailsPage.xaml?selectedItem=" + MainListBox.SelectedIndex, UriKind.Relative ) );

                    // Reset selected index to -1 (no selection)
                    MainListBox.SelectedIndex = -1;
                }

                // Load data for the ViewModel Items
                private void MainPage_Loaded( object sender, RoutedEventArgs e )
                {
                    if ( !App.ViewModel.IsDataLoaded )
                    {
                        App.ViewModel.LoadData();
                    }
                }
            }
        }

    With people populated, we can set it as the DataContext and run the application; you'll find that the Name and ID are displayed in the list on the MainPage. Complete source code available here.

  • SQL SERVER – Checklist for Analyzing Slow-Running Queries

    - by pinaldave
    I am currently working on upgrading my class, Microsoft SQL Server 2005/2008 Query Optimization & Performance Tuning, with additional details and more interesting examples. While working on the slide deck, I realized that I need one solid slide with a checklist for analyzing slow-running queries. A quick search on my saved [...]

  • ADNOC talks about 50x increase in performance

    - by KLaker
    If you are still wondering how Exadata can revolutionise your business, then I recommend watching this great video, which was recorded at this year's OpenWorld. First, a little background... The Abu Dhabi National Oil Company for Distribution (ADNOC) is an integrated energy company that was founded in 1973. ADNOC Distribution markets and distributes petroleum products and services within the United Arab Emirates and internationally. As one of the largest and most innovative government-owned petroleum companies in the Arab Gulf, ADNOC Distribution is renowned and respected for the exceptional quality and reliability of its products and services. Its five corporate divisions include more than 200 filling stations (a number that is growing at 8% annually), more than 150 convenience stores, 10 vehicle inspection stations, as well as wholesale and retail sales of bulk fuel, gas, oil, diesel, and lubricants. ADNOC selected Oracle Exadata Database Machine after extensive research because it provided them with a single platform that can run mixed workloads in a single unified machine: "We chose Oracle Exadata Database Machine because it offered a fully integrated and highly engineered system that was ready to deploy. With our infrastructure running all the same technology, we can operate any type of Oracle Database without restrictions and be prepared for business growth," said Ali Abdul Aziz Al-Ali, IT division manager, ADNOC Distribution. ".....we could consolidate our transaction processing and business intelligence onto one platform. Competing solutions are just not capable of doing that." - Awad Ahmed Ali El-Sidiq, Senior Database Administrator, ADNOC Distribution. In this new video, Awad Ahmed Ali El-Sidiq, Senior DBA at ADNOC, talks about the impact that Exadata has had on his team and the whole business. ADNOC is using our engineered systems to drive and manage all their workloads: from transaction systems to payment systems to data warehouse to BI environment. A true Disk-to-Dashboard revolution using Engineered Systems. This engineered approach is delivering a 50x improvement in performance, with some queries running 100x faster! With the help of Exadata, the IT team has even revolutionised some of their data warehouse processes: jobs that used to take over 4 hours now run in a few minutes. To watch the video, click on the image below, which will take you to our Oracle YouTube page: (if the above link does not work, click here: http://www.youtube.com/watch?v=zcRpxc6u5Ic) Now that queries are running 100x faster and jobs are completing in minutes, not hours, what is next for the IT team at ADNOC? Like many of our customers, ADNOC is now looking to take advantage of big data to help them better align their business operations with customer behaviour and customer insights. To help deliver this next level of insight, the IT team is looking at the new features in Oracle Database 12c, such as the new in-memory feature, to deliver even more performance gains. The great news is that Awad Ahmed Ali El-Sidiq was awarded DBA of the Year - EMEA within our Data Warehouse Global Leaders programme; you can see the badge for this award pop up at the start of the video. Well done to everyone at ADNOC, and thanks for spending the time with us at OOW to create this great video.

  • Learn about MySQL with the Authentic MySQL for Beginners course

    - by Antoinette O'Sullivan
    Learn about the MySQL Server and other MySQL products by taking the authentic MySQL for Beginners course. This course covers all the basics, from MySQL download and installation to relational database concepts and database design. It is your first step to becoming a MySQL administrator. You can take this course through one of the following delivery types:

    - Training-on-Demand: Start the class from your desk, at your base, within 24 hours of registering. Read Ben Krug on day 3 of his experience taking the MySQL for Beginners course via the Training-on-Demand option.
    - Live-Virtual Class: Attend this live class from your own office; no travel required. Choose from a selection of events on the schedule to suit different time zones. Delivery languages include English and German.
    - In-Class Event: Attend this class in an education center. Events already on the schedule include:

        Location                          Date              Delivery Language
        Mechelen, Belgium                 14 January 2013   English
        London, England                   5 March 2013      English
        Hamburg, Germany                  25 March 2013     German
        Munich, Germany                   3 June 2013       German
        Budapest, Hungary                 5 February 2013   Hungarian
        Milan, Italy                      11 February 2013  Italian
        Rome, Italy                       4 March 2013      Italian
        Riga, Latvia                      18 February 2013  Latvian
        Amsterdam, Netherlands            21 May 2013       Dutch
        Nieuwegein, Netherlands           18 February 2013  Dutch
        Warsaw, Poland                    18 February 2013  Polish
        Lisbon, Portugal                  25 March 2013     European Portuguese
        Porto, Portugal                   25 March 2013     European Portuguese
        Barcelona, Spain                  11 February 2013  Spanish
        Madrid, Spain                     22 April 2013     Spanish
        Nairobi, Kenya                    14 January 2013   English
        Cape Town, South Africa           22 July 2013      English
        Pretoria, South Africa            22 April 2013     English
        Petaling Jaya, Malaysia           28 January 2013   English
        Ottawa, Canada                    25 March 2013     English
        Toronto, Canada                   25 March 2013     English
        Montreal, Canada                  25 March 2013     English
        Mexico City, Mexico               14 January 2013   Spanish
        San Pedro Garza Garcia, Mexico    5 February 2013   Spanish
        Sao Paolo, Brazil                 29 January 2013   Brazilian Portuguese

    For more information on this or other courses on the authentic MySQL curriculum, go to http://oracle.com/education/mysql. Note that many organizations deploy both Oracle Database and MySQL side by side to serve different needs, and as a database professional you can find training courses on both topics at Oracle University! Check out the upcoming Oracle Database training courses and MySQL training courses. Even if you're only managing Oracle Databases at this point in time, getting familiar with MySQL will broaden your career path, given the growing job demand.

  • SQLBits VI – The sixth sets

    - by Rob Farley
    My involvement stopped with the tagline, but SQLBits VI is on tomorrow. The theme of the event is Performance Tuning, which has nothing to do with Bruce Willis or dead people – unless Bruce Willis has just become a database expert and been shot for dropping an index (some would say that's a crime worthy of the death penalty). It's a shame my involvement hasn't been more, because it's such a terrific event, and it would've been good to have been there for a second time. It's a long way to...(read more)

  • SQL Saturday Richmond, VA

    - by Mike
    Very excited to announce that I'll be holding two sessions at SQL Saturday in VA on April 10th. If there are any frequent readers of SQLTeam.com attending, please make sure to say hi! The topics I'm covering are partitioning and loading data in real time, and an introduction to performance tuning. Hope to see you there! SQL Saturday Richmond Schedule

  • Guidance and Pricing for MSDN 2010

    - by John Alexander
    Sorry for the rather lengthy post here. I get asked this all the time so I decided to post it…Visual Studio 2010 editions will be available on April 12, 2010. Product Features Professional with MSDN Essentials Professional with MSDN Premium with MSDN Ultimate with MSDN Test Professional with MSDN Debugging and Diagnostics IntelliTrace (Historical Debugger)         Static Code Analysis       Code Metrics       Profiling       Debugger   Testing Tools Unit Testing   Code Coverage       Test Impact Analysis       Coded UI Test       Web Performance Testing         Load Testing1         Microsoft Test Manager 2010       Test Case Management2       Manual Test Execution       Fast-Forward for Manual Testing       Lab Management Configuration3       Integrated Development Environment Multiple Monitor Support   Multi-Targeting   One Click Web Deployment   JavaScript and jQuery Support   Extensible WPF-Based Environment Database Development Database Deployment       Database Change Management2       Database Unit Testing       Database Test Data Generation       Data Access   Development Platform Support Windows Development   Web Development   Office and SharePoint Development   Cloud Development   Customizable Development Experience   Architecture and Modeling Architecture Explorer         UML® 2.0 Compliant Diagrams (Activity, Use Case, Sequence, Class, Component)         Layer Diagram and Dependency Validation         Read-only diagrams (UML, Layer, DGML Graphs)         Lab Management Virtual environment setup & tear down3       Provision environment from template3       Checkpoint environment3       Team Foundation Server Version Control2   Work Item Tracking2   Build Automation2   Team Portal2   Reporting & Business Intelligence2   Agile Planning Workbook2   Microsoft Visual Studio Team Explorer 2010   Test Case Management2       MSDN Subscription – Software and Services for Production Use Windows Azure Platform 20 hrs/mo † 50 hrs/mo † 100 hrs/mo † 250 hrs/mo † n/a Microsoft Visual Studio Team Foundation Server 2010   Microsoft Visual Studio Team Foundation Server 2010 CAL   1 1 1 1 Microsoft Expression Studio 3       Microsoft Office Professional Plus 2010, Project Professional 2010, Visio Premium 2010 (following Office 2010 launch)       MSDN Subscription – Software for Development and Testing 4 Windows 7, Windows Server 2008 R2 and SQL Server 2008 Toolkits, Software Development Kits, Driver Development Kits Previous versions of Windows (client and server operation systems)   Previous versions of Microsoft SQL Server   Microsoft Office       Microsoft Dynamics       All other Servers       Windows Embedded operating systems       Teamprise         MSDN Subscription – Other Benefits Technical support incidents 0 2 4 4 2 Priority support in MSDN Forums Microsoft e-learning collections (typically 10 courses or 20 hours) 0 1 2 2 1 MSDN Flash newsletter MSDN Online Concierge MSDN Magazine   System Requirements View View View View View Buy from (MSRP) $799 $1,199 $5,469 $11,899 $2,169 Renew from (MSRP) $549 (upgrade) $799 $2,299 $3,799 $899 † Availability varies by country and subscription level.  Details available on the MSDN site 1. May require one or more Microsoft Visual Studio Load Test Virtual User Pack 2010 2. Requires Team Foundation Server and a Team Foundation Server CAL 3. Requires Microsoft Visual Studio Lab Management 2010 4. Per-user license allows unlimited installations and use for designing, developing, testing, and demonstrating applications. 
UML is a registered trademark of Object Management Group, Inc. Windows is either a registered trademark or trademark of Microsoft Corporation in the United States and/or other countries.

  • SQL SERVER – DMV – sys.dm_exec_query_optimizer_info – Statistics of Optimizer

    - by pinaldave
    Incredibly, SQL Server has so much information to share with us. Every single day, I am amazed by this SQL Server technology. Sometimes I find interesting information just by querying a few of the DMVs, and when I present this info in front of my clients during performance tuning consultancy, they are surprised by my findings. Today, I am going to share one of the hidden gems among the DMVs with you, one which I frequently use to understand what's going on under the hood of SQL Server. SQL Server keeps a record of most of the operations of the query optimizer. We can learn many interesting details about the optimizer which can be utilized to improve the performance of the server:

        SELECT *
        FROM sys.dm_exec_query_optimizer_info
        WHERE counter IN ('optimizations', 'elapsed time', 'final cost',
            'insert stmt', 'delete stmt', 'update stmt',
            'merge stmt', 'contains subquery', 'tables',
            'hints', 'order hint', 'join hint',
            'view reference', 'remote query', 'maximum DOP',
            'maximum recursion level', 'indexed views loaded',
            'indexed views matched', 'indexed views used',
            'indexed views updated', 'dynamic cursor request',
            'fast forward cursor request')

    All occurrence values are cumulative and are set to 0 at system restart. All values for value fields are set to NULL at system restart. I have removed a few of the internal counters from the script above and kept only the documented details. Let us check the result of the above query. As you can see, there is so much vital information revealed by it. I can easily say how many times the optimizer was triggered and what the average time taken by it to optimize my queries was. Additionally, I can determine how many times update, insert, or delete statements were optimized. Using this dynamic management view, I was able to quickly figure out that my client was overusing query hints. If you have been reading my blog, I am sure you are aware of my series related to SQL Server views, SQL SERVER – The Limitations of the Views – Eleven and more…. With this, I can take a quick look and figure out how many times views were used in various solutions within queries. Moreover, you can easily see what fraction of all optimizations involved a given feature. For example, the following query tells me what fraction of total optimizations included a view reference. As this also counts system views and DMVs, the number is a bit higher on my machine:

        SELECT (SELECT CAST(occurrence AS FLOAT)
                FROM sys.dm_exec_query_optimizer_info
                WHERE counter = 'view reference')
               /
               (SELECT CAST(occurrence AS FLOAT)
                FROM sys.dm_exec_query_optimizer_info
                WHERE counter = 'optimizations') AS ViewReferencedFraction

    Reference: Pinal Dave (http://blog.SQLAuthority.com) Filed under: Pinal Dave, PostADay, SQL, SQL Authority, SQL DMV, SQL Optimization, SQL Performance, SQL Query, SQL Scripts, SQL Server, SQL Tips and Tricks, SQLServer, T SQL, Technology

  • .NET Reflector 7.2 Early Access Build 2 Released: Performance Critical

    - by Bart Read
    I've just posted a write-up of some of the performance tuning I've done to improve .NET Reflector 7.2's start-up time here: http://www.reflector.net/2011/05/net-reflector-7-start-up-time-running-out-of-gas-or-pedal-to-the-metal/ You can get the new build from the .NET Reflector homepage at http://www.reflector.net/. Please remember to give us your feedback in the forum, at http://forums.reflector.net/, using the tags #7.2 and #eap. Technorati Tags: reflector,early access,7.2
