Search Results

Search found 14148 results on 566 pages for '2008'.

Page 245/566 | < Previous Page | 241 242 243 244 245 246 247 248 249 250 251 252  | Next Page >

  • (Re)Enabling JavaScript debugger in IE7 with Visual Studio 2008

    - by masterik
    Visual Studio 2008 comes with nice JavaScript debugging features, but I played a little with the NetBeans debugger, which installed the (rather ugly) Script Debugger from Microsoft into my IE. Normally IE should ask which debugger I want to use, but now I can't start debugging with Visual Studio; the Script Debugger starts automatically. After uninstalling the Script Debugger I can't debug in IE at all, and even attaching to the iexplore.exe process doesn't help, so I installed the Script Debugger again. How can I get Visual Studio debugging working in IE again?

    Read the article

  • Importing a CSV file without headers into SQL 2008

    - by Luiggi
    I want to import a CSV with 4.8M records into a SQL 2008 table. I'm trying to do it with the Management Studio wizard, but it keeps trying to recognize a header row, which the CSV doesn't have. I can't find any option to skip this, and although I specify the columns myself, the wizard still tries to find a header row and doesn't import anything without it. The structure of the CSV is "818180","25529","Dario","Pereyra","Rosario","SF","2010-09-02". I've also tried alternatives like BULK INSERT, but then I found out that BULK INSERT can't import files with a text qualifier.
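
    One workaround that is often suggested for quote-qualified files, sketched below under the assumption that the values themselves contain no embedded commas or quotes, is to treat the quote-comma-quote sequence as the field terminator and strip the leftover quotes afterwards. Table and file names here are placeholders, not from the question.

        -- Hypothetical staging table matching the seven columns in the sample row
        CREATE TABLE dbo.ImportStaging (
            Col1 varchar(50), Col2 varchar(50), Col3 varchar(100), Col4 varchar(100),
            Col5 varchar(100), Col6 varchar(10), Col7 varchar(20)
        );

        BULK INSERT dbo.ImportStaging
        FROM 'C:\data\import.csv'            -- placeholder path
        WITH (
            FIELDTERMINATOR = '","',         -- quote-comma-quote separates the fields
            ROWTERMINATOR   = '"\n',         -- closing quote plus newline ends a row
            FIRSTROW        = 1              -- no header row to skip
        );

        -- The first column keeps its leading quote; clean it up after the load
        UPDATE dbo.ImportStaging SET Col1 = REPLACE(Col1, '"', '');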

    Read the article

  • Buying Microsoft SQL Server 2008 Web Edition

    - by Marco
    Hello, probably this isn't the right place, but I'll give it a try. I want to buy Microsoft SQL Server 2008 Web Edition in order to install it remotely on the server. The question is: can I buy a license in the USA and pay in dollars, or do I have to buy it in my country (Portugal)? And since the servers are in Germany, should I buy the license in Germany instead? (If anyone knows a good reseller, I would appreciate it.) Thanks in advance.

    Read the article

  • SQL Server 2008 - Get Geography From Record

    - by user336786
    Hello, I am new to using the geography types in SQL Server 2008. I have a table in my database called "Location" with the following columns and types:

        Location
        --------
        City        nvarchar(256)
        State       nvarchar(256)
        PostalCode  nvarchar(25)
        Latitude    decimal(9, 6)
        Longitude   decimal(9, 6)

    Each Location is related to a Store record in my database. I am trying to find the stores within a 10-mile radius of a postal code or city/state that a user enters. To accomplish this, I know that I need to rely on geographies. At this time I have:

        DECLARE @startingPoint geography;
        SET @startingPoint = geography::STGeomFromText('POINT(-122.34900 47.65100)', 4326);

    That gives me the starting point from a hard-coded textual value. However, I do not know how to convert a lat/long from my Location table into a geography instance so I can continue to work on my query. Thank you!
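
    A minimal sketch of the conversion, assuming the Latitude/Longitude columns are populated and using SRID 4326 as in the hard-coded example (STDistance returns meters for this SRID, hence the factor of 1609.344 meters per mile):

        DECLARE @startingPoint geography;
        SET @startingPoint = geography::STGeomFromText('POINT(-122.34900 47.65100)', 4326);

        -- Build a geography instance from each row's stored lat/long
        -- and keep the rows within ten miles of the starting point.
        SELECT  l.City,
                l.State,
                geography::Point(l.Latitude, l.Longitude, 4326).STDistance(@startingPoint) / 1609.344 AS DistanceInMiles
        FROM    dbo.Location AS l
        WHERE   geography::Point(l.Latitude, l.Longitude, 4326).STDistance(@startingPoint) <= 10 * 1609.344;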

    Read the article

  • Converting VS 2008 Project to VS 2010 - now .aspx won't load

    - by coffeeaddict
    I converted all my other projects fine from VS 2008 to 2010 and they run great. For one project, however, after converting, when I try to run one of the .aspx pages in it I get nothing: no error, just that it cannot display the page. Nothing has changed; the path is still the same, and the IIS website is still the same. I even recreated the site in IIS using the VS option in the web project properties. This is a testing project and only has one .aspx in it. Not sure why I get nothing after converting it. I did not convert it to .NET 4.0; it's still targeting v3.5 in VS 2010.

    Read the article

  • Doing extra initialisations on an MFC Dialog in Visual Studio C++ 2008 Pro

    - by theunanonim
    How do I do extra initialization on a modal dialog before calling DoModal()? The whole application is created using the VS wizards. I have a main dialog (the one that is created automatically when I select a new MFC Application in Visual Studio 2008 Professional). When I click a button on this dialog I want to open another dialog and set a CString value into a CEdit control. My code:

        void MainDlg::OnClickedButtonX()
        {
            SecondDialogClass Dlg2;
            Dlg2.asocVar2Cedit.SetWindowTextW(L"my text");
            Dlg2.DoModal();
        }
        // asocVar2Cedit is the associated control variable for the
        // CEdit control on the second dialog (Right Click > Add Variable.. in VSC++)

    At runtime this code generates a "Debug Assertion" error in winocc. Any ideas? Thank you in advance.
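
    A common pattern for this, sketched here under the assumption that you can add members to SecondDialogClass, is to store the value in a member before DoModal() and apply it in OnInitDialog(), once the child controls actually exist (the assertion fires because the CEdit window has not been created yet when SetWindowText is called):

        // In SecondDialogClass.h: a hypothetical member to carry the value
        CString m_initialText;

        // In SecondDialogClass.cpp
        BOOL SecondDialogClass::OnInitDialog()
        {
            CDialog::OnInitDialog();                      // creates the child controls
            asocVar2Cedit.SetWindowText(m_initialText);   // safe now: the CEdit window exists
            return TRUE;
        }

        // In MainDlg.cpp
        void MainDlg::OnClickedButtonX()
        {
            SecondDialogClass Dlg2;
            Dlg2.m_initialText = L"my text";              // no window yet, so just store the value
            Dlg2.DoModal();                               // OnInitDialog applies it when the dialog is created
        }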

    Read the article

  • Migrating Data to MSSQL 2008

    - by Fred Clown
    I am trying to migrate data from an Informix database to MSSQL 2008, and I've got quite a lot of data to move. I've been trying multiple methods to get the data over, and so far SqlBulkCopy in multiple chunks seems to be the fastest I can find. Does anyone know of a faster means of getting the data over? I'm trying to cut down on the transfer time so that I don't run out of time on my cut-over date. Thanks.
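
    For reference, a minimal SqlBulkCopy sketch in C#, assuming an open IDataReader over the Informix source and a destination table name that is only a placeholder; a table lock, a large batch size, and disabling indexes on the destination during the load are the options that most often speed this up:

        // using System.Data.SqlClient;
        using (var bulk = new SqlBulkCopy(destinationConnectionString,
                                          SqlBulkCopyOptions.TableLock))
        {
            bulk.DestinationTableName = "dbo.TargetTable";  // placeholder name
            bulk.BatchSize = 10000;
            bulk.BulkCopyTimeout = 0;                       // no timeout for long-running loads
            bulk.WriteToServer(informixReader);             // streams rows instead of buffering them all
        }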

    Read the article

  • TFS 2008 ignores team project check-in settings

    - by JoshEarl
    I'm trying to set up our TFS 2008 instance to require that projects build before they can be checked in. I have created a check-in policy using the out of the box "Builds" policy, but I'm still able to check broken projects in after mangling the code and attempting to build the project. We're a small shop, and TFS was originally set up with our team's Active Directory group listed as TFS admins. Is this the problem? Do check-in policies apply to TFS admins? Any other suggestions?

    Read the article

  • View changes nvarchars to varchars in SQL Server 2008

    - by Traples
    I have a view in a SQL Server 2008 db that simply exposes about 20 fields of one table to be consumed by a client via ODBC. When I tried to replicate this view in another database, the client could not consume the data source. Then I noticed some weirdness: the columns in the view are shown in SQL Server Management Studio as varchar(100), while the columns in the table are defined as nvarchar(100). There are no CAST or CONVERT statements in the view; it is a simple SELECT statement. Example:

        Table - Columns: Desc1 (nvarchar(100), null)
        View  - SELECT TOP 100 PERCENT Desc1 FROM...
                Columns: Desc1 (varchar(100), null)

    Any ideas why the columns are defined as varchar in the view instead of nvarchar?
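
    If the table was altered after the view was created, the view's cached column metadata can lag behind the table definition; a hedged first step is to refresh it and then re-check the reported types (the view name below is only a placeholder):

        -- Refresh the view's metadata so its column types match the current table definition
        EXEC sp_refreshview N'dbo.MyView';

        -- Verify what the catalog now reports for the view's columns
        SELECT c.name, t.name AS type_name, c.max_length
        FROM sys.columns AS c
        JOIN sys.types   AS t ON t.user_type_id = c.user_type_id
        WHERE c.object_id = OBJECT_ID(N'dbo.MyView');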

    Read the article

  • Asp.net MVC database connection setting in Visual studio 2008

    - by AK47
    I am trying to learn the ASP.NET MVC framework and was looking at the video tutorial at the link below: http://www.asp.net/learn/mvc-videos/video-395.aspx In this video the very first step is to add a new database to the example application. I have Visual Studio installed on my development machine, but SQL Server Express is running on a different machine, so when I try to add a new database following the same steps as the video I get the following error: "Connections to SQL Server files (*.mdf) require SQL Express 2005 to function properly. Please verify the installation of the component or download from the url". I am assuming this is because Visual Studio is looking for an instance of SQL Express on my local machine, and since it doesn't exist there, it errors out. So how do I tell Visual Studio to connect to a different machine and create the database there? I am using Visual Studio 2008 with .NET 3.5 SP1.
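
    The "SQL Server Database File (*.mdf)" item in that dialog is tied to a local SQL Express instance, so one way around it, sketched here with server, instance, and database names as placeholders, is to create the database directly on the remote instance (for example through Server Explorer or Management Studio) and point a regular connection string at it in Web.config:

        <connectionStrings>
          <!-- "RemoteServer\SQLEXPRESS" and "MvcAppDb" are placeholders for the real server, instance, and database names -->
          <add name="ApplicationDb"
               connectionString="Data Source=RemoteServer\SQLEXPRESS;Initial Catalog=MvcAppDb;Integrated Security=True"
               providerName="System.Data.SqlClient" />
        </connectionStrings>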

    Read the article

  • TFS 2008 and Common libraries folder structure.

    - by Doerr
    I have created a Team Project called "Common Library" that will host code used in numerous different Team Projects throughout TFS. For the sake of argument, let's say we have 2 distinct libraries under the "Common Library" Team Project, MailProject and LoggingProject. Other projects throughout TFS will be using the binary representation of these projects via branching, not the actual source code. What is the best way to set up the folder structure for this Team Project? Do I add the project to "Common Library" and simply include the bin/release folder as part of the project? I have seen some examples of people creating a separate "Deploy" folder. I assume this is synonymous with the bin/release folder?
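
    One layout that is sometimes used for this, purely as a sketch and not something prescribed in the question, keeps a Main branch for source next to a Deploy folder that holds only the checked-in release binaries, so consuming Team Projects branch from Deploy rather than from bin/release:

        $/Common Library
            /MailProject
                /Main        (solution and source; bin folders stay uncommitted)
                /Deploy      (checked-in release binaries that other Team Projects branch from)
            /LoggingProject
                /Main
                /Deploy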

    Read the article

  • Visual Studio 2008 error while debugging an app with "uiAccess=true" in the manifest

    - by Jon Tackabury
    I have a C# WinForms application that has "uiAccess" set to "True" in its manifest file. When I try to start/debug it in Visual Studio 2008 SP1 under Windows 7 x64 (RTM) I get this error: "Running an Accessibility application requires following the steps described in Help." The Help button is a broken link, and clicking OK just closes the application. The application is digitally signed, and I can start it just fine from Windows Explorer. Here is the same bug on Microsoft Connect, but unfortunately it's closed: https://connect.microsoft.com/VisualStudio/feedback/ViewFeedback.aspx?FeedbackID=384183 Question: can anyone else using Vista/Win7 x64 (with UAC enabled) confirm that they experience the same problem? Has anyone seen this problem before, and does anyone have any idea how to work around it?

    Read the article

  • CruiseControl.NET Silverlight Unit Tests Interact with Desktop Windows Server 2008

    - by user292195
    Hi, currently we have CCService running as a domain account because the build scripts deploy to a network location. However, this causes any unit tests that test the view to fail, because the service is not allowed to interact with the desktop. I can change CCService to run as Local System, which works, but then I lose network connectivity. I have also tried setting up a /interactive cmd.exe, but this has been deprecated in Windows Server 2008. Any ideas on this one? Thanks, Jeremy

    Read the article

  • Search server 2008 express working with WSS 3.0 (error when crawl second web application (website) s

    - by tberube
    My Search Server 2008 Express crawls 3 SharePoint servers and 1 Windows file server. The file server and 2 of the SharePoint servers crawl all content, master sites, and sub sites just fine. On the 3rd SharePoint server the default site also crawls just fine. The second web application (content database) crawls the top site and all sub sites, but the sub sites get the following error in the crawl log: "Deleted by the gatherer (The start address or content source that contained this item was deleted and hence this item was deleted.)" I have checked the security rights (OK) and the setting to crawl a SharePoint site (if that were wrong, the top site would not work). Finally, I can crawl 3 other SharePoint sites (web applications/other content databases) on the same server, so it seems to be just this one site.

    Read the article

  • SQL Server 2008 - Conditional Query

    - by Villager
    Hello, SQL is not one of my strong suits. I have a SQL Server 2008 database with a stored procedure that takes in eight int parameters. To keep this question focused, I will use one of these parameters for reference: @isActive int. Each of these int parameters will be -1, 0, or 1, where -1 means "Unknown" or "Don't Care". Basically, if a parameter is NOT -1, I need to consider it in my WHERE clause. Because there are eight int parameters, an IF-ELSE statement does not seem like a good idea, but I do not know how else to do this. Is there an elegant way in SQL to add a WHERE condition only when a parameter does NOT equal a value? Thank you!
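
    A common pattern for this kind of optional filter, sketched here with the @isActive parameter from the question and a placeholder table and column, is to let -1 short-circuit each condition; the other seven parameters follow the same shape. OPTION (RECOMPILE) is often added to such catch-all queries so SQL Server 2008 builds a plan for the actual parameter values.

        SELECT *
        FROM dbo.SomeTable AS t                           -- placeholder table name
        WHERE (@isActive = -1 OR t.IsActive = @isActive)
          -- AND (@param2 = -1 OR t.Column2 = @param2)     -- repeat for the other seven parameters
        OPTION (RECOMPILE);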

    Read the article

  • SQL Server 2008 Management Studio Activity Monitor

    - by Angelo
    Hi, I tried to turn on the Activity Monitor in SQL Server 2008 Management Studio (SSMS) through the options window of the application (Tools | Options | Environment | General | At Startup). I restarted SSMS and I am getting the following message: "This operation does not support connections to Microsoft SQL Server Standard Edition version 8.00.2249." I need to be able to monitor the processes and activity inside the database, since I am investigating a particular application that spends a lot of time retrieving data from the database, and I am thinking it may be due to some locks or some processes. How do I resolve this? Input is highly appreciated. Thanks.

    Read the article

  • SQL Compact 2008 Connection String Problem

    - by Seth
    I have the following code to connect to a SQL Server Compact Edition 2008 database:

        private SqlConnection sqlConn;

        public void createConnection()
        {
            String connectionString = @"Data Source=C:\Projects\somefile.sdf;Persist Security Info=False";
            sqlConn = new SqlConnection(connectionString);
            sqlConn.Open();
        }

    However, I keep getting the following error when sqlConn.Open() is executed: "A network-related or instance-specific error occurred while establishing a connection to SQL Server. The server was not found or was not accessible. Verify that the instance name is correct and that SQL Server is configured to allow remote connections. (provider: SQL Network Interfaces, error: 26 - Error Locating Server/Instance Specified)" Does anyone have any ideas what the problem might be? I can create a connection to the db in the Database Explorer, but it doesn't seem to work in code.
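
    A likely cause is that SqlConnection (System.Data.SqlClient) talks to a full SQL Server instance rather than an .sdf file; for Compact Edition the SqlCeConnection class from System.Data.SqlServerCe is normally used instead. A sketch, assuming the SQL Server Compact runtime is installed and System.Data.SqlServerCe.dll is referenced:

        using System.Data.SqlServerCe;   // ADO.NET provider for SQL Server Compact

        private SqlCeConnection sqlConn;

        public void createConnection()
        {
            // Same file path as in the question; only the provider changes
            String connectionString = @"Data Source=C:\Projects\somefile.sdf;Persist Security Info=False";
            sqlConn = new SqlCeConnection(connectionString);
            sqlConn.Open();
        }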

    Read the article

  • Adding a new power scheme to Windows Server 2008 R2

    - by user296933
    I am trying to add a new power scheme to Windows Server 2008 R2. However, the new power scheme ends up with the same settings as the current one. What am I doing wrong?

        PGLOBAL_POWER_POLICY globalPolicy = new GLOBAL_POWER_POLICY();
        PPOWER_POLICY powerPolicy = new POWER_POLICY();

        printf("Idle sensitivity: %d\n", powerPolicy->user.IdleSensitivityAc);

        GetCurrentPowerPolicies(globalPolicy, powerPolicy);
        powerPolicy->user.IdleSensitivityAc = 100;

        PUINT pwrScheme = new UINT();
        *pwrScheme = 10;

        WritePwrScheme(pwrScheme, (LPCWSTR)"MyScheme", (LPCWSTR)"MyScheme", powerPolicy);
        SetActivePwrScheme(pwrScheme, NULL, NULL);

        GetCurrentPowerPolicies(globalPolicy, powerPolicy);
        printf("Idle sensitivity: %d\n", powerPolicy->user.IdleSensitivityAc);

    Read the article

  • VS 2008 "Choose Data Source" wizard

    - by ELM
    Good day, I'm using Visual Studio Professional 2008 SP1. When I create a connection via the designer, the "Choose Data Source" dialog only lists the following data sources: Microsoft SQL Server Compact 3.5 and Microsoft SQL Server Database File. When I create a connection in the Server Explorer, the list is complete: Microsoft SQL Server Compact 3.5, Microsoft SQL Server Database File, Microsoft SQL Server Compact, ODBC, etc. Please help me out; I need to use SQL Server Compact. I have posted the same problem on the following thread with some screenshots: http://social.msdn.microsoft.com/Forums/en/vssetup/thread/906845c3-69e9-431a-ad07-7da2de684d33

    Read the article

  • SQL Server 2008 - Get Latest Record from Joined Table

    - by user336786
    Hello, I have a SQL Server 2008 database with two tables called Customer and Order, defined as follows:

        Customer: ID, First Name, Last Name
        Order:    ID, CustomerID, Date, Description

    I am trying to write a query that returns all of the customers in my database. If a customer has placed at least one order, I want to return the information associated with the most recent order placed. Currently, I have the following:

        SELECT *
        FROM Customer c
        LEFT OUTER JOIN Order o ON c.[ID] = o.[CustomerID]

    As you can imagine, this will return all of the orders associated with a customer. In reality, though, I only want the most recent one. How do I do this in SQL? Thank you!
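
    One common approach in SQL Server 2008, sketched here against the Customer and Order tables as described, is OUTER APPLY: it keeps customers with no orders and picks each customer's single most recent order (Order is a reserved word, so it needs brackets):

        SELECT  c.*, recent.*
        FROM    Customer AS c
        OUTER APPLY (
            SELECT TOP (1) o.ID AS OrderID, o.Date, o.Description
            FROM   [Order] AS o
            WHERE  o.CustomerID = c.ID
            ORDER BY o.Date DESC
        ) AS recent;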

    Read the article

  • What can you do in ::OnInitDialog() Visual Studio 2008 C++

    - by flirishman
    What can and cannot you do in ::OnInitDialog() in Visual Studio 2008 C++? I would like to write out some text on the dialog at startup. If I put the same code in a push-button OnBnClicked handler it works; if I put it in OnInitDialog, the text does not appear on the screen. I'm assuming that at OnInitDialog time my dialog box is not completely up, so I cannot write on it?

        CRect drawRect;
        drawRect.left = 0;      // Shifts text to the right
        drawRect.right = 300;
        drawRect.top = 0;       // How far down
        drawRect.bottom = 300;

        // Clear out any previous name
        CString strBlank = "Book Name";
        SSTextOut(this->GetDC(), strBlank, &drawRect, DT_LEFT);

    The function I am calling is described at http://www.codeproject.com/KB/GDI/SSTextOut.aspx
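
    A likely explanation is that anything drawn with GetDC() during OnInitDialog is wiped out as soon as the dialog handles its first WM_PAINT, since the window is repainted right after initialization. Moving the drawing into an OnPaint handler, sketched here reusing the question's SSTextOut helper (CMyDlg stands in for your dialog class), makes it persist because it is redrawn on every paint:

        void CMyDlg::OnPaint()
        {
            CPaintDC dc(this);                   // device context scoped to this WM_PAINT

            CRect drawRect(0, 0, 300, 300);      // same rectangle as in the question
            CString strBlank = "Book Name";
            SSTextOut(&dc, strBlank, &drawRect, DT_LEFT);
        }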

    Read the article

  • What are good design practices when working with Entity Framework

    - by AD
    This will apply mostly for an asp.net application where the data is not accessed via soa. Meaning that you get access to the objects loaded from the framework, not Transfer Objects, although some recommendation still apply. This is a community post, so please add to it as you see fit. Applies to: Entity Framework 1.0 shipped with Visual Studio 2008 sp1. Why pick EF in the first place? Considering it is a young technology with plenty of problems (see below), it may be a hard sell to get on the EF bandwagon for your project. However, it is the technology Microsoft is pushing (at the expense of Linq2Sql, which is a subset of EF). In addition, you may not be satisfied with NHibernate or other solutions out there. Whatever the reasons, there are people out there (including me) working with EF and life is not bad.make you think. EF and inheritance The first big subject is inheritance. EF does support mapping for inherited classes that are persisted in 2 ways: table per class and table the hierarchy. The modeling is easy and there are no programming issues with that part. (The following applies to table per class model as I don't have experience with table per hierarchy, which is, anyway, limited.) The real problem comes when you are trying to run queries that include one or many objects that are part of an inheritance tree: the generated sql is incredibly awful, takes a long time to get parsed by the EF and takes a long time to execute as well. This is a real show stopper. Enough that EF should probably not be used with inheritance or as little as possible. Here is an example of how bad it was. My EF model had ~30 classes, ~10 of which were part of an inheritance tree. On running a query to get one item from the Base class, something as simple as Base.Get(id), the generated SQL was over 50,000 characters. Then when you are trying to return some Associations, it degenerates even more, going as far as throwing SQL exceptions about not being able to query more than 256 tables at once. Ok, this is bad, EF concept is to allow you to create your object structure without (or with as little as possible) consideration on the actual database implementation of your table. It completely fails at this. So, recommendations? Avoid inheritance if you can, the performance will be so much better. Use it sparingly where you have to. In my opinion, this makes EF a glorified sql-generation tool for querying, but there are still advantages to using it. And ways to implement mechanism that are similar to inheritance. Bypassing inheritance with Interfaces First thing to know with trying to get some kind of inheritance going with EF is that you cannot assign a non-EF-modeled class a base class. Don't even try it, it will get overwritten by the modeler. So what to do? You can use interfaces to enforce that classes implement some functionality. For example here is a IEntity interface that allow you to define Associations between EF entities where you don't know at design time what the type of the entity would be. public enum EntityTypes{ Unknown = -1, Dog = 0, Cat } public interface IEntity { int EntityID { get; } string Name { get; } Type EntityType { get; } } public partial class Dog : IEntity { // implement EntityID and Name which could actually be fields // from your EF model Type EntityType{ get{ return EntityTypes.Dog; } } } Using this IEntity, you can then work with undefined associations in other classes // lets take a class that you defined in your model. 
// that class has a mapping to the columns: PetID, PetType public partial class Person { public IEntity GetPet() { return IEntityController.Get(PetID,PetType); } } which makes use of some extension functions: public class IEntityController { static public IEntity Get(int id, EntityTypes type) { switch (type) { case EntityTypes.Dog: return Dog.Get(id); case EntityTypes.Cat: return Cat.Get(id); default: throw new Exception("Invalid EntityType"); } } } Not as neat as having plain inheritance, particularly considering you have to store the PetType in an extra database field, but considering the performance gains, I would not look back. It also cannot model one-to-many, many-to-many relationship, but with creative uses of 'Union' it could be made to work. Finally, it creates the side effet of loading data in a property/function of the object, which you need to be careful about. Using a clear naming convention like GetXYZ() helps in that regards. Compiled Queries Entity Framework performance is not as good as direct database access with ADO (obviously) or Linq2SQL. There are ways to improve it however, one of which is compiling your queries. The performance of a compiled query is similar to Linq2Sql. What is a compiled query? It is simply a query for which you tell the framework to keep the parsed tree in memory so it doesn't need to be regenerated the next time you run it. So the next run, you will save the time it takes to parse the tree. Do not discount that as it is a very costly operation that gets even worse with more complex queries. There are 2 ways to compile a query: creating an ObjectQuery with EntitySQL and using CompiledQuery.Compile() function. (Note that by using an EntityDataSource in your page, you will in fact be using ObjectQuery with EntitySQL, so that gets compiled and cached). An aside here in case you don't know what EntitySQL is. It is a string-based way of writing queries against the EF. Here is an example: "select value dog from Entities.DogSet as dog where dog.ID = @ID". The syntax is pretty similar to SQL syntax. You can also do pretty complex object manipulation, which is well explained [here][1]. Ok, so here is how to do it using ObjectQuery< string query = "select value dog " + "from Entities.DogSet as dog " + "where dog.ID = @ID"; ObjectQuery<Dog> oQuery = new ObjectQuery<Dog>(query, EntityContext.Instance)); oQuery.Parameters.Add(new ObjectParameter("ID", id)); oQuery.EnablePlanCaching = true; return oQuery.FirstOrDefault(); The first time you run this query, the framework will generate the expression tree and keep it in memory. So the next time it gets executed, you will save on that costly step. In that example EnablePlanCaching = true, which is unnecessary since that is the default option. The other way to compile a query for later use is the CompiledQuery.Compile method. This uses a delegate: static readonly Func<Entities, int, Dog> query_GetDog = CompiledQuery.Compile<Entities, int, Dog>((ctx, id) => ctx.DogSet.FirstOrDefault(it => it.ID == id)); or using linq static readonly Func<Entities, int, Dog> query_GetDog = CompiledQuery.Compile<Entities, int, Dog>((ctx, id) => (from dog in ctx.DogSet where dog.ID == id select dog).FirstOrDefault()); to call the query: query_GetDog.Invoke( YourContext, id ); The advantage of CompiledQuery is that the syntax of your query is checked at compile time, where as EntitySQL is not. However, there are other consideration... 
Includes Lets say you want to have the data for the dog owner to be returned by the query to avoid making 2 calls to the database. Easy to do, right? EntitySQL string query = "select value dog " + "from Entities.DogSet as dog " + "where dog.ID = @ID"; ObjectQuery<Dog> oQuery = new ObjectQuery<Dog>(query, EntityContext.Instance)).Include("Owner"); oQuery.Parameters.Add(new ObjectParameter("ID", id)); oQuery.EnablePlanCaching = true; return oQuery.FirstOrDefault(); CompiledQuery static readonly Func<Entities, int, Dog> query_GetDog = CompiledQuery.Compile<Entities, int, Dog>((ctx, id) => (from dog in ctx.DogSet.Include("Owner") where dog.ID == id select dog).FirstOrDefault()); Now, what if you want to have the Include parametrized? What I mean is that you want to have a single Get() function that is called from different pages that care about different relationships for the dog. One cares about the Owner, another about his FavoriteFood, another about his FavotireToy and so on. Basicly, you want to tell the query which associations to load. It is easy to do with EntitySQL public Dog Get(int id, string include) { string query = "select value dog " + "from Entities.DogSet as dog " + "where dog.ID = @ID"; ObjectQuery<Dog> oQuery = new ObjectQuery<Dog>(query, EntityContext.Instance)) .IncludeMany(include); oQuery.Parameters.Add(new ObjectParameter("ID", id)); oQuery.EnablePlanCaching = true; return oQuery.FirstOrDefault(); } The include simply uses the passed string. Easy enough. Note that it is possible to improve on the Include(string) function (that accepts only a single path) with an IncludeMany(string) that will let you pass a string of comma-separated associations to load. Look further in the extension section for this function. If we try to do it with CompiledQuery however, we run into numerous problems: The obvious static readonly Func<Entities, int, string, Dog> query_GetDog = CompiledQuery.Compile<Entities, int, string, Dog>((ctx, id, include) => (from dog in ctx.DogSet.Include(include) where dog.ID == id select dog).FirstOrDefault()); will choke when called with: query_GetDog.Invoke( YourContext, id, "Owner,FavoriteFood" ); Because, as mentionned above, Include() only wants to see a single path in the string and here we are giving it 2: "Owner" and "FavoriteFood" (which is not to be confused with "Owner.FavoriteFood"!). Then, let's use IncludeMany(), which is an extension function static readonly Func<Entities, int, string, Dog> query_GetDog = CompiledQuery.Compile<Entities, int, string, Dog>((ctx, id, include) => (from dog in ctx.DogSet.IncludeMany(include) where dog.ID == id select dog).FirstOrDefault()); Wrong again, this time it is because the EF cannot parse IncludeMany because it is not part of the functions that is recognizes: it is an extension. Ok, so you want to pass an arbitrary number of paths to your function and Includes() only takes a single one. What to do? You could decide that you will never ever need more than, say 20 Includes, and pass each separated strings in a struct to CompiledQuery. But now the query looks like this: from dog in ctx.DogSet.Include(include1).Include(include2).Include(include3) .Include(include4).Include(include5).Include(include6) .[...].Include(include19).Include(include20) where dog.ID == id select dog which is awful as well. Ok, then, but wait a minute. Can't we return an ObjectQuery< with CompiledQuery? Then set the includes on that? 
Well, that what I would have thought so as well: static readonly Func<Entities, int, ObjectQuery<Dog>> query_GetDog = CompiledQuery.Compile<Entities, int, string, ObjectQuery<Dog>>((ctx, id) => (ObjectQuery<Dog>)(from dog in ctx.DogSet where dog.ID == id select dog)); public Dog GetDog( int id, string include ) { ObjectQuery<Dog> oQuery = query_GetDog(id); oQuery = oQuery.IncludeMany(include); return oQuery.FirstOrDefault; } That should have worked, except that when you call IncludeMany (or Include, Where, OrderBy...) you invalidate the cached compiled query because it is an entirely new one now! So, the expression tree needs to be reparsed and you get that performance hit again. So what is the solution? You simply cannot use CompiledQueries with parametrized Includes. Use EntitySQL instead. This doesn't mean that there aren't uses for CompiledQueries. It is great for localized queries that will always be called in the same context. Ideally CompiledQuery should always be used because the syntax is checked at compile time, but due to limitation, that's not possible. An example of use would be: you may want to have a page that queries which two dogs have the same favorite food, which is a bit narrow for a BusinessLayer function, so you put it in your page and know exactly what type of includes are required. Passing more than 3 parameters to a CompiledQuery Func is limited to 5 parameters, of which the last one is the return type and the first one is your Entities object from the model. So that leaves you with 3 parameters. A pitance, but it can be improved on very easily. public struct MyParams { public string param1; public int param2; public DateTime param3; } static readonly Func<Entities, MyParams, IEnumerable<Dog>> query_GetDog = CompiledQuery.Compile<Entities, MyParams, IEnumerable<Dog>>((ctx, myParams) => from dog in ctx.DogSet where dog.Age == myParams.param2 && dog.Name == myParams.param1 and dog.BirthDate > myParams.param3 select dog); public List<Dog> GetSomeDogs( int age, string Name, DateTime birthDate ) { MyParams myParams = new MyParams(); myParams.param1 = name; myParams.param2 = age; myParams.param3 = birthDate; return query_GetDog(YourContext,myParams).ToList(); } Return Types (this does not apply to EntitySQL queries as they aren't compiled at the same time during execution as the CompiledQuery method) Working with Linq, you usually don't force the execution of the query until the very last moment, in case some other functions downstream wants to change the query in some way: static readonly Func<Entities, int, string, IEnumerable<Dog>> query_GetDog = CompiledQuery.Compile<Entities, int, string, IEnumerable<Dog>>((ctx, age, name) => from dog in ctx.DogSet where dog.Age == age && dog.Name == name select dog); public IEnumerable<Dog> GetSomeDogs( int age, string name ) { return query_GetDog(YourContext,age,name); } public void DataBindStuff() { IEnumerable<Dog> dogs = GetSomeDogs(4,"Bud"); // but I want the dogs ordered by BirthDate gridView.DataSource = dogs.OrderBy( it => it.BirthDate ); } What is going to happen here? By still playing with the original ObjectQuery (that is the actual return type of the Linq statement, which implements IEnumerable), it will invalidate the compiled query and be force to re-parse. So, the rule of thumb is to return a List< of objects instead. 
static readonly Func<Entities, int, string, IEnumerable<Dog>> query_GetDog = CompiledQuery.Compile<Entities, int, string, IEnumerable<Dog>>((ctx, age, name) => from dog in ctx.DogSet where dog.Age == age && dog.Name == name select dog); public List<Dog> GetSomeDogs( int age, string name ) { return query_GetDog(YourContext,age,name).ToList(); //<== change here } public void DataBindStuff() { List<Dog> dogs = GetSomeDogs(4,"Bud"); // but I want the dogs ordered by BirthDate gridView.DataSource = dogs.OrderBy( it => it.BirthDate ); } When you call ToList(), the query gets executed as per the compiled query and then, later, the OrderBy is executed against the objects in memory. It may be a little bit slower, but I'm not even sure. One sure thing is that you have no worries about mis-handling the ObjectQuery and invalidating the compiled query plan. Once again, that is not a blanket statement. ToList() is a defensive programming trick, but if you have a valid reason not to use ToList(), go ahead. There are many cases in which you would want to refine the query before executing it. Performance What is the performance impact of compiling a query? It can actually be fairly large. A rule of thumb is that compiling and caching the query for reuse takes at least double the time of simply executing it without caching. For complex queries (read inherirante), I have seen upwards to 10 seconds. So, the first time a pre-compiled query gets called, you get a performance hit. After that first hit, performance is noticeably better than the same non-pre-compiled query. Practically the same as Linq2Sql When you load a page with pre-compiled queries the first time you will get a hit. It will load in maybe 5-15 seconds (obviously more than one pre-compiled queries will end up being called), while subsequent loads will take less than 300ms. Dramatic difference, and it is up to you to decide if it is ok for your first user to take a hit or you want a script to call your pages to force a compilation of the queries. Can this query be cached? { Dog dog = from dog in YourContext.DogSet where dog.ID == id select dog; } No, ad-hoc Linq queries are not cached and you will incur the cost of generating the tree every single time you call it. Parametrized Queries Most search capabilities involve heavily parametrized queries. There are even libraries available that will let you build a parametrized query out of lamba expressions. The problem is that you cannot use pre-compiled queries with those. One way around that is to map out all the possible criteria in the query and flag which one you want to use: public struct MyParams { public string name; public bool checkName; public int age; public bool checkAge; } static readonly Func<Entities, MyParams, IEnumerable<Dog>> query_GetDog = CompiledQuery.Compile<Entities, MyParams, IEnumerable<Dog>>((ctx, myParams) => from dog in ctx.DogSet where (myParams.checkAge == true && dog.Age == myParams.age) && (myParams.checkName == true && dog.Name == myParams.name ) select dog); protected List<Dog> GetSomeDogs() { MyParams myParams = new MyParams(); myParams.name = "Bud"; myParams.checkName = true; myParams.age = 0; myParams.checkAge = false; return query_GetDog(YourContext,myParams).ToList(); } The advantage here is that you get all the benifits of a pre-compiled quert. 
The disadvantages are that you most likely will end up with a where clause that is pretty difficult to maintain, that you will incur a bigger penalty for pre-compiling the query and that each query you run is not as efficient as it could be (particularly with joins thrown in). Another way is to build an EntitySQL query piece by piece, like we all did with SQL. protected List<Dod> GetSomeDogs( string name, int age) { string query = "select value dog from Entities.DogSet where 1 = 1 "; if( !String.IsNullOrEmpty(name) ) query = query + " and dog.Name == @Name "; if( age > 0 ) query = query + " and dog.Age == @Age "; ObjectQuery<Dog> oQuery = new ObjectQuery<Dog>( query, YourContext ); if( !String.IsNullOrEmpty(name) ) oQuery.Parameters.Add( new ObjectParameter( "Name", name ) ); if( age > 0 ) oQuery.Parameters.Add( new ObjectParameter( "Age", age ) ); return oQuery.ToList(); } Here the problems are: - there is no syntax checking during compilation - each different combination of parameters generate a different query which will need to be pre-compiled when it is first run. In this case, there are only 4 different possible queries (no params, age-only, name-only and both params), but you can see that there can be way more with a normal world search. - Noone likes to concatenate strings! Another option is to query a large subset of the data and then narrow it down in memory. This is particularly useful if you are working with a definite subset of the data, like all the dogs in a city. You know there are a lot but you also know there aren't that many... so your CityDog search page can load all the dogs for the city in memory, which is a single pre-compiled query and then refine the results protected List<Dod> GetSomeDogs( string name, int age, string city) { string query = "select value dog from Entities.DogSet where dog.Owner.Address.City == @City "; ObjectQuery<Dog> oQuery = new ObjectQuery<Dog>( query, YourContext ); oQuery.Parameters.Add( new ObjectParameter( "City", city ) ); List<Dog> dogs = oQuery.ToList(); if( !String.IsNullOrEmpty(name) ) dogs = dogs.Where( it => it.Name == name ); if( age > 0 ) dogs = dogs.Where( it => it.Age == age ); return dogs; } It is particularly useful when you start displaying all the data then allow for filtering. Problems: - Could lead to serious data transfer if you are not careful about your subset. - You can only filter on the data that you returned. It means that if you don't return the Dog.Owner association, you will not be able to filter on the Dog.Owner.Name So what is the best solution? There isn't any. You need to pick the solution that works best for you and your problem: - Use lambda-based query building when you don't care about pre-compiling your queries. - Use fully-defined pre-compiled Linq query when your object structure is not too complex. - Use EntitySQL/string concatenation when the structure could be complex and when the possible number of different resulting queries are small (which means fewer pre-compilation hits). - Use in-memory filtering when you are working with a smallish subset of the data or when you had to fetch all of the data on the data at first anyway (if the performance is fine with all the data, then filtering in memory will not cause any time to be spent in the db). 
Singleton access The best way to deal with your context and entities accross all your pages is to use the singleton pattern: public sealed class YourContext { private const string instanceKey = "On3GoModelKey"; YourContext(){} public static YourEntities Instance { get { HttpContext context = HttpContext.Current; if( context == null ) return Nested.instance; if (context.Items[instanceKey] == null) { On3GoEntities entity = new On3GoEntities(); context.Items[instanceKey] = entity; } return (YourEntities)context.Items[instanceKey]; } } class Nested { // Explicit static constructor to tell C# compiler // not to mark type as beforefieldinit static Nested() { } internal static readonly YourEntities instance = new YourEntities(); } } NoTracking, is it worth it? When executing a query, you can tell the framework to track the objects it will return or not. What does it mean? With tracking enabled (the default option), the framework will track what is going on with the object (has it been modified? Created? Deleted?) and will also link objects together, when further queries are made from the database, which is what is of interest here. For example, lets assume that Dog with ID == 2 has an owner which ID == 10. Dog dog = (from dog in YourContext.DogSet where dog.ID == 2 select dog).FirstOrDefault(); //dog.OwnerReference.IsLoaded == false; Person owner = (from o in YourContext.PersonSet where o.ID == 10 select dog).FirstOrDefault(); //dog.OwnerReference.IsLoaded == true; If we were to do the same with no tracking, the result would be different. ObjectQuery<Dog> oDogQuery = (ObjectQuery<Dog>) (from dog in YourContext.DogSet where dog.ID == 2 select dog); oDogQuery.MergeOption = MergeOption.NoTracking; Dog dog = oDogQuery.FirstOrDefault(); //dog.OwnerReference.IsLoaded == false; ObjectQuery<Person> oPersonQuery = (ObjectQuery<Person>) (from o in YourContext.PersonSet where o.ID == 10 select o); oPersonQuery.MergeOption = MergeOption.NoTracking; Owner owner = oPersonQuery.FirstOrDefault(); //dog.OwnerReference.IsLoaded == false; Tracking is very useful and in a perfect world without performance issue, it would always be on. But in this world, there is a price for it, in terms of performance. So, should you use NoTracking to speed things up? It depends on what you are planning to use the data for. Is there any chance that the data your query with NoTracking can be used to make update/insert/delete in the database? If so, don't use NoTracking because associations are not tracked and will causes exceptions to be thrown. In a page where there are absolutly no updates to the database, you can use NoTracking. Mixing tracking and NoTracking is possible, but it requires you to be extra careful with updates/inserts/deletes. The problem is that if you mix then you risk having the framework trying to Attach() a NoTracking object to the context where another copy of the same object exist with tracking on. Basicly, what I am saying is that Dog dog1 = (from dog in YourContext.DogSet where dog.ID == 2).FirstOrDefault(); ObjectQuery<Dog> oDogQuery = (ObjectQuery<Dog>) (from dog in YourContext.DogSet where dog.ID == 2 select dog); oDogQuery.MergeOption = MergeOption.NoTracking; Dog dog2 = oDogQuery.FirstOrDefault(); dog1 and dog2 are 2 different objects, one tracked and one not. Using the detached object in an update/insert will force an Attach() that will say "Wait a minute, I do already have an object here with the same database key. Fail". 
And when you Attach() one object, all of its hierarchy gets attached as well, causing problems everywhere. Be extra careful. How much faster is it with NoTracking It depends on the queries. Some are much more succeptible to tracking than other. I don't have a fast an easy rule for it, but it helps. So I should use NoTracking everywhere then? Not exactly. There are some advantages to tracking object. The first one is that the object is cached, so subsequent call for that object will not hit the database. That cache is only valid for the lifetime of the YourEntities object, which, if you use the singleton code above, is the same as the page lifetime. One page request == one YourEntity object. So for multiple calls for the same object, it will load only once per page request. (Other caching mechanism could extend that). What happens when you are using NoTracking and try to load the same object multiple times? The database will be queried each time, so there is an impact there. How often do/should you call for the same object during a single page request? As little as possible of course, but it does happens. Also remember the piece above about having the associations connected automatically for your? You don't have that with NoTracking, so if you load your data in multiple batches, you will not have a link to between them: ObjectQuery<Dog> oDogQuery = (ObjectQuery<Dog>)(from dog in YourContext.DogSet select dog); oDogQuery.MergeOption = MergeOption.NoTracking; List<Dog> dogs = oDogQuery.ToList(); ObjectQuery<Person> oPersonQuery = (ObjectQuery<Person>)(from o in YourContext.PersonSet select o); oPersonQuery.MergeOption = MergeOption.NoTracking; List<Person> owners = oPersonQuery.ToList(); In this case, no dog will have its .Owner property set. Some things to keep in mind when you are trying to optimize the performance. No lazy loading, what am I to do? This can be seen as a blessing in disguise. Of course it is annoying to load everything manually. However, it decreases the number of calls to the db and forces you to think about when you should load data. The more you can load in one database call the better. That was always true, but it is enforced now with this 'feature' of EF. Of course, you can call if( !ObjectReference.IsLoaded ) ObjectReference.Load(); if you want to, but a better practice is to force the framework to load the objects you know you will need in one shot. This is where the discussion about parametrized Includes begins to make sense. Lets say you have you Dog object public class Dog { public Dog Get(int id) { return YourContext.DogSet.FirstOrDefault(it => it.ID == id ); } } This is the type of function you work with all the time. It gets called from all over the place and once you have that Dog object, you will do very different things to it in different functions. First, it should be pre-compiled, because you will call that very often. Second, each different pages will want to have access to a different subset of the Dog data. Some will want the Owner, some the FavoriteToy, etc. Of course, you could call Load() for each reference you need anytime you need one. But that will generate a call to the database each time. Bad idea. So instead, each page will ask for the data it wants to see when it first request for the Dog object: static public Dog Get(int id) { return GetDog(entity,"");} static public Dog Get(int id, string includePath) { string query = "select value o " + " from YourEntities.DogSet as o " +

    Read the article

  • SQL SERVER – Interview Questions & Answers Needs Your Help

    - by pinaldave
    About a year ago, I posted SQL Server related Interview Questions and Answers. It was very well received in the community, and I have received many comments, suggestions and emails on this subject. I am planning to upgrade the Interview Questions and Answers and take them to the next level. Here, I need your help: please post your comments, suggestions, expectations or potential interview questions (along with answers) here. Your input will be very valuable. As time goes by we all learn and get better. There were a few things missing when those interview questions and answers were prepared; now is the time to fill the gap and make these interview questions more useful. If you already know it all, these questions and answers are not for you. They are for those who are eager to learn and need help in this area. If you do not want to leave a comment, I suggest sending me an email at pinal "at" SQLAuthority.com. The following is a reproduction of the original consolidation post for quick reference:

        SQL SERVER – 2008 – Interview Questions and Answers – Part 1
        SQL SERVER – 2008 – Interview Questions and Answers – Part 2
        SQL SERVER – 2008 – Interview Questions and Answers – Part 3
        SQL SERVER – 2008 – Interview Questions and Answers – Part 4
        SQL SERVER – 2008 – Interview Questions and Answers – Part 5
        SQL SERVER – 2008 – Interview Questions and Answers – Part 6
        SQL SERVER – 2008 – Interview Questions and Answers – Part 7
        SQL SERVER – 2008 – Interview Questions and Answers – Part 8
        Download SQL Server 2008 Interview Questions and Answers Complete List

    Reference: Pinal Dave (http://blog.SQLAuthority.com) Filed under: Pinal Dave, Readers Contribution, Readers Question, SQL, SQL Authority, SQL Documentation, SQL Download, SQL Interview Questions and Answers, SQL Query, SQL Server, SQL Tips and Tricks, T SQL, Technology

    Read the article

  • Windows HPC Server links

    I've already described how to set up a Windows HPC Server for development. Before you dive into developing for the cluster, if you are new to this it is probably a good idea to learn the basics by reading some overview material. Below is a list of links.

    Direct links to Windows HPC content
        1. Windows HPC Server 2008 Overview Datasheet (4 page pdf).
        2. Windows HPC Server 2008 Technical Overview (32 page doc).
        3. Windows HPC Server 2008 Getting Started Guide (26 page doc), which is also available online as part of the TechNet technical library section on Windows HPC Server 2008, which includes much more useful material.
        4. Windows HPC Server 2008 Job Scheduler (38 page doc).
        5. Windows HPC Server 2008 Job Templates (56 page doc).
        6. Developing for the Windows HPC Server 2008 Platform (16 page doc or pdf version).

    Windows HPC sites
        7. Windows HPC Forums.
        8. HPC Developer Resources.
        9. Windows HPC Server 2008 Resource Kit - Developer.
        10. Windows HPC Server 2008 - TechNet.
        11. The Windows HPC Team Blog.

    HPC Course
        12. High-Performance Computing Fundamentals Course (Pluralsight).
        13. Classic HPC Development using Visual C++ (course slides and materials in a ZIP). Author's blog post.
        14. From sequential to parallel code (course slides and materials in a ZIP). Author's blog post.

    Next time I will post resources specific to the most popular programming models for the cluster today: MPI and Cluster SOA. Until then, happy reading! Comments about this post are welcome at the original blog.

    Read the article

  • Unable to install SQL 2008 on Windows 7

    - by Axel
    The story: trying to install SQL 2008 on Windows 7 hangs on SqlEngineDBStartconfigAction_install_configrc_Cpu32.

    What I tried:
        - Uninstall hangs on validation.
        - Manual uninstall using msiinv.exe and msiexec /x works.
        - Added the SQL service accounts to local admins: no help.
        - Turned off UAC: no help.

    Last lines in the setup log:

        2010-04-01 16:18:05 SQLEngine: : Checking Engine checkpoint 'GetSqlServerProcessHandle'
        2010-04-01 16:18:05 SQLEngine: --SqlServerServiceSCM: Waiting for nt event 'Global\sqlserverRecComplete' to be created
        2010-04-01 16:18:07 SQLEngine: --SqlServerServiceSCM: Waiting for nt event 'Global\sqlserverRecComplete' or sql process handle to be signaled
        2010-04-01 16:18:07 SQLEngine: : Checking Engine checkpoint 'WaitSqlServerStartEvents'
        2010-04-01 16:18:53 Slp: Sco: Attempting to initialize script
        2010-04-01 16:18:53 Slp: Sco: Attempting to initialize default connection string
        2010-04-01 16:18:53 Slp: Sco: Attempting to set script connection protocol to NotSpecified
        2010-04-01 16:18:53 Slp: Sco: Attempting to set script connection protocol to NamedPipes
        2010-04-01 16:18:53 SQLEngine: --SqlDatabaseServiceConfig: Connection String: Data Source=\.\pipe\SQLLocal\MSSQLSERVER;Initial Catalog=master;Integrated Security=True;Pooling=False;Network Library=dbnmpntw;Application Name=SqlSetup
        2010-04-01 16:18:53 SQLEngine: : Checking Engine checkpoint 'ServiceConfigConnect'
        2010-04-01 16:18:53 SQLEngine: --SqlDatabaseServiceConfig: Connecting to SQL....
        2010-04-01 16:18:53 Slp: Sco: Attempting to connect script
        2010-04-01 16:18:53 Slp: Connection string: Data Source=\.\pipe\SQLLocal\MSSQLSERVER;Initial Catalog=master;Integrated Security=True;Pooling=False;Network Library=dbnmpntw;Application Name=SqlSetup

    And now comes the fun part: when I open Configuration Manager I can see the service running. I enabled named pipes and TCP/IP and restarted the service; I'm able to connect to the server using an OLE DB connection, but not with the Native Client. And what I find suspicious is the following error in my application log:

        .NET Runtime Optimization Service (clr_optimization_v2.0.50727_32) - Failed to compile: C:\Program Files\Microsoft SQL Server\100\Tools\Binn\VSShell\Common7\Tools\VDT\DataProjects.dll . Error code = 0x8007000b

    On Microsoft Connect this is reported as a bug, but Microsoft is unable to reproduce the problem, although searching the forums shows I'm not the only one with this issue. So any help is appreciated.

    Read the article
