Search Results

Search found 6744 results on 270 pages for 'linq to entities'.

Page 113 of 270

  • How does OfType<T>() Work?

    - by TheCloudlessSky
    How does OfType() work? I read this link about what's going on, but how exactly does the LINQ provider know how to get all objects matching the specified type? I know that IQueryable<T> "chains up" requests and then evaluates them when GetEnumerator() is called (right?). Specifically, I want to know how the framework does the type comparison so quickly. I wrote a method in a .NET 2.0 project that went like this (since 2.0 doesn't support this kind of feature):

        public IEnumerable<TResult> OfType<TResult>() where TResult : class
        {
            foreach (TItem item in this.InnerList)
            {
                TResult matchItem = item as TResult;
                if (matchItem != null)
                {
                    yield return matchItem;
                }
            }
        }

    Is this the best implementation?

    Read the article

  • Using Large Arrays in VB.NET

    - by Tim
    I want to extract large amounts of data from Excel, manipulate it and put it back. I have found the best way to do this is to extract the data from an Excel Range into a large array, change the contents of the array and write it back to the Excel Range. I am now rewriting the application using VB.NET 2008/2010 and wish to take advantage of any new features. Currently I have to loop through the contents of the array to find elements with certain values; also, sorting large arrays is cumbersome. I am looking to use the new features, including LINQ, to manipulate the data in my array. Does anybody have any advice on the easiest ways to filter, query, sort etc. the data in a large array? Also, what are the reasonable limits to the size of the array? Many thanks.

    Read the article

  • sum of timespans

    - by frenchie
    I have a collection of objects that include a TimeSpan variable:

        MyObject
        {
            TimeSpan TheDuration { get; set; }
        }

    I want to use LINQ to sum those times. Of course, (from r in MyCollection select r.TheDuration).Sum(); doesn't work! I'm thinking of changing the datatype of TheDuration to an int, summing that, and converting the sum back to a TimeSpan. That would be messy, because each TheDuration in my collection is used as a TimeSpan somewhere else. Any suggestion on this summation?
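    One way around this, sketched below on the assumption that TheDuration really is a System.TimeSpan and MyCollection is an in-memory sequence, is to sum the underlying Ticks (a long, which Sum() does support) and convert back, so the property can stay a TimeSpan everywhere else:

        // Minimal sketch: sum the Ticks, then rebuild a TimeSpan from the total.
        // A database LINQ provider may not translate this; for LINQ to Objects it is fine.
        long totalTicks = MyCollection.Sum(r => r.TheDuration.Ticks);
        TimeSpan total = TimeSpan.FromTicks(totalTicks);

        // Equivalent alternative using Aggregate:
        TimeSpan total2 = MyCollection.Aggregate(TimeSpan.Zero, (sum, r) => sum + r.TheDuration);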

    Read the article

  • List.Clear() followed by List.Add() not working.

    - by Vincent
    I have the following C# class/function in a card game:

        class Hand
        {
            private List<Card> myCards = new List<Card>();

            public void sortBySuitValue()
            {
                IEnumerable<Card> query =
                    from s in myCards
                    orderby (int)s.suit, (int)s.value
                    select s;

                myCards = new List<Card>();
                myCards.AddRange(query);
            }
        }

    This works fine. However, I had trouble at first: instead of using myCards = new List<Card>(); to "reset" myCards, I would call myCards.Clear(), but once I called Clear(), calling myCards.Add() or myCards.AddRange() afterwards left the count at zero. Is my current approach good? Is using LINQ to sort my cards good/bad?
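    The Clear() behaviour is actually consistent with deferred execution: the query variable is not a snapshot, it re-enumerates myCards when AddRange runs, and by that point the list has just been emptied, so there is nothing to add back. A sketch of two ways to avoid the trap, assuming the same Card members as in the question:

        // Materialise the ordered result before touching the source list:
        public void sortBySuitValue()
        {
            myCards = myCards
                .OrderBy(c => (int)c.suit)
                .ThenBy(c => (int)c.value)
                .ToList();                       // forces evaluation here

            // Or sort in place without allocating a new list:
            // myCards.Sort((a, b) => a.suit != b.suit
            //     ? ((int)a.suit).CompareTo((int)b.suit)
            //     : ((int)a.value).CompareTo((int)b.value));
        }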

    Read the article

  • Any Way to Use a Join in a Lambda Where() on a Table<>?

    - by lush
    I'm in my first couple of days using LINQ in C#, and I'm curious to know if there is a more concise way of writing the following:

        MyEntities db = new MyEntities(ConnString);

        var q = from a in db.TableA
                join b in db.TableB on a.SomeFieldID equals b.SomeFieldID
                where a.UserID == CurrentUser
                   && b.MyField == Convert.ToInt32(MyDropDownList.SelectedValue)
                select new { a, b };

        if (q.Any())
        {
            //snip
        }

    I know that if I wanted to check for the existence of a value in a field of a single table, I could just use the following:

        if (db.TableA.Where(u => u.UserID == CurrentUser).Any())
        {
            //snip
        }

    But I'm curious to know whether there is a way to use the lambda technique while still satisfying the first technique's conditions across those two tables. Sorry for any mistakes or lack of clarity; I'll edit as necessary. Thanks in advance.
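    For what it's worth, the same existence check can be written in method syntax; a rough, untested sketch against the same (hypothetical) TableA/TableB model:

        // Method-syntax equivalent of the query above.
        int selected = Convert.ToInt32(MyDropDownList.SelectedValue);   // evaluate once, outside the query

        bool exists = db.TableA
            .Where(a => a.UserID == CurrentUser)
            .Join(db.TableB.Where(b => b.MyField == selected),
                  a => a.SomeFieldID,
                  b => b.SomeFieldID,
                  (a, b) => 1)
            .Any();

        // If the model exposes a navigation property from TableA to TableB
        // (name assumed here), Any() can carry the whole condition:
        // bool exists = db.TableA.Any(a => a.UserID == CurrentUser
        //                               && a.TableBs.Any(b => b.MyField == selected));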

    Read the article

  • Check If Stored Procedure Returns Value

    - by Eric
    Hello all, I am using LINQ to SQL in VS 2010, and I have the following stored procedure to check a username and password:

        ALTER PROCEDURE dbo.CheckUser
        (
            @username varchar(50),
            @password varchar(50)
        )
        AS
            SELECT * FROM Users
            WHERE UserName = @username AND Password = @password

    The problem I'm having is that it throws an exception if the username and password are incorrect. I'd like to perform a check to see if there is a return value, rather than using try/catch to determine whether the procedure returned anything. Should I do this check in code (C#)? Or is there a way to do it in SQL? Thanks.
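    On the C# side, a check like the sketch below avoids the try/catch; the data context and result type names are assumptions here, since they depend on how the procedure was dragged into the .dbml designer:

        // Hedged sketch: MyDataContext / CheckUser are whatever the designer generated.
        using (var db = new MyDataContext())
        {
            var match = db.CheckUser(username, password).FirstOrDefault();

            if (match == null)
            {
                // no row came back: invalid username/password
            }
            else
            {
                // credentials are valid
            }
        }

    Alternatively, the procedure itself could SELECT a count or an IF EXISTS flag, so that an empty result set is never treated as an error condition in the first place.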

    Read the article

  • Why does this code do if (sz != sz2) sz = sz2?!

    - by acidzombie24
    For the first time I created LINQ to SQL classes. I decided to look at the generated class and found this. What... why is it doing if (sz != sz2) { sz = sz2; }? I don't understand. Why isn't the setter generated as just this._Property1 = value?

        private string _Property1;

        [Column(Storage="_Property1", CanBeNull=false)]
        public string Property1
        {
            get
            {
                return this._Property1;
            }
            set
            {
                if ((this._Property1 != value))
                {
                    this._Property1 = value;
                }
            }
        }
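    For comparison, when the entity participates in change tracking (typically when the table has a primary key), the designer emits a fuller setter along the lines of the sketch below; the inequality guard is there so that change notifications, and therefore change tracking and updates, only fire when the value really changes. The stripped-down version in the question is the same guard with the notification calls omitted.

        // Illustrative shape of the usual generated setter (member names follow the
        // standard designer pattern, shown from memory rather than from your .dbml):
        set
        {
            if ((this._Property1 != value))
            {
                this.OnProperty1Changing(value);
                this.SendPropertyChanging();      // lets the DataContext snapshot the old value
                this._Property1 = value;
                this.SendPropertyChanged("Property1");
                this.OnProperty1Changed();
            }
        }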

    Read the article

  • Performance due to entity update

    - by Rizzo
    I always think about two ways to code the global Step() function, both with pros and cons. Please note that AIStep is just there to provide one more kind of step for whoever wants it.

        // Approach 1: one pass over the entities
        foreach( entity in entities )
        {
            entity.DeltaStep( delta_time );
            if( time_for_fixed_step ) entity.FixedStep();
            if( time_for_AI_step ) entity.AIStep();
            ... // all kinds of updates you want
        }

    PRO: you only iterate once over all entities. CON: fidelity can be lower in some scenarios, since the FixedStep() calls do not all happen together.

        // Approach 2: separate passes
        foreach( entity in entities )
            entity.DeltaStep( delta_time );

        if( time_for_fixed_step )
            foreach( entity in entities )
                entity.FixedStep();

        if( time_for_AI_step )
            foreach( entity in entities )
                entity.AIStep();

        // all kinds of updates you want, SEPARATED

    PRO: fidelity on FixedStep is higher; there shouldn't be much time between the FixedStep() calls of different entities, whereas in Approach 1 an entity may have to wait for the other kinds of updates before its FixedStep() comes. CON: you iterate once for each kind of update.

    A third approach could be a hybrid of the two, something along the lines of:

        foreach( entity in entities )
        {
            entity.DeltaStep( delta_time );
            if( time_for_AI_step ) entity.AIStep();
            // all kinds of updates you want EXCEPT FixedStep()
        }

        if( time_for_fixed_step )
        {
            foreach( entity in entities )
            {
                entity.FixedStep();
            }
        }

    Just two loops, not caring about time fidelity for anything other than FixedStep(). Any thoughts on this? Does it really matter to make all steps at once, or am I thinking about problems that don't exist?

    Read the article

  • How do you cast a LinqToSql Table<TEntity> as a Table<IEntity> where TEntity : IEntity?

    - by DanM
    I'm trying to use DbLinq with a SQLite database, but I'm running into a problem when I try to cast an ITable to an IQueryable<TEntity>. There is a known bug in DbLinq (Issue 211) which might be the source of my problem, but I wanted to make sure my code is sound and, if it is, find out if there might be something I can do to work around the bug. Here is the generic repository method that attempts to do the cast:

        public IQueryable<TEntity> GetAll()
        {
            return Table.Cast<TEntity>(); // Table is an ITable
        }

    This compiles, but if I pass in the interface IPerson for TEntity while the type of the entities in the table is Person (where Person : IPerson), I get this error from DbLinq: "S0133: Implement QueryMethod Queryable.Cast."

    Why am I trying to do this? I have a library project that doesn't know the type of the entity until runtime, but it does know the interface for the entity. So I'm trying to cast to the interface type so that my library project can consume the data.

    Questions: Am I attempting an impossible cast, or is this definitely a bug in DbLinq? How else could I go about solving my problem?
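    One possible workaround, at the cost of pulling the cast to the client so that nothing chained after GetAll() is translated to SQL, is to route the cast through Enumerable.Cast instead of Queryable.Cast; a sketch, assuming Table is the non-generic ITable for the concrete entity type:

        // Hedged sketch: enumerate the table and cast in memory, side-stepping the
        // provider's unimplemented Queryable.Cast. Anything composed onto the result
        // (Where, OrderBy, ...) now runs as LINQ to Objects rather than in SQLite.
        public IQueryable<TEntity> GetAll()
        {
            return ((System.Collections.IEnumerable)Table)
                .Cast<TEntity>()     // Enumerable.Cast: works for Person -> IPerson
                .AsQueryable();
        }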

    Read the article

  • Entity Framework with ASP.NET MVC. Updating entity problem

    - by Kitaly
    Hi people. I'm trying to update an entity and its related entities as well. For instance, I have a class Car with a property Category and I want to change its Category. So I have the following methods in the controller:

        public ActionResult Edit(int id)
        {
            var categories = context.Categories.ToList();
            ViewData["categories"] = new SelectList(categories, "Id", "Name");
            var car = context.Cars.Where(c => c.Id == id).First();
            return PartialView("Form", car);
        }

        [AcceptVerbs(HttpVerbs.Post)]
        public ActionResult Edit(Car car)
        {
            var category = context.Categories.Where(c => c.Id == car.Category.Id).First();
            car.Category = category;
            context.UpdateCar(car);
            context.SaveChanges();
            return RedirectToAction("Index");
        }

    The UpdateCar method, in the ObjectContext class, follows:

        public void UpdateCar(Car car)
        {
            var attachedCar = Cars.Where(c => c.Id == car.Id).First();
            ApplyItemUpdates(attachedCar, car);
        }

        private void ApplyItemUpdates(EntityObject originalItem, EntityObject updatedItem)
        {
            try
            {
                ApplyPropertyChanges(originalItem.EntityKey.EntitySetName, updatedItem);
                ApplyReferencePropertyChanges(updatedItem, originalItem);
            }
            catch (InvalidOperationException ex)
            {
                Console.WriteLine(ex.ToString());
            }
        }

        public void ApplyReferencePropertyChanges(IEntityWithRelationships newEntity, IEntityWithRelationships oldEntity)
        {
            foreach (var relatedEnd in oldEntity.RelationshipManager.GetAllRelatedEnds())
            {
                var oldRef = relatedEnd as EntityReference;
                if (oldRef != null)
                {
                    var newRef = newEntity.RelationshipManager.GetRelatedEnd(oldRef.RelationshipName, oldRef.TargetRoleName) as EntityReference;
                    oldRef.EntityKey = newRef.EntityKey;
                }
            }
        }

    The problem is that when I set the Category property after the POST in my controller, the entity's state changes to Added instead of remaining Detached. How can I update a one-to-one relationship with Entity Framework and ASP.NET MVC without setting all the properties one by one, like in this post?

    Read the article

  • How to access a method on a generic datacontext which is only created at runtime

    - by Jeremy Holt
    I'm creating my generic DataContext using only the connection string in the ctor. I have no issues retrieving a table using DataContext.GetTable(). However, I also need to be able to retrieve entities from inline table functions. The dbml designer generates:

        public IQueryable<testFunctionResult> testFunction()
        {
            return this.CreateMethodCallQuery<testFunctionResult>(this, ((MethodInfo)(MethodInfo.GetCurrentMethod())));
        }

    The question is: how do I get the MethodInfo.GetCurrentMethod() when the DataContext has no method called "testFunction", i.e. typeof(DataContext).GetMethod("testFunction") returns null? What I'm trying to achieve is something like:

        public class UnitofWork<T>
        {
            public UnitofWork(string connectionString)
            {
                this.DataContext = new DataContext(connectionString);
            }

            public UnitofWork(IQueryable<T> tableEntity)
            {
                _tableEntity = tableEntity;
            }

            public IQueryable<T> TableEntity
            {
                get
                {
                    if (DataContext == null) return _tableEntity;

                    var metaType = DataContext.Mapping.GetMetaType(typeof(T));
                    if (metaType.IsEntity)
                        _tableEntity = DataContext.GetTable<T>();
                    else
                    {
                        var s = typeof(T).Name;
                        string methodName = s.Substring(0, s.IndexOf("Result")) + "()";
                        // the designer automatically affixes "Result" to the type name
                        // Make a method from methodName
                        // _tableEntity = DataContext.CreateMethodCallQuery(DataContext, method, new object[]{});
                    }
                    return _tableEntity;
                }
                set { _tableEntity = value; }
            }
        }

    Thanks in advance for any insight. Jeremy

    Read the article

  • Entity framework 4.0 compiled query with Where() clause issue

    - by Andrey Salnikov
    Hello, I encountered some strange behavior of the System.Data.Objects.CompiledQuery.Compile function. Here is my code that compiles a simple query:

        private static readonly Func<DataContext, long, Product> productQuery =
            CompiledQuery.Compile((DataContext ctx, long id) =>
                ctx.Entities.OfType<Data.Product>()
                   .Where(p => p.Id == id)
                   .Select(p => new Product { Id = p.Id })
                   .SingleOrDefault());

    where DataContext inherits from ObjectContext and Product is a projection of the POCO Data.Product class. In the first run my data context contains Data.Product { Id == 1L }, and in the second run Data.Product { Id == 2L }. The first use of the compiled query, productQuery(dataContext, 1L), works perfectly - the result is Product { Id == 1L } - but the second run, productQuery(dataContext, 2L), always returns null, even though the context in the second run contains a single product with id == 2L. If I remove the Where clause I get the correct product (with id == 2L). It seems that the first id value is cached during the first run of productQuery, and therefore all further calls are only valid when the dataContext contains Data.Product { id == 1L }. The issue cannot be reproduced if I use a direct query instead of the precompiled version. All tests were performed against a test .mdf database using SQL Server 2008 Express and Visual Studio 2010 final, from my ASP.NET application.

    Read the article

  • ReSharper 5.0's LINQ Refactoring Continues to be Amazing!

    In this post, we'll take a straightforward, procedure-based piece of code and convert it to LINQ using a suggestion from ReSharper by JetBrains. I've found that, in general, when I do things with foreach syntax, there is often a better way to do it in LINQ. The better way does not jump out [...]
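    As a flavour of the kind of suggestion involved, here is a hypothetical before/after, not taken from the post itself; the products collection and the Product, Price and Name members are made-up names:

        // Before: procedural accumulation with foreach
        var names = new List<string>();
        foreach (Product p in products)
        {
            if (p.Price > 100m)
            {
                names.Add(p.Name);
            }
        }

        // After: the equivalent LINQ chain ReSharper would typically suggest
        var names2 = products
            .Where(p => p.Price > 100m)
            .Select(p => p.Name)
            .ToList();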

    Read the article

  • End of LINQ to HPC: Microsoft abandons its large-scale data processing platform in favour of its competitor Hadoop

    End of LINQ to HPC: Microsoft is abandoning LINQ to HPC (High Performance Computing), code-named Dryad, its own high-performance platform for distributed computation and data-intensive processing, in order to concentrate on supporting its competitor Hadoop in its products. The company had recently shown its interest in Hadoop, the Java platform for storing and batch-processing very large quantities of data (Big Data), notably by publishing two connectors

    Read the article

  • Benchmark Linq2SQL, Subsonic2, Subsonic3 - Any other ideas to make them faster?

    - by Aristos
    I have been working with SubSonic 2 for more than 3 years now. After LINQ appeared, and then SubSonic 3, I started thinking about moving to the new LINQ features that talk to SQL. I ported my SubSonic 2 code to SubSonic 3, and very soon I discovered that the speed was so slow I didn't believe it - which is what started all these tests. Then I tested Linq2Sql and also saw a delay compared with SubSonic 2. My question, especially for Linq2Sql and the upcoming .NET 4, is: what else can I do to speed it up? What else in the Linq2Sql settings or classes, not in the test code I used for my measurements? I include here the project I used for the tests, plus screenshots of the results.

    How I make the tests, and how accurate the measurements are: I use only Google Chrome for this question, because it's difficult to show here the many other measurements I have made with more complex programs. This is the simplest case; I just measure the data read. How can I prove the measurement is trustworthy? I add a simple Thread.Sleep of 10 seconds and check whether I see those 10 seconds in Chrome's measurement - and yes, I do. There are more tests with this sleep thread (10-second delay, 100 ms delay, zero delay) to see what Chrome actually reports; there is only a small 15 ms overhead in the measurement, so small compared with the rest of my tests that I do not care about it. So what do I measure? Just the data read via each method - not the data or database delay, nor any disk read or anything like that. An image with the results shows that no disk activity exists during the measurements.

    Why I chose this kind of test: it's simple, it's real, and it's close to my real problem - the delay of SubSonic 3 that I found in a real program with real data.

    Now let's test the DALs. I have 4-5 calls on every method, one after the other. The results, for a loop of 100 times, asking for 5 rows (one of which does not exist), are approximately:

        Simple ADO.NET                        :   81 ms
        SubSonic 2                            :  210 ms
        Linq2Sql                              : 1.70 sec
        Linq2Sql using CompiledQuery.Compile  :  239 ms
        SubSonic 3                            : 15.00 sec (wow - extremely slow)

    The project: http://www.planethost.gr/DalSpeedTests.rar - can anyone confirm this benchmark, or make any optimizations to help me out?

    Other tests: someone published the link http://ormbattle.net/ here (and then removed it - I don't know why). On that page you can find really useful, advanced tests for everything except the SubSonic 2 and SubSonic 3 versions I have here.

    Optimizing: what I really ask here is whether anyone knows any trick to optimize the DALs - not by changing the test code, but by changing the code and the settings of each DAL. For example...

    Optimizing Linq2Sql: I started searching for ways to optimize Linq2Sql and found one article (maybe more exist). I applied the tricks from that page and optimized the code using them all; the speed went from 1.70 sec to about 1.50 sec - a big improvement, but still slow. Then I found a different way - same idea, different article - and wow! the speed shot up. Using this trick with CompiledQuery.Compile, the time dropped from 1.5 sec to 239 ms. Here is the code for the precompiled query:

        Func<DataClassesDataContext, int, IQueryable<Product>> compiledQuery =
            CompiledQuery.Compile((DataClassesDataContext meta, int IdToFind) =>
                (from myData in meta.Products
                 where myData.ProductID.Equals(IdToFind)
                 select myData));

        StringBuilder Test = new StringBuilder();
        int[] MiaSeira = { 5, 6, 10, 100, 7 };

        using (DataClassesDataContext context = new DataClassesDataContext())
        {
            context.ObjectTrackingEnabled = false;
            for (int i = 0; i < 100; i++)
            {
                foreach (int EnaID in MiaSeira)
                {
                    var oFindThat2P = compiledQuery(context, EnaID);
                    foreach (Product One in oFindThat2P)
                    {
                        Test.Append("<br />");
                        Test.Append(One.ProductName);
                    }
                }
            }
        }

    Optimizing SubSonic 3, and problems: I did a lot of performance profiling and changed things one after the other; the speed is better, but still far too slow. I posted the findings on the SubSonic group but they ignore the problem and say that everything is fast. From captures of my profiling and of the delay points inside the SubSonic source code, I have concluded that SubSonic 3 makes more calls about the structure of the database than about the data itself. It needs to reconsider the whole way it asks for data and follow the SubSonic 2 idea, if that is possible. I tried to add precompilation to SubSonic 3 the way I did for Linq2Sql, but have failed so far.

    Optimizing SubSonic 2: after discovering that SubSonic 3 is extremely slow, I started checking SubSonic 2 - something I had never done before, believing it was fast (and it is). Still, some points came up that could be faster. For example, there are loops that are slow because of string manipulation and string comparisons inside the loop; this code is called millions of times over a period of a few minutes of the program asking for data. With a small number of tables and small fields this may not be a big thing for some people, but with a large number of tables the delay grows. So I optimized SubSonic 2 myself by replacing the string comparisons with numeric comparisons - simple - at almost every point the profiler said was slow. I also changed all the small points that could be even a little faster, and disabled some rarely used things. The result: 5% faster on the Northwind database, nearly 20% faster on my own database with 250 tables. That counts as 500 ms less in a 10-second process on Northwind, and 100 ms faster in a 500 ms process on my database. I have no captures to show for these, because they were made with different code at a different time and were tracked on paper.

    Anyway, this is my story and my question about all of it: what else do you know to make them even faster? For these measurements I used SubSonic 2.2 (optimized by me), SubSonic 3.0.0.3 (a little optimized by me), and .NET 3.5.

    Read the article

  • Circular reference error when outputting LINQ to SQL entities with relationships as JSON in an ASP.N

    - by roosteronacid
    Here's a design-view screenshot of my dbml file. The relationships are auto-generated from the foreign keys on the tables. When I try to serialize a query result into JSON I get a circular reference error:

        public ActionResult Index()
        {
            return Json(new DataContext().Ingredients.Select(i => i));
        }

    But if I create my own collection of "bare" Ingredient objects, everything works fine:

        public ActionResult Index()
        {
            return Json(new Entities.Ingredient[]
            {
                new Entities.Ingredient(),
                new Entities.Ingredient(),
                new Entities.Ingredient()
            });
        }

    Also, serialization works fine if I remove the relationships from my tables. How can I serialize objects with relationships without having to turn to a third-party library? I am perfectly fine with just serializing the "top-level" objects of a given collection - that is, without the relationships being serialized as well.
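    One common way to get exactly that "top-level only" behaviour with the built-in serializer is to project each entity onto an anonymous type (or a small DTO) that carries only scalar columns, so the serializer never walks the relationship properties; a sketch, with the Ingredient property names guessed:

        // Hedged sketch: IngredientId / Name are illustrative column names.
        public ActionResult Index()
        {
            using (var db = new DataContext())
            {
                var flat = db.Ingredients
                    .Select(i => new { i.IngredientId, i.Name })   // scalars only, no relations
                    .ToList();                                     // evaluate before the context goes away

                return Json(flat);
            }
        }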

    Read the article

  • Querying a Cassandra column family for rows that have not been updated in X days

    - by knorv
    I'm moving an existing MySQL-based application over to Cassandra. So far, finding the equivalent Cassandra data model has been quite easy, but I've stumbled on the following problem, for which I'd appreciate some input. Consider a MySQL table holding millions of entities:

        CREATE TABLE entities (
            id INT AUTO_INCREMENT NOT NULL,
            entity_information VARCHAR(...),
            entity_last_updated DATETIME,
            PRIMARY KEY (id),
            KEY (entity_last_updated)
        );

    The table is regularly queried for entities that need to be updated:

        SELECT id FROM entities
        WHERE entity_last_updated IS NULL
           OR entity_last_updated < DATE_ADD(NOW(), INTERVAL -7*24 HOUR)
        ORDER BY entity_last_updated ASC;

    The entities returned by this query are then updated using the following query:

        UPDATE entities
        SET entity_information = ?, entity_last_updated = NOW()
        WHERE id = ?;

    What would be the corresponding Cassandra data model that would allow me to store the given information and efficiently query the entities table for entities that need to be updated (that is, entities that have not been updated in the last seven days)?

    Read the article

  • ObjectContext ConnectionString Sqlite

    - by codegarten
    I need to connect to an SQLite database, so I downloaded and installed System.Data.SQLite and dragged all my tables onto the designer. The designer created a .cs file with public class Entities : ObjectContext and three constructors.

    The first:

        public Entities() : base("name=Entities", "Entities")

    loads the connection string from App.config and works fine. App.config:

        <connectionStrings>
            <add name="Entities"
                 connectionString="metadata=res://*/Db.TracModel.csdl|res://*/Db.TracModel.ssdl|res://*/Db.TracModel.msl;provider=System.Data.SQLite;provider connection string=&quot;data source=C:\Users\Filipe\Desktop\trac.db&quot;"
                 providerName="System.Data.EntityClient" />
        </connectionStrings>

    The second:

        public Entities(string connectionString) : base(connectionString, "Entities")

    The third:

        public Entities(EntityConnection connection) : base(connection, "Entities")

    Here is the problem: I have already tried many configurations, and I already used EntityConnectionStringBuilder to build the connection string, with no luck. Can you please point me in the right direction? EDIT(1): How can I construct a valid connection string?
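    For the second and third constructors, the string has to be a full EntityClient connection string (metadata + provider + provider connection string), not just the raw SQLite string. A sketch of building one programmatically, reusing the same values the App.config above carries:

        // Hedged sketch (requires using System.Data.EntityClient): build the same
        // EntityClient string that App.config holds, then use the second constructor.
        var builder = new EntityConnectionStringBuilder
        {
            Metadata = "res://*/Db.TracModel.csdl|res://*/Db.TracModel.ssdl|res://*/Db.TracModel.msl",
            Provider = "System.Data.SQLite",
            ProviderConnectionString = @"data source=C:\Users\Filipe\Desktop\trac.db"
        };

        using (var context = new Entities(builder.ToString()))
        {
            // query as usual
        }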

    Read the article

  • How to avoid StaleObjectStateException when transaction updates thousands of entities?

    - by ThinkFloyd
    We are using Hibernate 3.6.0.Final with JPA 2 and Spring 3.0.5 for a large-scale enterprise application running on Tomcat 7 and MySQL 5.5. Most of the transactions in the application live for less than a second and update 5-10 entities, but in some use cases we need to update more than 10-20K entities in a single transaction, which takes a few minutes; hence, more than 70% of the time such a transaction fails with StaleObjectStateException because some of those entities were updated by another transaction in the meantime. We maintain a version column in all tables, and in case of StaleObjectStateException we generally retry, but since these transactions are so long, even with retries I am not sure we will be able to escape the StaleObjectStateException. Also, a lot of activity keeps updating these entities during busy hours, so we cannot take a pessimistic approach, because it could potentially halt many activities in the system. Please suggest how to fix this long-transaction issue, given that we cannot spawn thousands of small independent transactions, because we cannot afford inconsistent data if some of them fail and some succeed.

    Read the article

  • Get an IDataReader from a typed List

    - by Jason Kealey
    I have a List<MyObject> with a million elements. (It is actually a SubSonic collection, but it is not loaded from the database.) I'm currently using SqlBulkCopy as follows:

        private string FastInsertCollection(string tableName, DataTable tableData)
        {
            string sqlConn = ConfigurationManager.ConnectionStrings[SubSonicConfig.DefaultDataProvider.ConnectionStringName].ConnectionString;
            using (SqlBulkCopy s = new SqlBulkCopy(sqlConn, SqlBulkCopyOptions.TableLock))
            {
                s.DestinationTableName = tableName;
                s.BatchSize = 5000;
                s.WriteToServer(tableData);
                s.BulkCopyTimeout = SprocTimeout;
                s.Close();
            }
            return sqlConn;
        }

    I use SubSonic's MyObjectCollection.ToDataTable() to build the DataTable from my collection. However, this duplicates the objects in memory and is inefficient. I'd like to use the SqlBulkCopy.WriteToServer overload that takes an IDataReader instead of a DataTable, so that I don't duplicate my collection in memory. What's the easiest way to get an IDataReader from my list? I suppose I could implement a custom data reader (like here: http://blogs.microsoft.co.il/blogs/aviwortzel/archive/2008/05/06/implementing-sqlbulkcopy-in-linq-to-sql.aspx), but there must be something simpler I can do without writing a bunch of generic code.

    Edit: It does not appear that one can easily generate an IDataReader from a collection of objects. Accepting the current answer even though I was hoping for something built into the framework.

    Read the article

  • Error using Dynamic Data Filtering: missing datasource

    - by sebastiaan
    I am trying to use the ASP.NET Dynamic Data Filtering project, but I'm running into a problem during the configuration. I'm following the instructions on the author's blog, and everything works as described. Then it tells me to change the datasource using the designer view. I am told to select the "GridDataSource" in the "Configure data source" wizard, but that option is not there. I get all of the classes in my project, including the DataContext that was generated by LINQ. When I choose "Show only DataContext objects", the dropdown ("Choose your context object:") is completely empty. When I turn off the checkbox and choose my DataContext class, I get asked which table I want and all that - but, as the whole purpose of a Dynamic Data site is NOT to use one single table, that's not much help. So I've looked at the instructions again and copied the resulting datasource from the example:

        <asp:DynamicLinqDataSource ID="GridDataSource" runat="server" EnableDelete="True" EnableUpdate="True"></asp:DynamicLinqDataSource>

    which is exactly what I had, without the "WhereParameters" nodes in there. Now, when I run the list page, I get an exception about a missing datasource from the filtering component. Of course, when I remove the DynamicFilterRepeater, it works again. This is the meat of the exception:

        [InvalidOperationException: Missing DataSource]
        Catalyst.Web.DynamicData.DynamicFilterRepeater.GetTable() in D:\Catalyst\Projects\DynamicData\Project\Trunk\DynamicData\DynamicData\DynamicFilterRepeater.cs:74
        Catalyst.Web.DynamicData.DynamicFilterRepeater.GetFilters() in D:\Catalyst\Projects\DynamicData\Project\Trunk\DynamicData\DynamicData\DynamicFilterRepeater.cs:81
        Catalyst.Web.DynamicData.DynamicFilterRepeater.OnInit(EventArgs e) in D:\Catalyst\Projects\DynamicData\Project\Trunk\DynamicData\DynamicData\DynamicFilterRepeater.cs:106

    How do I make the DynamicFilterRepeater recognize my datasource? I'm using VS2010 Pro on a Win7 machine.

    Read the article

  • Unicode Collations problem?

    - by Bayonian
    (.NET 3.5 SP1, VS 2008, VB.NET, MSSQL Server 2008) I'm writing a small web app to test Khmer Unicode and Lao Unicode. I have a table that stores text in Khmer Unicode, with the following structure:

        [t_id]   [int] IDENTITY(1,1) NOT NULL
        [t_chid] [int] NOT NULL
        [t_vn]   [int] NOT NULL
        [t_v]    [nvarchar](max) NOT NULL

    I can use LINQ to SQL to do CRUD normally. The text displays properly on the web page, even though I didn't change the default collation of MSSQL Server 2008. When it comes to searching the column [t_v], the page takes a very long time to load and in fact loads every row of that column - it never applies the keyword criteria I use for the search. Here's my query for the search:

        Public Shared Function SearchTestingKhmerTable(ByVal keyword As String) As DataTable
            Dim db As New BibleDataClassesDataContext()
            Dim query = From b In db.khmer_books _
                        From ch In db.khmer_chapters _
                        From v In db.testing_khmers _
                        Where v.t_v.Contains(keyword) And ch.kh_book_id = b.kh_b_id And v.t_chid = ch.kh_ch_id _
                        Select b.kh_b_id, b.kh_b_title, ch.kh_ch_id, ch.kh_ch_number, v.t_id, v.t_vn, v.t_v

            Dim dtDataTableOne = New DataTable("dtOne")
            dtDataTableOne.Columns.Add("bid", GetType(Integer))
            dtDataTableOne.Columns.Add("btitle", GetType(String))
            dtDataTableOne.Columns.Add("chid", GetType(Integer))
            dtDataTableOne.Columns.Add("chn", GetType(Integer))
            dtDataTableOne.Columns.Add("vid", GetType(Integer))
            dtDataTableOne.Columns.Add("vn", GetType(Integer))
            dtDataTableOne.Columns.Add("verse", GetType(String))

            For Each r In query
                dtDataTableOne.Rows.Add(New Object() {r.kh_b_id, r.kh_b_title, r.kh_ch_id, r.kh_ch_number, r.t_id, r.t_vn, r.t_v})
            Next

            Return dtDataTableOne
        End Function

    Please note that I use exactly the same code and database design with Lao Unicode and it works just fine - I get the expected results for the search. I can't figure out what the problem is with searching the Khmer table.

    Read the article

  • How to Process Lambda Expressions Passed as Argument Into Method - C# .NET 3.5

    - by Sunday Ironfoot
    My knowledge of lambda expressions is a bit shaky; while I can write code that uses lambda expressions (aka LINQ), I'm trying to write my own method that takes a few arguments of type lambda expression. Background: I'm trying to write a method that returns a tree collection of objects of type TreeItem from literally ANY other object type. I have the following so far:

        public class TreeItem
        {
            public string Id { get; set; }
            public string Text { get; set; }
            public TreeItem Parent { get; protected set; }
            public IList<TreeItem> Children
            {
                get
                {
                    // Implementation that returns custom TreeItemCollection type
                }
            }

            public static IList<TreeItem> GetTreeFromObject<T>(IList<T> items,
                Expression<Func<T, string>> id,
                Expression<Func<T, string>> text,
                Expression<Func<T, IList<T>>> childProperty) where T : class
            {
                foreach (T item in items)
                {
                    // Errrm!?? What do I do now?
                }
                return null;
            }
        }

    ...which can be called via...

        IList<TreeItem> treeItems = TreeItem.GetTreeFromObject<Category>(
            categories, c => c.Id, c => c.Name, c => c.ChildCategories);

    I could replace the expressions with string values and just use reflection, but I'm trying to avoid that because I want to keep it strongly typed. My reason for doing this is that I have a control that accepts a list of type TreeItem, whereas I have dozens of different types that are all in a tree-like structure, and I don't want to write separate conversion methods for each type (trying to adhere to the DRY principle). Am I going about this the right way? Is there a better way of doing this, perhaps?
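    If the lambdas are only ever executed in memory (never handed to a LINQ provider for translation), one simplification is to accept plain Func<...> delegates instead of Expression<Func<...>> and recurse over the child selector. A rough sketch of that shape, assuming the methods live inside TreeItem (so the protected Parent setter is reachable) and that Children returns a mutable collection:

        // Hedged sketch - delegate-based variant of GetTreeFromObject.
        public static IList<TreeItem> GetTreeFromObject<T>(IList<T> items,
            Func<T, string> id,
            Func<T, string> text,
            Func<T, IList<T>> children) where T : class
        {
            var roots = new List<TreeItem>();
            foreach (T item in items)
            {
                roots.Add(Build(item, null, id, text, children));
            }
            return roots;
        }

        private static TreeItem Build<T>(T item, TreeItem parent,
            Func<T, string> id, Func<T, string> text, Func<T, IList<T>> children) where T : class
        {
            var node = new TreeItem { Id = id(item), Text = text(item) };
            node.Parent = parent;                   // allowed from inside TreeItem (protected setter)

            IList<T> kids = children(item);
            if (kids != null)
            {
                foreach (T child in kids)
                {
                    node.Children.Add(Build(child, node, id, text, children));   // assumes Children is writable
                }
            }
            return node;
        }

    If the Expression<Func<...>> parameters are kept (for example, so the same lambdas could also be passed to a LINQ provider), calling .Compile() on each expression once per call yields the same delegates and the body above stays unchanged.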

    Read the article

  • Subsonic 3, MySql, won't update record.

    - by Warspawn
        [WebMethod]
        public string GetAuthToken(string username, string password)
        {
            var db = new LogicDB();

            //var results = from u in db.Users
            //              where u.Username == username && u.Password == password
            //              select u;

            User u = db.Select
                .From<User>()
                .Where(UsersTable.UsernameColumn).IsEqualTo(username)
                .And(UsersTable.PasswordColumn).IsEqualTo(password)
                .ExecuteSingle<User>();

            if (u == null)
            {
                return "{'success': false, 'reason': 'Invalid username and/or password.'}";
            }
            else
            {
                // really there should only be one match...
                Guid code = Guid.NewGuid();
                u.Securitycode = code.ToString();
                u.Securityexp = System.DateTime.Now.AddHours(24);
                //u.Save(db.Provider);

                return "{'id':'" + u.Id.ToString() + "', 'code':'" + code.ToString() + "', 'exp':'" + u.Securityexp.ToString() + "'}"
                    + "\n\n<br/><br/>"
                    + u.GetDirtyColumns().ToArray().ToString();
            }
        }

    When I run this, I keep getting: System.Collections.Generic.KeyNotFoundException: The given key was not present in the dictionary. This happens when u.Save(db.Provider); is uncommented, and it happens even with just u.Save(), and also when using the commented-out LINQ query above for the results instead.

    Read the article

  • Enabling Service Broker in SQL Server 2008

    - by Truegilly
    Hello, I am integrating SqlCacheDependency for use with my LINQ to SQL DataContext. I am using an extension class for LINQ queries found here: http://code.msdn.microsoft.com/linqtosqlcache

    I have wired up the code, and when I open the page I get this exception: "The SQL Server Service Broker for the current database is not enabled, and as a result query notifications are not supported. Please enable the Service Broker for this database if you wish to use notifications." It's coming from this event in Global.asax:

        protected void Application_Start()
        {
            RegisterRoutes(RouteTable.Routes);

            //In Application Start Event
            System.Data.SqlClient.SqlDependency.Start(new dataContextDataContext().Connection.ConnectionString);
        }

    My question is: how do I enable Service Broker in my SQL Server 2008 database? I have tried to run this query:

        ALTER DATABASE tablename SET ENABLE_BROKER

    but it never ends and runs forever; I have to stop it manually. Once I have this set in SQL Server 2008, will it filter down to my DataContext, or do I need to configure something there too? Thanks for any help. Truegilly
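    Two notes on the ALTER DATABASE attempt: the statement takes the database name (not a table name), and it needs exclusive access to that database, which is usually why it appears to run forever - it is waiting for every other connection (Management Studio windows, the debugging web app) to go away. A common workaround is to add a termination clause; WITH ROLLBACK IMMEDIATE kicks the other sessions out, so only run it against a database you can safely disturb. A sketch of issuing it as an ad-hoc command from C#, equivalent to running it in a Management Studio window:

        // Hedged sketch: run once against a connection that is NOT using the target
        // database (for example master); "YourDatabase" and the connection string are
        // placeholders. Requires using System.Data.SqlClient.
        using (var conn = new SqlConnection(masterConnectionString))
        using (var cmd = conn.CreateCommand())
        {
            cmd.CommandText =
                "ALTER DATABASE [YourDatabase] SET ENABLE_BROKER WITH ROLLBACK IMMEDIATE;";
            conn.Open();
            cmd.ExecuteNonQuery();
        }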

    Read the article
